UX DESIGN   |   SERVICE DESIGN

Finding an accessible and efficient solution to report health conditions

The project

Applicants for this benefit were limited to a 30-page paper form that took hours, sometimes days, to complete. The experience was further broken because users had to search for evidence to report information accurately, and relied heavily on family members to help them through the process.

We were asked to bring the benefit’s paper form to digital and improve usability, accessibility and accuracy of information.

MY ROLE

I led the design effort to digitise the service and make the experience easier, more accessible and more streamlined.

TEAM

UX designer · Delivery Manager · Product Owner · Business Analysts · User Researcher · Content Designer · Performance Analyst · Front-end developers · Back-end developer · Quality Assurance · Policy Expert · Subject Matter Experts

Key challenges

🔎 Lack of awareness

Awareness of the benefit is low amongst eligible claimants.

⏱️ Demanding application process

The paper form is long and complicated, and many claimants need help from their family, friends, and third parties.

💭 Misinterpretation of terms

Ambiguity in key terms and definitions led to misinterpretation by customers, resulting in inaccurate reporting of their care needs.

💰 Costly decision-making

Decision makers perform many manual checks and processes, leading to high staff costs and slow processing times.

↔️ Inconsistent decisions

Award decisions are subjective and there is a lack of clarity in policy and guidance, leading to inconsistency of decision-making.

The impact

-70% in completion time

The claim application now takes 45 minutes on average, down from 2–3 hours.

Net Promoter Score = 47

-37 percentage points in the rate of clarification calls

From 55% at beta launch to 18% at the end of the phase, surpassing the paper application benchmark of 25%.

Reporting health conditions: problem statement

Our users live with an average of five long-term conditions. Reporting these in the paper form is hard: they have to list every condition, face confusion when linking medication to illnesses, and need time to find the dosages. This causes frustration and fatigue, and leads users to spend a long time on this section.

From this

This is the paper form that users had to fill in. As the quotes show, it was quite challenging: users had to remember the conditions they suffered from, spell them correctly, guess how long they had had each condition, and list any medicines they took for each one.

“The first thing I thought was ‘Oh God, where am I going to get this information from?’”

- Floyd, 75

“This is where I spent most of the time in, looking for medicines and stuff.”

- Agnes, 69

What we did to improve the design and iterate further

Benchmark of health condition flows across other services

➡️

To understand what had and hadn’t worked in testing for each service, and to make more informed decisions on what to develop and test.

Focus groups with decision-makers, third-party charities and doctors

➡️

We challenged the questions asked in the paper version and removed unnecessary questions that made no difference to decision-making.

User testing with claimants themselves and/or family members helping them apply

➡️

We tested for usability, accessibility and comprehension. We gathered pain points (shown below) and iterated the designs based on these.

Monitoring data analytics

➡️

We used these insights to understand user behaviour further and spot new patterns.

To running experiments

Hypothesis concept 1: By displaying the most reported conditions, users will be able to find and declare their conditions in less time.

Hypothesis concept 2: By giving users points and examples of what information we require, users will provide the level of detail requested.

Hypothesis concept 3: By using the type-ahead component, users will be familiar with the pattern and will be able to find their condition more easily and accurately.

EXPERIMENTS OUTCOME

After comparing results, we decided to move forward with concept 3: time-on-task was reduced significantly and 9 out of 10 users completed the task successfully.

DEVELOPING CONCEPT 3

Improving accuracy of information

Once we had shown that users understood the typeahead component, we introduced a page in the flow where users entered an approximate start date for each condition. This created a looping flow that users now had to navigate.

Testing showed there were still benefits to this, but users were also experiencing pain points in key interactions.

✅ Users got suggestions even if they misspelt their condition.

😕 Most users did not see the manual link below the search bar when they couldn’t find their condition, resulting in confusion and frustration.

✅ Time on task was significantly reduced, and the accuracy of information improved.

😕 Some users had difficulty understanding the looping pattern, not linking the condition they had just inputted to the start date.
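The misspelling tolerance described above can be sketched with simple fuzzy matching. This is a hypothetical illustration only — the production component and its data source are not shown in this case study, and the function names and condition list here are invented:

```typescript
// Levenshtein edit distance between two strings (insert/delete/substitute).
function editDistance(a: string, b: string): number {
  const dp: number[][] = Array.from({ length: a.length + 1 }, (_, i) =>
    Array.from({ length: b.length + 1 }, (_, j) => (i === 0 ? j : j === 0 ? i : 0))
  );
  for (let i = 1; i <= a.length; i++) {
    for (let j = 1; j <= b.length; j++) {
      dp[i][j] = Math.min(
        dp[i - 1][j] + 1, // deletion
        dp[i][j - 1] + 1, // insertion
        dp[i - 1][j - 1] + (a[i - 1] === b[j - 1] ? 0 : 1) // substitution
      );
    }
  }
  return dp[a.length][b.length];
}

// Suggest conditions that either start with the query or sit within a small
// edit distance of it, so misspelt queries still return suggestions.
function suggestConditions(query: string, conditions: string[], maxDistance = 2): string[] {
  const q = query.toLowerCase();
  return conditions.filter((c) => {
    const name = c.toLowerCase();
    return name.startsWith(q) || editDistance(q, name.slice(0, q.length)) <= maxDistance;
  });
}
```

With a list like `["Arthritis", "Asthma", "Diabetes"]`, a misspelt query such as "athritis" would still surface "Arthritis".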

DEVELOPING CONCEPT 3

Reducing friction in search

We analysed analytics data and user research insights to further understand user behaviour with the previous pattern.

We found not only that users missed the manual link, but also that the link added friction when they did use it to add a condition. This led us to propose the async creatable pattern, allowing users to add unlisted conditions more seamlessly.
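The async creatable behaviour can be sketched as follows — a hypothetical illustration, assuming an async search function and an option model that are not part of the original case study: when the search returns no exact match, the picker offers the user's own text as a creatable option in the same interaction, instead of relying on a separate manual link.

```typescript
type Option = { label: string; isNew: boolean };

// Build the options shown in the picker for a given query: the async search
// results, plus a "create" option when nothing matches the query exactly.
async function buildOptions(
  query: string,
  search: (q: string) => Promise<string[]>
): Promise<Option[]> {
  const matches = await search(query);
  const options: Option[] = matches.map((label) => ({ label, isNew: false }));
  const hasExact = matches.some((m) => m.toLowerCase() === query.toLowerCase());
  if (query.trim() !== "" && !hasExact) {
    // No exact match: let the user add their condition inline.
    options.push({ label: `Add "${query.trim()}"`, isNew: true });
  }
  return options;
}
```

The design choice this illustrates: the "add it yourself" path lives inside the search results rather than behind a link below the search bar, so users who can't find their condition never hit a dead end.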

DEVELOPING CONCEPT 3

Making the looping pattern more intuitive

As mentioned before, some users had difficulty linking the start date to the condition they had just listed.

We decided to group the condition and date fields on the same page. Our hypothesis was that this would make it easier for users to link the two together.

To the final solution

✅ Made the looping pattern more intuitive.

✅ Improved usability of the typeahead component.

✅ Enhanced content to guide users better.

Lessons learnt

Qualitative insights alone are sometimes not enough to fully understand what isn’t working. Coupling quantitative data with user testing feedback was what allowed us to understand how users were behaving and what they expected to see.

It is very important to bring all key stakeholders along on the journey of a redesign, and also to demonstrate the value the new designs could deliver. This was how we ensured we were designing a service that was not only user-friendly but also fit for purpose when it came to decision-making. Demonstrating value early and throughout the process was key to getting buy-in and being able to co-design with decision makers.

Pushing boundaries and experimenting with new patterns helps improve a design system. I pushed for this by using the async creatable component, which allowed us to feed learnings and recommendations back to the design system team.

Next

Transforming remote onboarding: 75% faster, healthier, and more effective