Written by Jeffrey Steen
In the complex world of healthcare, it’s critical that individuals have a simple way to find, purchase, and use their health insurance. Stride Health addresses this challenge by offering the world’s only mobile-first health insurance recommendation engine.
In order to deliver this experience, Stride’s UX team had to conduct well-designed user testing. In the second segment of our ongoing series on user research, Dan Slate, Stride’s Lead Product Manager, discusses how Validately’s Lean model of user research and intuitive approach allowed for significant improvements to the UX design of Stride’s product.
What motivated you to conduct user research?
As is true for most product teams, user research has been critical to our process of uncovering key customer insights, ultimately leading us to product improvements.
Why did you choose Validately to help facilitate that research?
Conducting user research on the process of shopping for and purchasing health insurance requires that we put users in the right context to provide relevant feedback. Validately’s tests enabled us to do this easily while also rapidly experimenting with different designs to determine what was most intuitive.
One of the two testing models you used was a Moderated Discussion. How did you structure that and what were your goals?
The Moderated Discussion allowed us to work through the entire onboarding process for new users. Each person builds a profile based on unique medical and personal data that help us assess their health needs and recommend the best plan for them to minimize their total healthcare expenses.
Requiring users to complete this experience prior to answering questions about the plan recommendation and comparison experience was critical to putting them in the right context. Validately enabled us to do this in our live product so we could observe user interactions in a very organic way. The Moderated Discussion also enabled us to maintain some structure to the test so we could prompt the users with questions at specific points to gather the feedback we needed.
What feedback stood out during these tests?
We consistently heard that our users wanted to know what else we considered before making our recommendation. They also wanted to know what the cheapest plan was as a benchmark for comparison, and if there were similar plans from an insurance company they felt had a better brand. In short, we learned that presenting various insurance options helps our users determine if the plan we recommend is actually the best plan for them.
How did this change the UX design of Stride?
We now show two options alongside our recommended plan. These options reflect our research insights: a plan with the lowest premium for cost-conscious shoppers, and a plan similar to our recommendation but offered by a different insurance company.
Given these new options, did you encounter new design challenges during testing?
Yes. We heard lots of users say they had difficulty comparing plans. While side-by-side comparison of physical goods is quite easy, it’s hard to do in the digital sphere, particularly on mobile devices and for information-rich products like health insurance.
How did you approach user testing to solve this comparison problem?
We looked at usage data for our product and noticed users were digging deep into a plan’s details, then backing all the way out to select another plan to dig into. Some users would repeat this behavior five times before settling on a final choice. The conclusion was obvious: It was time to take on the design challenge of a mobile-first side-by-side comparison for health plans.
This new feature allowed our users to place two plans side-by-side on their mobile phone and compare the specific benefits of each. We wanted to eliminate the need to wait until you’re back at a computer to research your options or read the fine print of each plan.
Part of navigating the health insurance world is understanding complicated terminology. Did users voice this concern during testing?
They did. While we pride ourselves on putting health insurance terms into language that users understand—like translating “deductible” to “the amount you’ll pay out-of-pocket before your insurance company starts to pay for services”—we knew we had to go a step further and get detailed user feedback on how plan information is communicated.
What was your approach?
Using Validately’s Unmoderated Talk-Aloud tests, we assembled plan cards that detailed the benefits of various health insurance plans using different formats and terminology. We paired these with simple questions to get user feedback.
After sourcing testers from Validately’s user panel by targeting demographics most similar to our own user base, we shared the cards with each test participant. We recorded video and audio reactions to each card, noting the moments when users paused or struggled—or showed very clear understanding. This research helped us better understand the information that was successfully and quickly communicated, and what information was hard to parse.
How did these insights influence the product?
Our research led to two key changes. First, we updated the plan summaries with a few additional data points and changed the visual layout to make them easier to scan when stacked in a list. This saves our users time when reviewing different options so that they can figure out which plans are worth exploring. Second, we added explanatory sections to key data points to help educate users in the context of the decision they were trying to make about a plan. For example, we made it clearer that our estimated care costs are a forecast of out-of-pocket expenses on top of monthly premium payments. This number is designed to help our users assess each plan in the context of what they might spend in a normal year, so they can find one that minimizes their costs.
Are there metrics you can point to that show this new development/design has been successful?
The top-line metric we watched was the user conversion rate—that is, how many users chose to purchase a plan on Stride before and after the redesign. For us, that’s a proxy for how much trust and confidence users have in our recommendations.
We also look at qualitative feedback. Do users feel like they are more educated after using Stride? Do we see a reduction in the common questions we received and sought to address in the redesign? Did people respond with understanding instead of confusion?
Would you say Validately was key to these developments?
Absolutely. Validately enabled two things we hadn’t been able to do previously. First, it allowed us to more effectively measure how well our product was building trust and confidence with users. Because Validately let us watch users’ reactions as they were using our product, we gathered valuable qualitative feedback that led to a successful redesign. Second, Validately allowed us to frame our tests so that users were placed in the right context—specifically, the annual purchase of health insurance. This meant that the feedback we received was far more actionable.
What prompted you to use Validately for this user research?
A design contractor told us about them and recommended we give their testing a try. We all agree it’s been very successful, and we’ll continue to use them for research as we hone our product design and user experience.
Dan Slate leads Product at Stride Health in San Francisco. Before joining the team at Stride, he spent several years building healthcare and financial services products at Intuit, Atigeo, and Narayana Health.