
Validately’s Weekly UX and User Research Roundup


Profiles in UX Research: Canopy Tax
By Jeffrey Steen

In Validately’s latest edition of Profiles in UX Research, we profile Nate Sanders from Canopy Tax and discuss:

  • How tester recruiting and remote moderated testing are integral to Canopy’s product development
  • Why they chose and use Validately
  • Specific Canopy product features that have been improved through moderated testing


Nate Sanders has been Product Team Lead at Canopy since January 2015. Previously, he worked as a UX designer at BambooHR. A UX designer for most of his career, Sanders is very much in favor of a design philosophy that centers on user experience, instead of clinging to inflexible waterfall development.



Never Show A Design You Haven’t Tested On Users
By Ida Aalen
User testing is a way to be there when it happens, to make sure the stuff you created actually works as you intended, because best practices and common sense will get you only so far.

Delight Your Users – A Definitive Guide to User Testing
By Brian Bimschleger
This article reviews what usability testing is, answers your questions and helps you understand how to gather useful and actionable data from your users.

A Simple Checklist for Successful Design Handoffs
By Marcus Castenfors
Review this checklist to ensure a successful customer experience.

The Principles of UX Design: Chapter 1
By Timothy Embretson
This is a great article for those who’ve heard about UX and want to know what it is, why it’s so important, and how to become an advocate for it.

Want to learn more about Validately?



Profiles in UX Research: Canopy Tax
By: Jeffrey Steen

Canopy Tax launched in summer 2014 as a cloud-based practice management and tax resolution solution designed to help practitioners with workflow, client communications, document management, and client invoicing. In addition to facilitating management, Canopy offers a tax resolution solution that helps practitioners guide clients through the IRS collections process.

Product Team Lead Nate Sanders talked to Validately recently about how user recruiting and moderated remote testing have been integral to product development since the early days of the company.

What was the impetus for creating a cloud-based, SaaS tax solution for tax professionals?

Shortly before Canopy Tax launched, our founder, Kurt Avarell, was working as an attorney handling tax resolution for clients. The problem was, he found himself drowning in seven or eight different tools as he worked. They were all necessary, but they didn’t work together. Based on this experience, he saw the need for a product suite with all of this functionality in one place.

Also, tax software has remained relatively stagnant in recent decades, so there was plenty of room for innovation—particularly in the cloud.

Tell us about how you weave user research through development and iteration at Canopy.

Our main goal is to ship theories instead of hypotheses. Everything about our product development and discovery process closely mirrors the scientific method for this reason. This has been critical to us since day one.

The first thing we do when we’re pursuing a new idea is dive into user and market research. We spend a lot of time with our users understanding what “jobs-to-be-done” they have. We want to make sure we understand the pains they experience in their work every day, what possible product offerings would add value to their work, and what would make us competitive in a large market.

Once we’re confident we have a solid understanding of what will be valuable to our users, we dive into crafting a hypothesis. The product manager, engineers, and UX designers all collaborate to drive toward the right solution. We then create high-fidelity, highly functional prototypes and validate our ideas with our users. Working off the feedback and observations, we iterate through several rounds of hypothesizing and testing until we’re confident in the solution we’ve created.

When did you decide to utilize Validately’s moderated remote testing for this development?

We’ve used Validately for about a year now. Steve Cohn, Validately’s CEO, spoke at a conference around that time, and we decided to switch from Blue Jeans video conferencing to Validately’s more robust moderated remote testing tools.

Does this supplement in-person testing?

We haven’t really done a lot of in-person testing, to be honest, and we don’t feel we need to. We can get more user breadth and talk to more people using the video and audio engagement of remote testing.

You’re also using Validately for user recruiting. What does that process look like for you, and what kind of user are you targeting?

When we first started user recruiting, we Googled names of tax professionals and compiled lists of contact information. Then, we cold-called each one and asked them to engage in a testing session—while also trying to convince them we weren’t selling anything. That was rough, and it sucked up a lot of our time. After bringing on Validately, it only made sense that we’d utilize their user recruitment. We worked with them on a screener to be sure we were getting the right user base, then turned our focus to product improvement.

Once you have your users lined up, how do you structure your tests?

Before we ever start testing, the UX designer prepares a testing script for each session. The script details a series of task-based scenarios that can deductively tell us whether the user can accomplish their goals with the solution we prototyped. We prep our users that we’re not going to help them very much, and that they might struggle with the tasks we give them. It’s crucial for us to be able to see where the user struggles, what’s painful, and what we’re doing exceptionally well. Throughout the entire session we ask the user to “talk aloud” and give us feedback on our prototype.

Do you conduct similar testing for live products?

Not a lot. Because we do everything we can to incorporate discovery and user testing throughout the entire development life cycle, we don’t have a lot of usability issues when the product ships. We measure twice and cut once. We solicit and normalize feedback after every single product release, and fine-tune everything we possibly can. We do some testing focused on “shoelacing” the different features within our product suite to ensure everything within Canopy is intuitive and works together seamlessly.

Speaking of which, what is the balance of qualitative vs. quantitative feedback you’re receiving during user testing?

We really value goal-directed design, and we feel it defines our user experience strategy. Under goal-directed design, you work to understand the user’s vision and desired end state, and then you do everything you can to ensure that arriving at that end state is as delightful and enjoyable as possible. Given that approach, we focus a lot on observing how well the user can accomplish their goals and how satisfied they are doing it. We balance that with qualitative feedback. Because Validately’s remote testing allows for both audio and video recording, we can easily note every reaction a user has.

What are some Canopy Tax features you have improved using moderated remote testing?

A core part of our application is completing and filing IRS tax forms. Filling out tax forms is something that really hasn’t changed for decades, and we wanted to enhance the entire user experience for these tasks. We used Validately to test and iterate through several different prototypes to find out what would be most valuable to practitioners as they completed these tax forms. Based on our findings, we saw that practitioners were entering the same information multiple times, and that they needed an easy way to see how all the data they were entering flowed onto the IRS tax form. Our final solution cut down the amount of data entry for practitioners dramatically and allowed them to quickly toggle between a web form and the IRS tax form.

What features of Validately were particularly helpful in delivering these improvements?

Capturing faces during video recording is huge for us. We can watch facial expressions—which is invaluable feedback that’s often lost when you’re just doing audio recording, or when doing in-person testing and the tester is not paying attention. When users are silent, these facial expressions indicate how they’re feeling and show us where product features are not intuitive or where certain features are a pleasant surprise.

The ability to flag moments within videos has been huge, too. It makes it easy to revisit specific user responses that indicate pain points. The only thing that we would like to see is the ability to tag or title flags, so we can then query them to see how often certain topics/tags appear in a testing session. [Editor’s note: This is a feature that is on Validately’s short-term development road map.]

For those who are looking to engage in user research, what recommendations or suggestions would you offer?

In short: measure twice, cut once. Don’t fall victim to the fallacy that user testing takes too much time or that it will slow you down; it’s a lie. Your job is to create products people want and love. What you come up with on your own isn’t enough. It’s a conjecture. Get outside and validate your assumptions. The product development team at Canopy has moved faster than any team I’ve ever been a part of because of this philosophy. With a lean testing tool like Validately, we have been able to uncover prototype pain points quickly and move from shipping guesses to shipping theories.


Nate Sanders has been Product Team Lead at Canopy since January 2015. Previously, he worked as a UX designer for BambooHR. A UX designer for most of his career, Sanders is very much in favor of a design philosophy that centers on user experience, instead of clinging to inflexible waterfall development.






Profiles in UX Research: Ghostery

Written By: Jeffrey Steen 

Launched in 2009, New York-based Ghostery has spent the last seven years developing solutions designed to increase the visibility of digital tracking technologies. On the consumer side, Ghostery offers a browser extension that gives web users the opportunity to control who gathers information on their browsing habits. On the enterprise side, businesses can refine their digital presence by using Ghostery to streamline ad deployment and ensure advertisers are not delivering malicious or unexpected content.

Emmy Southworth, Senior Experience Designer with Ghostery, chatted with Validately recently about how continued growth of these products is dependent on user research, and how that research is executed.

What product is your primary focus at Ghostery?
Over the last nine months, I have worked mostly on the Ghostery browser extension. This extension allows users to see what tracking technologies are observing their online behavior, serving up ads, or slowing down websites. But more than just an informational resource, Ghostery allows users to customize their browsing experience by deciding which tracking technologies to block and which to allow.

For an existing product like the browser extension, how does design iteration happen and at what points do you employ user research?
User research is a part of the entire process. Using an existing user base, we conduct surveys and record interviews to determine what new features would be helpful and what existing features need improvement. We also interview users of competitor products to get more insight into the market. After this, the product team reviews recordings and survey results to determine what the next steps should be.


Ghostery used Validately’s moderated remote testing to gather two key pieces of feedback: the Firefox extension needed a more graphical treatment (as seen above), and red should be used to highlight which tracking technologies are blocked, not simply which are active during browsing.

When you have an idea of what these new features will look like, at what point do you bring new designs back to the user for further testing?
Fairly early on. We often introduce wireframes or working prototypes to users after we map out new features to see if we’re on the right track. This is where moderated remote testing is helpful, as our team members are located in different offices. With Validately’s tools, we are able to connect all of our team members during testing sessions to ensure everyone can observe user reactions.

Who is involved in these moderated remote testing sessions?
The product manager and I will usually be in the room with the user during testing. Developers or remote team members will connect to sessions from their desks via Validately.

How do you structure the testing sessions?
I like to use rapid, iterative testing. I set up the same test with three or four different users and ask the same questions of each one. Using Validately, I can flag moments in the interview that are either successes or areas where our concepts could use improvement. During the testing session, I encourage users to speak out loud and explain what they’re thinking and why they’re making certain decisions. This helps us determine how we should restructure features or shift designs to make our product more intuitive. Thanks to Validately, we can go back later and reference flagged areas in the recorded session.

Can you give us an example of a specific feature or function that changed as a result of Validately-based moderated remote testing?
We recently launched an update to our Firefox extension, designed to make our interface more intuitive and appear less “techy.” In a dashboard visualization, I used the color red to show certain types of tracking technology. This created some confusion; many users thought this meant the technologies were already blocked, when in reality, it was simply informing them about which types of technologies were being used. After observing user reactions to this, we decided we needed to switch the color to something more informative.


Ghostery’s Firefox extension (v5.4 at left, v6.3 at right) shows how new labels help clarify which tracking features are blocked for users, and how each type of tracking technology is classified.

How was Validately key to this discovery – and other product updates?
Validately’s video recording sessions allowed us to clearly observe user reactions and pinpoint product weaknesses. I’ve worked with some tools that add their logo to the screen as a watermark, and to be honest, it interferes with testing. Because Validately doesn’t have that, our software can be the sole focus of user testing. I don’t have to worry about other variables, and that’s huge.

What advice would you give to smaller companies looking at launching effective user research?
Utilize moderated remote testing for rapid feedback and lean software creation. It gets the conversation out of the boardroom and focuses on the user. Also, be sure to get a good recruitment process in place so that finding users to test doesn’t become something that slows you down. Validately offers to help with recruiting, but you need to be proactive in finding users to test your software. Create and maintain a list of people who want to help. Recruiting is worth the investment.

How did you hear about Validately?
Our product managers were using Validately when I joined Ghostery. They discovered it at a tech conference. We plan to keep using it, as it has made our testing—and consequently, our design—very efficient.

# # # 


Emmy Southworth has been the Senior Experience Designer at Ghostery for nine months. Before that, she was the UX Director for Workfront. She has been working in UX/UI design for 15 years.


Profiles in UX Research: Dan Slate of Stride Health

Written by Jeffrey Steen

In the complex world of healthcare, it’s critical that individuals have a simple way to find, purchase, and use their health insurance. Stride Health addresses this challenge by offering the world’s only mobile-first health insurance recommendation engine, designed to help individuals find and purchase a health plan.


Stride Health Team

In order to deliver this experience, Stride’s UX team had to conduct well-designed user testing. In the second segment of our ongoing series on user research, Dan Slate, Stride’s Lead Product Manager, discusses how Validately’s Lean model of user research and intuitive approach allowed for significant improvements to the UX design of Stride’s product.

What motivated you to conduct user research?
As is true for most product teams, user research has been critical to our process of uncovering key customer insights, ultimately leading us to product improvements.

Why did you choose Validately to help facilitate that research?
Conducting user research on the process of shopping for and purchasing health insurance requires that we put users in the right context to provide relevant feedback. Validately’s tests enabled us to do this easily while also rapidly experimenting with different designs to determine what was most intuitive.

One of the two testing models you used was a Moderated Discussion. How did you structure that and what were your goals?
The Moderated Discussion allowed us to work through the entire onboarding process for new users. Each person builds a profile based on unique medical and personal data that help us assess their health needs and recommend the best plan for them to minimize their total healthcare expenses.

Requiring users to complete this experience prior to answering questions about the plan recommendation and comparison experience was critical to putting them in the right context. Validately enabled us to do this in our live product so we could observe user interactions in a very organic way. The Moderated Discussion also enabled us to maintain some structure to the test so we could prompt the users with questions at specific points to gather the feedback we needed.

What feedback stood out during these tests?
We consistently heard that our users wanted to know what else we considered before making our recommendation. They also wanted to know what the cheapest plan was as a benchmark for comparison, and if there were similar plans from an insurance company they felt had a better brand. In short, we learned that presenting various insurance options helps our users determine if the plan we recommend is actually the best plan for them.

How did this change the UX design of Stride?
We now show two options alongside our recommended plan. These options reflect our research insight, offering a plan with the lowest premium for cost-conscious shoppers, and one similar to our recommended plan but offered by a different insurance company.

Given these new options, did you encounter new design challenges during testing?
Yes. We heard lots of users say they had difficulty comparing plans. While side-by-side comparison of physical goods is quite easy, it’s hard to do in the digital sphere, particularly on mobile devices and for information-rich products like health insurance.

How did you approach user testing to solve this comparison problem?
We looked at usage data for our product and noticed users were digging deep into a plan’s details, then backing all the way out to select another plan to dig into. Some users would repeat this behavior five times before settling on a final choice. The conclusion was obvious: It was time to take on the design challenge of a mobile-first side-by-side comparison for health plans.

This new feature allowed our users to place two plans side-by-side on their mobile phone and compare the specific benefits of each. We wanted to eliminate the need to wait until you’re back around a computer to do research on your options or read the fine print of each plan.


Stride’s current plan recommendation and comparison screen (L) alongside the old recommendation screen (R)

Part of navigating the health insurance world is understanding complicated terminology. Did users voice this concern during testing?
They did. While we pride ourselves on putting health insurance terms into language that users understand—like translating “deductible” to “the amount you’ll pay out-of-pocket before your insurance company starts to pay for services”—we knew we had to go a step further and get detailed user feedback on how plan information is communicated.

What was your approach?
Using Validately’s Unmoderated Talk-Aloud tests, we assembled plan cards that detailed the benefits of various health insurance plans using different formats and terminology. We paired these with simple questions to get user feedback.

After sourcing testers from Validately’s user panel by targeting demographics most similar to our own user base, we shared the cards with each test participant. We recorded video and audio reactions to each card, noting the moments when users paused or struggled—or showed very clear understanding. This research helped us better understand the information that was successfully and quickly communicated, and what information was hard to parse.

How did these insights influence the product?
Our research led to two key changes. First, we updated the plan summaries with a few additional data points and changed the visual layout to make them easier to scan when stacked in a list. This saves our users time when reviewing different options so that they can figure out which plans are worth exploring. Second, we added explanatory sections to key data points to help educate users in the context of the decision they were trying to make about a plan. For example, we made it clearer that our estimated care costs are a forecast of out-of-pocket expenses on top of monthly premium payments. This number is designed to help our users assess each plan in the context of what they might spend in a normal year, so they can find one that minimizes their costs.

Are there metrics you can point to that show this new development/design has been successful?
The top-line metric we watched was the user conversion rate—that is, how many users chose to purchase a plan on Stride before and after the redesign. For us, that’s a proxy for how much trust and confidence users have in our recommendations.

We also look at qualitative feedback. Do users feel like they are more educated after using Stride? Do we see a reduction in the common questions we received and sought to address in the redesign? Did people respond with understanding instead of confusion?


Stride’s current comparison screen for all plans (L) alongside the old comparison screen (R)

Would you say Validately was key to these developments?
Absolutely. Validately achieved two things we hadn’t been able to previously. First, it allowed us to more effectively measure how well our product was building trust and confidence with users. Because Validately allowed us to watch user reactions as they were using our product, we were able to gather valuable qualitative feedback that led to successful redesign. Second, Validately allowed us to frame our tests so that users were placed in the right context—specifically, the annual purchase of health insurance. This meant that the feedback we received was far more actionable.

What prompted you to use Validately for this user research?
A design contractor told us about them, and recommended we give their testing a try. We all agree it’s been very successful, and we’ll continue to use them for research as we hone product design and user experience.

Dan Slate leads Product at Stride Health in San Francisco. Before joining the team at Stride, he spent several years building healthcare and financial services products at Intuit, Atigeo, and Narayana Health.