Dive into User Testing

An important theme in design education, user testing can feel elusive in real-world work, very often because stakeholders have never had a chance to appreciate its value while developing products.

Having finally completed my own first successful foray into user testing, I've put together the process we used to plan, execute, and conclude the research. Here's the story of our application and the user testing steps we followed.

Step 1: Define the problem(s)

The team was handed an in-progress MVP application that was being developed by a team outside our group.

“Help us fix it,” they said.

After a first pass (and a major struggle) to figure out the system that had already been built, it was apparent that many pieces weren't working. While our team was confident that lots of small tweaks would make a major difference, there were still huge questions at large. How would we determine the biggest gaps and problem pieces? What was the most vital aspect to focus on to ensure a successful MVP? We also set out to determine whether small changes really could make a huge difference.

Problem statement 1: What is the biggest blocker to users understanding the software?

Problem statement 2: Would a side-by-side comparison of the live site against a “small tweaks” improved prototype prove that little changes can have a big impact?

Step 2: Develop a hypothesis

While defining the problem gives you, as the tester, a starting place, developing a hypothesis will guide the script for the upcoming user interviews.

For context, our application was meant for use by developers, and there were a lot of complex moving parts, as well as unclear naming conventions, definitions, and flows to get from one task to the next.

Hypothesis 1: The live site's inconsistency and sparse, disorganized information will cause a lot of confusion, resulting in users getting stuck.

Hypothesis 2: In the prototype version, improvements to consistency and design patterns will help users move along the flow with less confusion.

Hypothesis 3: Regardless of the small tweaks, users will continue to face confusion on certain pages and will struggle to fully understand the task they're being asked to complete.

Step 3: Come up with secondary goals

Unlike scientific research, user testing benefits from flexibility in how you conduct interviews and approach the problems themselves. Setting goals leaves space open for unforeseen issues or solutions to surface during testing. It also gives you, as the tester, something positive to work toward as testing proceeds.

Goal 1: See an improvement from the “Live Site” to the “Prototype” in how easily and confidently users are able to move through the task at hand.

Goal 2: Discover any additional areas that don’t match expectations or cause users to get stuck or confused.

Step 4: Find your participants

The users you need can vary a lot from application to application, depending on your product and its stage of development.

Because the intended users of our application were developers, we were lucky enough to wrangle some participants from our own office. We chose six; a couple of them had some experience working with the application we were testing, but the rest did not.

Step 5: Define the methodology and script

The methodology defines how you’re going to tackle your problem and prove or disprove your hypothesis.

Our application had a very specific path to accomplish the task of deploying a sample app. It involved going through a defined set of screens, many of which required configuration before moving on to the next step. We challenged our users to work through the task: the end result would be a sample application deployed on the platform we were actively testing.

The beginning script went something like this:

“As a developer, you are proceeding with a task you're familiar with from similar tools. You've just discovered this new tool and want to try it out by going through the app's sample flow. Deploy a sample app.”

We began with the live site, helping users along the way when they got stuck. After the task was accomplished, we ran through a series of questions using a Likert scale (rating each question 1–5 by difficulty). Users were then asked to accomplish the same task using the prototype (built in InVision), again followed by the same questionnaire. Finally, we asked users what differences stood out to them the most and whether they had any additional feedback.
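
To give a sense of how paired questionnaire results like these can be compared, here's a minimal Python sketch. The numbers are invented for illustration, not our actual data:

```python
# Hypothetical Likert responses (1 = very easy, 5 = very difficult):
# one list per participant, one rating per questionnaire item.
live_site = [
    [4, 5, 3, 4],  # participant 1
    [5, 4, 4, 5],  # participant 2
    [3, 4, 4, 4],  # participant 3
]
prototype = [
    [2, 3, 2, 2],
    [3, 2, 3, 3],
    [2, 2, 2, 3],
]

def mean_per_question(responses):
    """Average the difficulty rating for each questionnaire item."""
    return [sum(col) / len(col) for col in zip(*responses)]

# Compare round one (live site) to round two (prototype), question by question.
for q, (before, after) in enumerate(
    zip(mean_per_question(live_site), mean_per_question(prototype)), start=1
):
    print(f"Q{q}: live {before:.1f} -> prototype {after:.1f}")
```

A per-question average like this makes it easy to spot which steps of the flow improved the most between the two rounds.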

Step 6: Test

Next came the fun part. After researching various user testing tools, our team settled on Lookback, a tool created by former Spotify designers.

Lookback has a desktop application that records the screen during the interview while the camera records users' reactions. You can later add comments to the recordings where needed, as well as create snippets of important moments that can be shared with others. This saves your report's audience tons of time, since they don't have to sit through every minute of the recordings.

This suited our purpose well because we were conducting in-person, moderated tests from one laptop. Tests took about 45 minutes to an hour.

Step 7: Review the results

Many of the things we hypothesized would be pain points were in fact confirmed, but we also received some very affirming feedback on the improvements from the live site to the prototype.

As a user testing newbie, one of the most enlightening and satisfying things to experience during the tests was the unexpected feedback and ideas around what could be better. As someone who would not be a user of this particular application, my understanding of some areas was very gray and fuzzy, but the developers were able to add a lot of definition and insight into what they might have expected at certain places in the flow, or which things just didn't quite measure up.

We pulled these observations together by taking key snippets out of the recordings, calling out specific, relevant quotes, and keeping tight notes on what happened during each user test.

An overlapping graphic of the results from the two Likert scale questionnaires made clear how significant the small changes between the live site and the prototype had been.
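
If you'd like to produce a similar overlay yourself, a quick matplotlib sketch (again with invented numbers, not our data) can plot both sets of per-question averages on the same axes:

```python
import matplotlib.pyplot as plt

questions = ["Q1", "Q2", "Q3", "Q4"]
live_means = [4.0, 4.3, 3.7, 4.3]       # hypothetical averages from round one
prototype_means = [2.3, 2.3, 2.3, 2.7]  # hypothetical averages from round two

# Overlay the two rounds so the gap between them is visible per question.
plt.plot(questions, live_means, marker="o", label="Live site")
plt.plot(questions, prototype_means, marker="o", label="Prototype")
plt.ylim(1, 5)
plt.ylabel("Mean difficulty (1 = easy, 5 = difficult)")
plt.title("Likert questionnaire results: live site vs. prototype")
plt.legend()
plt.show()
```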

Step 8: Make recommendations

When putting together recommendations for the team in charge of driving the future of a software product, it's important to stick close to the unbiased observations made during the tests. What actions would directly address the pain points observed in the app?

In our case, all three of our hypotheses turned out to be true. The inconsistent and disorganized flow often left users confused and unsure what their next move should be. While there was a significant improvement with the prototype, it didn't solve everything. The overall recommendation coming out of testing was to implement the changes from the prototype into the live site. Specific areas, however, would need much deeper problem solving in order to redirect the flow of the app. This work would be crucial to the success of the product.

Step 9: Write the report

We wanted our final report to be easy to read and visually interesting, so we backed up our observations and hypotheses with quick access to users' recorded reactions. Check out an example of Lookback snippets here.

We picked up an amazing report template from UXPin as our baseline, and I’d highly recommend you check it out along with their other resources.


Lessons Learned

Success! My first official round of user testing was concluded. Not only did the interviews surface crucial feedback from users, but many of the design team's hypotheses proved correct and were backed up by quality evidence.

Conducting thorough user testing and asking users to accomplish a specific task revealed many pain points. That feedback gives the development team specific direction on where to take the product next.