Usability Testing

One of the most valuable ways to make sure your application is actually solving the problem you set out to solve is to put the application (or a prototype, even on paper!) in front of an actual user (or a potential user, or anyone who will hold still long enough to try it out).

Planning for the test

When setting up your usability test, remember that you need to follow these steps:

Pick the tasks you want to test

You should focus on a small number of tasks (three is a good number) that you would like your users to complete. These tasks should cover things that are important to your application rather than things that are ubiquitous - unless you have done something unconventional, having someone test your login or registration process is unlikely to yield many insights, since those are tasks that people have done many times, in many contexts. It's much more useful to have them execute tasks that are unique to your value proposition. If you are building a platform for sharing recipes, having users create a recipe, or look for a recipe they want to make, is more useful than having them set a profile picture.

Build the tasks into a scenario

Create a context for the user to imagine as they work through your tasks, one that gives them the information they need to complete them. To expand on the recipe example, if the task is to find a particular recipe, you might create the following scenario: "Imagine that you need to put together dinner in a hurry, because you need to get to a club meeting in 40 minutes. You're allergic to peanuts, and you are trying to cut carbs. See if you can find a quick, low-carb recipe that does not contain peanuts." This helps ensure that the test exercises filters, tags, searches, etc. - without explicitly mentioning those elements or giving instructions on where to find them or how to use them.

Create a script

You should write a script that you follow for each test. This keeps you honest in a few ways - you won't forget elements of your test, and you also won't be tempted to lead the user based on previous results. The script should outline the scenarios, but it should also achieve a few additional goals.

Set the context

Why are you running this test? What do you hope to achieve? How long will it take, and are there any rules they need to follow?

Set them at ease

Let the user know that they can't do anything wrong - this test is to help you improve, and if they get confused, or break things, or can't accomplish one of your tasks, that's your application's fault, not theirs! You won't take any comments personally, and in fact are happy to know about any issues they can help you find.

Ask for permission to record

Usability tests are most useful when the whole team can see them - but having a whole team breathing down the user's neck makes testing awkward. Ask the user for permission to record the process (you don't need to record the user's face) so you can share the results with the team later.

Help establish the user's demographics

Ask some questions about the user's experience or habits as they relate to your application. If it is a recipe site, do they like to cook? Do they cook frequently? Have they used other recipe sites online?

Say Thank You!

Your users gave you their time and insight. If you were doing this professionally, you would probably give them some sort of compensation, but since you're students, make sure you're at least saying thanks for their help!

Examples

There are many examples of usability test scripts available online that you can use as a starting point for your own.

Running the test

During the test, you should avoid guiding the user - this defeats the purpose! If you show them where to click, or tell them what to look for, you will not know whether they would have been able to discover that on their own, or how long it would have taken if they did. This is difficult! If the user asks you questions, avoid giving answers related to the test, and remind them that they can't do anything wrong.

It's also important to remind them to think aloud as they run through the test - how did they decide where to click? What did a menu option suggest to them? Where were they confused, and where did they run into unexpected things?

When you're running the test, even though you have a recording (a great way to do this is to use Zoom in a recorded session and have the user test with screen share on), it's good to pay attention to nonverbal cues as well - are they scowling? Do they look confused? Are they bored?

What comes next?

You will almost certainly have plenty of feedback to drive changes in your next sprint. Look for common areas where your test subjects had difficulty, were confused, or mentioned that they had different expectations!