How to write an effective app usability test script
Here at ArcTouch, acquiring user feedback has always been central to how we build apps that deliver business value for our clients and offer a delightful user experience. We do this, most often, through app user testing. Our latest ebook, “A practical guide to app user testing,” offers step-by-step guidance for designers and product teams on how to conduct user testing in both moderated and unmoderated (online) tests. In this post, I cover one of the key ingredients: how to write an effective usability test script.
A great usability test script is both simple and instructive
Why is it important that moderators have a script to follow in user tests? For one, you want to make your tests as consistent as possible, which helps ensure you don’t bias an individual participant’s responses. And making sure that each user answers the same questions lets you see similarities and differences across user responses, which will help you draw stronger conclusions from your study.
Ultimately, a script should be simple but provide enough instructions to help the facilitator guide the participants during a moderated session — or provide all the information the participants need to know in unmoderated (online) sessions. Here are the sections a script document should include, with recommendations on what to include in each of them:
✔ TIP: To make the script easier to read during a moderated test, you can use a different text style or color to separate instructions from the actual script that will be read to the participant.
1. Background information
This section is intended for facilitators and anyone else who reads the usability test script; it should not be shared with test participants.
We typically include background information about the test dates, the client, the number and type of participants, version history, and testing methodology. We also describe what is being tested, the goals of the test, and how much time it should take per participant. It is also good to include information about participant incentives and compensation.
Example of an introduction section from an ArcTouch client user test script.
2. Introduction
This section provides a script for the moderator to welcome participants and give them an overview of what to expect in the test. If the test is moderated, we also introduce the facilitator and any companies involved.
At this point, it’s good to let the participants know there are no right or wrong answers. This will help them feel more comfortable. Also, encourage users to think aloud while they perform the tasks. You’ll want to understand their thought process, their actions, and how they are feeling.
It’s also important to ask for permission to record the session, and make sure you get explicit verbal consent before recording.
✔ TIP: In an in-person moderated test, you should give the participants the opportunity to ask any questions before starting.
3. Pre-test questionnaire
In this section of the user test, your primary goal is to make your participants feel comfortable.
In the usability test script, you’ll include questions about basic information, such as name, age, occupation, and any other demographic data relevant to your test. During moderated tests, simply asking participants to confirm information you already have (“Please confirm your name is [NAME].”) should be enough.
Then ask any general questions related to your project. For example, if you are testing a travel app, you can ask how often the participants fly and if they use any apps to book and plan trips.
4. Tasks and scenarios
Tasks are the actions you will ask participants to take on the device or other testing interface. Each task should have a goal.
Creating scenarios helps participants engage with the interface and imagine how they would use it in real life. A user goal and task might read like this:
- Test goal: Browse hotel rooms and book one.
- Poor task example: Book a room at a hotel.
- Better task example (with a scenario): You’re planning a vacation to Bangkok from September 3 to September 14. You need to book a hotel for your stay. Go to the app, review the information and book a room that you think is best.
✔ TIP: Always allow the users to go back and read the task as many times as they need.
Tips to write better tasks
Trying not to bias user behavior is difficult, but it is important if you want to get useful results. Here are five tips for writing more effective, neutral tasks:
- Make the task realistic to help participants engage with the interface. Create scenarios that mimic the real world as much as possible. Don’t force them to do something they wouldn’t normally do. For example:
- Test goal: Browse sale items on levi.com.
- Poor task example: Purchase a pair of white, high-rise Levi’s jeans on sale.
- Better task example: Purchase a pair of Levi’s jeans for less than $20.
- In a real-world context, users would probably browse before choosing what they want to buy. In the first example, we are not giving the user the opportunity to browse for what they would normally choose. Instead, we are telling them what to do. They will focus on finding the pair of jeans we told them to find, and may not engage with the interface the way they typically would.
- Don’t force participants to interact with a specific feature. Instead, attempt to find out how they would choose to use the interface. This way, you’ll learn if they can find the features and use them as you expected.
- Test goal: Find a recipe.
- Poor task example: Use the search bar to find an Indian recipe.
- Better task example: You want to cook Indian food today. Use the app to find a recipe.
- Avoid adding clues or describing the steps. It is better to offer context for the scenario than to provide clear steps to achieve the goal. The idea is to let participants navigate the interface by themselves.
- Test goal: Keep track of the progress on a book.
- Poor task example: You want to update the progress on the book you’re reading. Go to the app, search for the book, add it to your list and update the progress you have made.
- Better task example: You started reading a new book and don’t want to lose track of where you finished last time. Use the app to update your progress.
- Don’t bias participants by using the same language they can easily find in the interface you’re testing. For example, if the interface showcases a button with the label “Join Free for a Month,” you shouldn’t use the same wording in the task:
- Test goal: Try service for free.
- Poor task example: Go to the website and join for free for a month.
- Better task example: You want to try this service for the first time. Go to the website and sign up.
- Be as direct as possible with how you phrase the tasks. Look at these two examples:
- Test goal: Book an appointment.
- Poor task example: Now, see if you can try to find a way to book a nutritionist appointment.
- Better task example: Now, find a way to book a nutritionist appointment.
- The first one, with the extra phrase “see if you can try to,” gives the impression that the task will be difficult to achieve. It may even hint that you expect a certain response because the interface has a poor experience or design.
✔ TIP: If necessary, you can ask questions between tasks. It is better if you wait until the participant finishes the task so you don’t distract them.
5. Post-test questionnaire
With the core test complete, the questionnaire portion of the usability test script guides the moderator to ask follow-up questions. In this section, the moderator will want to probe in areas where responses were unclear or ambiguous.
You can ask participants high-level questions about the product or a specific feature, or if they expected something to be different. You can also ask open-ended questions about new features they would like to use.
6. Wrap-up
In this section of the usability test script, you’ll thank the participants for their time. And you can ask users if they have any questions for you. Last, you should confirm with your users the details of their compensation.
Want to perform user testing on your app?
Contact us today to learn how we’ve successfully performed user testing during product development — and to explore how user testing can help you.