icon for podpress  Screen Space 19: Usability & Usability Testing 101 Part 5— Conducting the Testing [16:17m]: Play Now | Play in Popup | Download

[Podcast Transcript]

Welcome to Screen Space, your podcast about creating usable, accessible, effective, and efficient web, blog, and digital media design for the everyday (and non-expert) designer. This is episode 19 of Screen Space “Usability & Usability Testing 101 Part 5—Conducting the Testing.” In this episode, I discuss the fourth step of usability testing—what to do before, during, and after the testing. There will be one more part to this series, where I will discuss analyzing and utilizing the results from the testing.

If you have not listened to the previous parts of this series, you may want to go back and listen. In the first part, Screen Space 11: Usability & Usability Testing 101, I discuss usability, provide a definition of usability testing, and outline the steps to conduct a usability test. In Part 2, Screen Space 12: Usability & Usability Testing 101 Part 2—Selecting Users, you can find information on selecting your users for usability testing. In Part 3, Screen Space 17: Usability & Usability Testing 101 Part 3—Deciding what to Test, I discuss the steps to setting objectives and selecting tasks to test. In Part 4, Screen Space 18: Usability & Usability Testing 101 Part 4—Preparing the Testing, I provide information on getting ready to do the testing. You may also find Screen Space 10 on User-Centered Design helpful.

I am your host, Dr. Jennifer L. Bowie. I conduct research and have taught in areas related to digital media, web, and blog design. Previously I mentioned being an assistant professor at GSU. However, this is no longer the case and I am currently looking for a job in usability, user-centered design, and/or social media. Stay tuned and I’ll provide details at the end of this podcast.

Welcome, welcome, welcome my new listeners from Germany, India, and Australia.  Thanks for listening and design well!

In this episode, I present the fourth step in usability testing: conducting the testing. This includes what to do before, during, and after the testing. I will use the same example I used in episodes 11, 12, 17, and 18—testing a photography blog. We’ll imagine we have a photography blog with a decent-sized audience. We want to get more users and see how usable the blog is for the current users. By this point in the series, we have figured out which user profiles we will test (part 2), we have designed the testing (part 3), and we have prepared for the testing (part 4). So, let’s figure out how to conduct usability testing. There are three steps.

Before the Testing

It is test day, the environment is set up, and we are ready to test. In walks our first participant. Our first step is to greet and brief them.

Step 1: Greet & Brief participant

  • Read/say welcome: In part 4, we wrote our welcome and intro to the testing. Here is where we read it. Remember, it is important to treat each user the same so as not to impact the testing, so try to make the welcome as consistent as possible.
  • Emphasize that you are not testing them: While we call this usability testing, this is not a test of our users. This is not something they can pass or fail. This is a test of our site, and a test our site can pass or fail. Our participants, however, will often think this is a test of them and will often try to “pass” the test. This will skew the test results, as regular users do not try to “pass” tests when using websites or other media. So, you need to emphasize that this is not a test of the participants but of the product, and that they should act as naturally as possible. You can tell them that this is a test of the site and that they are helping you determine how usable it is. I also like to emphasize that what I learn from watching them use the site will help me make the site more usable in the future.
  • Explain think-aloud protocol (if you are using it): Think-aloud protocol is a fancy name for having the participants think out loud while they are testing. This is a controversial thing to have participants do. Those for it say it gives you a good idea of what users are thinking while they are thinking it, which can be very helpful in usability testing. However, those who are against it have many valid points. Think-aloud is not natural. Most people do not, for example, think out loud when they are using websites. This means our users’ use of the websites will change, at least a little, because they are thinking out loud. It may slow them down, and they may do things differently. Also, even when thinking aloud, many people will not vocalize all of their thoughts. They may be thinking swear words, but they will self-edit into something else that seems more proper. So, a “what the beep, this is a beeping beepy site” may become “Huh? This site is not good.” We get the negative comments, but not the full extent of the negativity. In addition, think-aloud may be harder for shy people. Often, people against think-aloud suggest you videotape the testing and then watch the video with the participant right after, asking them to tell you what they were thinking. This takes more time, and they may not be as accurate in remembering what they were thinking. So, figure out if you want to use think-aloud and, if so, explain it to your participants. I tend to go with think-aloud, as it is quicker and does give an idea of what the participants are doing and why.
  • Emphasize how the user will tell you she has completed a task: It is very important that you have a way for participants to tell you when they have completed a task. Without this, you will not know when to stop timing or when they have moved on. It is also good for them to have some method of showing they are done, because without it, they may start the next task and then go back to earlier tasks as they think of something else to do. However, once they have shown they are done with a task, they are less likely to go back to it. I put in each task a statement asking participants to tell me when they are done. You could also ask them to indicate it by closing the window, putting a pen down, writing something down, doing 10 jumping jacks, whatever. I’ve found the “done” statement works quite well. Here is an example of what one of the photography blog tasks would read like with my “done” statement.

You know that this blog provides photography tips and you recently took a picture in low light that did not turn out very well. See if you can find a photography tip on the blog that will help you take better low light pictures. When you have found this tip, please say “done.”

It is a good idea to make sure this is discussed in the welcome statement, even if you include the information in the tasks.

  • Stress that the testing is anonymous: There is rarely a reason for your testing not to be anonymous, and there are many reasons for it to be. Your participants will feel more comfortable if they know it is anonymous and that you will not be blogging about your dumb user Jane who can’t even find the search engine. Anonymous participants are the norm in ethical research, and the anonymity provides a certain level of protection. So, make sure your participants are anonymous in any reporting of your testing, and make sure you tell them this will be the case. It is a good idea to include this in the welcome statement and to put it on the consent form.

Step 2: Have the user sign the consent forms

Once you have welcomed the participant and explained the testing, have them read and sign the consent forms. These should be laid out and ready to be signed with a pen nearby.

Step 3: Conduct any pre-testing surveys or interviews

In part 4 of this series, you designed your pre-testing surveys or interviews. Once you have welcomed your participant and she has signed the consent form, give her the pre-testing survey or interview.

During the Testing

We are now ready to conduct the actual usability testing! Make sure the participant knows where the tasks are or how they will get them and have them start the testing.

Observe and record data:

You, or whoever has this task, need to observe the testing and record your data. Set up cameras or microphones to record the screen, facial expressions, participants’ statements, and so on. Take notes on what they do and what they say. Record the time it takes for each task, if this is data you are collecting. Record any other data you are collecting, like success rate, wrong clicks, and so on. You cannot analyze the data without recording it in some way. So, make sure you record it.
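If you are comfortable with a little scripting, the note-taking above can be supported by a small helper that timestamps each task and writes the results to a spreadsheet-friendly file. This is just one possible sketch, not part of any standard usability toolkit; the class name, field names, and columns here are all my own invention for illustration.

```python
import csv
import time

class TaskLogger:
    """Hypothetical helper for recording per-task usability data:
    task time, success, wrong clicks, and free-form notes."""

    FIELDS = ["task", "seconds", "success", "wrong_clicks", "notes"]

    def __init__(self):
        self.rows = []
        self._task = None
        self._start = None

    def start_task(self, task_id):
        # Begin timing when the participant starts the task.
        self._task = task_id
        self._start = time.monotonic()

    def end_task(self, success, wrong_clicks=0, notes=""):
        # Record one row when the participant says "done".
        elapsed = time.monotonic() - self._start
        self.rows.append({
            "task": self._task,
            "seconds": round(elapsed, 1),
            "success": success,
            "wrong_clicks": wrong_clicks,
            "notes": notes,
        })

    def save(self, path):
        # Write all recorded tasks to a CSV file for later analysis.
        with open(path, "w", newline="") as f:
            writer = csv.DictWriter(f, fieldnames=self.FIELDS)
            writer.writeheader()
            writer.writerows(self.rows)
```

During a session, you would call `start_task` when you hand the participant a task and `end_task` when they say “done”; after the last participant, `save` gives you one file per session that is ready for the analysis step in the next episode.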

Act appropriately:

During the testing make sure you:

  • Are unbiased (especially the Facilitator/Briefer): This may be the hardest step. When testing, it is key that you do not act in a biased way. If you are testing your website, blog, or other media, you likely have a lot invested. It is your “baby,” and you may have a hard time hearing people speak poorly of it. However, if you react badly when someone calls your “baby” ugly, this may change how the participants act and thus impact your results. It is also not uncommon for those conducting the test to know of problems or have their own issues with the site, like hating the color scheme, and to ask leading questions or in some other way direct the user to feel and say the same. Do not do this! Be as unbiased as possible. This way you will get the best results.
  • Intervene carefully: Generally, you should avoid intervening in the testing. It is hard to watch a participant struggle to find something on the site and not point out something they’re missing that would help. However, you will not be standing over your users’ shoulders when they are struggling in the real world. You need to see how people really use your site and watch how they solve problems. So, do not help them. Sometimes users will ask you questions, and these can be very hard not to answer. Come up with some neutral answers that will direct them back to performing as they would on their own. You could say something like “How would you do this at home?”, “How would you solve this on your own?”, or even “It will help me the most to see how you would figure this out on your own.”

After the Testing

Once the user has finished your tasks, you have three things to do:

  1. Follow up on the testing: Should you have any questions from the testing—maybe why the user did something—now is the time to ask them. If you did not use think-aloud protocol, this is when you review the video with the participant and ask what they were thinking, or get this information from them in some other way. Also give them a chance to ask questions.
  2. Conduct any post-testing interviews or surveys: If you have any post-testing interviews or surveys, developed in part 4 of this series, now is the time to give them.
  3. Debrief user: Once you have finished the testing and conducted any post testing interviews or surveys, debrief the participant. Give them any additional information, pay them, let them know if they will be contacted again, and generally wrap things up. Thank them and send them on their way.

And with that, you have successfully, I hope, conducted your usability testing. Now repeat for each participant, and good luck! Let’s review the steps to conducting the testing:

Before the Testing

  • Greet & Brief participant
  • Have the user sign the consent forms
  • Conduct any pre-testing surveys or interviews

During the Testing

  • Observe and record data
  • Act appropriately

After the Testing

  • Follow up on the testing
  • Conduct any post-testing interviews or surveys
  • Debrief user

That concludes “Usability & Usability Testing 101 Part 5—Conducting the Testing.” Join me next week for the final part of this series: “Analyzing and Utilizing the Results.”

As I mentioned in the intro to the podcast, I am looking for a job. As my loyal listeners may be able to guess, I am interested in a position in usability, user-centered design, and/or social media, or another academic position teaching these areas. My preference is the Atlanta area or telecommuting, though I may consider locations somewhat nearby. If you are interested in my skills or know someone who is, please contact me at jbowie@screenspace.org and check out my portfolio at www.screenspace.org/port.

If you have questions, comments, or thoughts on what you want me to cover, please send me an email at jbowie@screenspace.org or check out the Screen Space blog—www.screenspace.org. You can also follow Screen_Space on Twitter for hints, tips, advice, news, and information on designing websites, blogs, and other digital media texts. Also, check out the blog for a transcript of this podcast complete with links and resources.

Have fun and design well!

Screen Space is licensed under a Creative Commons Attribution-Noncommercial-No Derivative Works 3.0 United States License. So, feel free to give a copy to all your users in the debriefing, but don’t change the podcast, do give me and Screen Space credit, and don’t make any money off of it.

Screen Space’s opening music today is “African Dance” by Apa Ya off of Headroom Project, and the closing music is “Survival” by Beth Quist off of “Shall We Dance”. Both of these selections are available from Magnatune.

Episode 19 Links and References:

Past Screen Spaces podcasts you may want to refer to:

Other links:

Magnatune: http://www.magnatune


 