Good usability is properly managing people's expectations. Not surprisingly, websites that are easier for people to use almost always deliver higher conversion rates because it's easier for visitors to complete their desired task.
Usability testing is the most powerful and effective thing you can do for your website. The magic of usability testing is that it removes all the personal bias that might have gone into the development of your site, ensuring that it meets the needs of the visitor - which will ultimately also meet the business goals of your organization.
Despite this obvious benefit, many organizations skip usability testing altogether, citing lack of time, lack of money, or both. There's a common fallacy that the process has to be expensive and complicated.
Oddly, it took someone who does usability testing for a living to question that notion. In his book "Rocket Surgery Made Easy," usability consultant Steve Krug (who also wrote "Don't Make Me Think") lays out a very simple, inexpensive formula for conducting your own usability tests. According to Krug, usability testing doesn't have to be that complicated, and almost anyone can do it in about three weeks, start to finish. The underlying premise is this: if you sit some people down and have them use your site, and have them think out loud while they're doing it, you'll figure out very quickly where people are getting hung up.
Here's what you'll need to create an effective, ongoing usability testing program without much money: a quiet room with a computer, a webcam, screen-sharing software, a handful of volunteer participants, and a nearby conference room where your team can observe.
That's it. No laboratory. No two-way glass. No expensive moderator.
Week 1 - Plan, Recruit, and Build Your Team
First decide what you're testing. Is it your live site, a prototype, or just wireframes? Most people doing usability testing for the first time are testing their live site. Next you'll need to figure out what tasks you want to test. Determine at least 10 (but no more than 15) tasks that a user should be able to successfully execute to get the most out of your site. Take time to select tasks based on what actual users of the site would do. These could include things such as buying a specific product in a specific color and size, downloading a white paper, or even just learning about what the organization does or finding your phone number.
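A simple structured form makes the task list above easy for the team to review and prioritize in week two. Here's a minimal sketch; the task wording, fields, and priorities are illustrative placeholders, not from Krug's book:

```python
# An illustrative usability test plan: the goal is 10-15 tasks that a
# real visitor should be able to complete. These three are placeholders.
tasks = [
    {"id": 1, "task": "Buy the blue t-shirt in size medium", "priority": "high"},
    {"id": 2, "task": "Download the latest white paper", "priority": "medium"},
    {"id": 3, "task": "Find the company's phone number", "priority": "high"},
]

def plan_is_right_size(task_list, low=10, high=15):
    """Check the plan against Krug's suggested range of 10-15 tasks."""
    return low <= len(task_list) <= high
```

A list like this also gives the internal team something concrete to vote on when setting priorities in week two.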
You'll also need to recruit your participants. But don't get too hung up on this process. While it's good to have some people from your target audience, Krug says you don't have to worry about that nearly as much as you'd think, especially if you're starting out and getting the hang of it. "What you're looking for first of all are the problems that anybody is going to encounter, such as a confusing interface," says Krug. "It doesn't matter whether they're from your target audience or not. Your grandmother could try using the site and she would run into those problems."
Finally, figure out who from your organization should be involved in the tests. Get as many people as possible on your team, especially the folks who have had strong opinions about the website in the past. Believe me, watching people struggle to use your site is an extremely effective way to resolve internal power struggles.
Week 2 - Refine Your Plan
Now that you've put together your internal team, get their feedback on the list of tasks you developed. Make sure everyone has a say in what they think are the highest priorities.
You'll also need to develop the script you'll use when working with each participant. Krug makes this easy by providing a sample script (and other helpful resources) on his website. Once you finalize your tasks, you'll need to adapt the script so that it includes your specific tasks, but that's easy and can be done at any time in the next week.
This is also a good time to determine how you'll be doing the testing. Will participants be coming into your office, or working remotely? Either will work fine if you have the right software. If participants are coming into your office, have a webcam set up on the testing computer so that your team (who will be assembled in the conference room down the hall) can watch the volunteer and hear her comments. You'll also want to install some sort of screen sharing software (GoToMeeting works well) so that your observation team can watch as the volunteer clicks through the site. GoToMeeting is also a good solution if you run your tests remotely. It will not only allow your observation team to tune in "live," but it also gives you the opportunity to record the entire session, including the dialogue between test moderator and volunteer and how the participant navigates through the site.
Week 3 - Execution and Debrief
By the third week, you're ready to run your tests. Bring in three volunteers, scheduling a one-hour session for each. Don't worry that a sample size of three isn't statistically valid. Usability experts who have done testing for years, including Krug, agree that if you watch three people try to use your site, you're going to discover most of the more serious problems with it. If you schedule more than three, by the time you get to the fifth person you'll be seeing the same problems arise again and again.
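The "three users are enough" claim is often supported with the Nielsen-Landauer problem-discovery model, which estimates the share of problems found as 1 - (1 - λ)^n, where λ is the chance a single user hits a given problem (commonly estimated around 0.31). Note this model comes from Nielsen and Landauer's research, not from Krug's book, so treat it as a supporting back-of-the-envelope estimate:

```python
def problems_found(n_users, hit_rate=0.31):
    """Expected share of usability problems surfaced by n test users,
    per the Nielsen-Landauer model: 1 - (1 - hit_rate)^n."""
    return 1 - (1 - hit_rate) ** n_users

# With the commonly cited hit rate of 0.31, three users surface roughly
# two-thirds of the problems and five users roughly 84% -- diminishing
# returns set in quickly, which is why small rounds repeated monthly
# beat one big study.
for n in (1, 3, 5):
    print(n, round(problems_found(n), 2))
```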
Take care to moderate each session the same way, following the script verbatim even though it feels awkward to do so. Krug cautions against ad-libbing, because you could inadvertently give participants different information and skew the results of your test.
Meanwhile, have your team assembled in the conference room to watch and take notes. The importance of this can't be overstated. With everyone's busy schedule, it's tempting to have one person conduct the tests and report the findings to the rest of the team. Don't fall into this trap. Seeing is believing, and nobody will doubt the validity of the findings if they watch for themselves as users struggle to accomplish the most seemingly simple tasks.
At the end of each session, Krug recommends you have your observation team write down the three most significant usability issues they observed. There shouldn't be much discussion at this point, because you want to be able to compare everyone's individual observations apart from any group influence. Once the three sessions are over, order lunch and begin your debrief. Have the group discuss what they observed and look for the common themes. Together, decide on the overall top three and make a list of the most serious problems in descending order of severity. This last point is critical. It's easy to get distracted by, and subsequently bogged down with, issues that are "easy" to fix. But remember why you're doing this yourself: you're resource-constrained. Stay focused on the top issues that will make the most measurable impact on your site's usability.
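The debrief described above, where everyone independently writes down a top three and the group then looks for common themes, amounts to tallying how often each issue appears across observers. A minimal sketch, with made-up issue names:

```python
from collections import Counter

# Each observer's independently written top-three list (illustrative data).
observer_notes = [
    ["confusing checkout button", "hidden search", "slow page load"],
    ["hidden search", "confusing checkout button", "tiny font"],
    ["confusing checkout button", "slow page load", "hidden search"],
]

# Tally mentions across observers; the most frequently cited issues
# become the candidates for the group's overall top three.
issue_counts = Counter(issue for notes in observer_notes for issue in notes)
top_three = [issue for issue, _ in issue_counts.most_common(3)]
print(top_three)
```

The count is just a starting point for the discussion; the group still decides the final ranking together.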
No More Excuses!
By now you're probably wondering if it could really be this fast, easy, and inexpensive. The answer is - unquestionably - yes! You can get meaningful, insightful, and actionable user feedback in about four hours every month, at very minimal expense. The most difficult part of the whole process is shedding your ego long enough to value, rather than dismiss, what you learn. Remember that you and your team members are blinded by what you already know about your site. That makes it nearly impossible for you to comprehend why your visitors can't see the gigantic red button you wanted them to click, or why they don't understand that the cute magnifying glass icon means "search." Things that are obvious to you, either because you thought of them or just got used to them, may not be quite as apparent to others. And no matter how robust your analytics package is, the only way you'll learn about these disconnects is from watching real human beings use your site.
Are you ready to add rocket surgeon to your resume? Schedule a half day next month and get started. It will forever change the way you look at your site - and your visitors.
This column was originally published on July 24, 2012.
Tim Ash is CEO of SiteTuners.com, a landing page optimization firm that offers conversion consulting, full-service guaranteed-improvement tests, and software tools to improve conversion rates. SiteTuners' AttentionWizard.com visual attention prediction tool can be used on a landing page screenshot or mock-up to quickly identify major conversion issues. He has worked with Google, Facebook, American Express, CBS, Sony Music, Universal Studios, Verizon Wireless, Texas Instruments, and Coach.
Tim is a highly-regarded presenter at SES, eMetrics, PPC Summit, Affiliate Summit, PubCon, Affiliate Conference, and LeadsCon. He is the chairperson of ConversionConference.com, the first conference focused on improving online conversions. A columnist for several publications including ClickZ, he's host of the weekly Landing Page Optimization show and podcast on WebmasterRadio.fm. His columns can be found in the Search Engine Watch archive.
He received his B.S. and M.S. during his Ph.D. studies at UC San Diego. Tim is the author of the bestselling book, "Landing Page Optimization."
Wednesday, July 23, 2014