Alternatives to Testing User Experience

My most recent article discussed the importance of usability testing to guide and assess web-development efforts. More and more, companies are realizing that if customers find the user experience on a site lacking, they’ll go elsewhere.

In the article, I focused on qualitative one-on-one testing to guide web development. I am a strong proponent of this methodology; there is no substitute for close observation of real customers using your site. But many readers wrote to me, asking about web- and software-based systems that gather user experience data online. I promised them an answer, so here it is.

These types of systems roughly fall into two categories: those that gather visitor feedback and those that measure the technical aspects of site performance. While neither of these alternatives can replace qualitative usability testing, they can provide valuable information to assess user experience.

In the category of visitor feedback, there are some interesting alternatives. Vividence is a service that brings consumers from your target demographic through your site and solicits feedback from them along the way. The company then analyzes their feedback, their click streams, and their ability to successfully navigate the site. All this data is presented in really fancy graphs and charts, with comparisons to benchmarked information. Vividence costs about $20,000 to $45,000 per test.

Unlike Vividence, which recruits from a panel of users, OpinionLab is a product that gathers data in a “natural” environment — that is, an environment where real customers show up to your site on their own. OpinionLab places tiny links on your site that customers can click to provide their opinions. The service is free, but OpinionLab upsells you with detailed reports and analysis. You can see an example of the software in action: look for the bull’s-eye icon in the top right-hand corner of the home page.

A drawback to OpinionLab is that the visitors who provide feedback are highly self-selected, and, therefore, the data collected can’t really be seen as a cross section of your customers. You almost have to stumble on an OpinionLab icon willy-nilly to realize that the site wants your opinion. Self-selection is always an issue with research, and without an incentive or even an explicit solicitation for data, you can’t be sure just what type of people are speaking up about your site.

The second category comprises companies that test the technical performance of your site using software. An example in this category is WebCriteria, which sends software bots from its servers to gauge how long pages on your site take to load and how many clicks it takes to reach key areas of your site.

WebCriteria touts its “objectivity,” but the type of measurements it performs don’t really provide the rich feedback you need to gauge usability on a site. Sure, if pages take too long to download, users will be frustrated and might go away. But getting data from software bots is a far cry from hearing from your customers (unless your customers are bots, which is a business model I have yet to see).

A lot of companies are hopping on the usability bandwagon and positioning their products as quick fixes. New products pop up every day. But when all is said and done, there is no substitute for systematic observation of your customers to find out where your site might be going wrong and what you’re doing right.
