Organizations must focus more heavily on the user experience, developing a more rigorous understanding of what their customers want online and determining how best to deliver that to them. Good customer insight is core to that process and insight comes from a range of systems, methodologies, and techniques. We’ve already looked at the use of quantitative approaches to customer insight, so let’s look at some of the more qualitative approaches.
For years in “offline” marketing, quantitative and qualitative approaches have been used side by side to understand consumer preferences and behaviors. Only relatively recently have businesses begun pulling together these disparate sources of insight to get a fuller picture of what’s happening in the online channel and why.
Part of the problem has been technological — it’s been hard to integrate data. But another part of the problem has been organizational — different functional silos are focused on different aspects of the customer journey or customer experience.
Here in the U.K., and I’m sure it’s the same in the U.S. and other more sophisticated digital economies, we’re beginning to see roles and job titles such as “director of customer experience.” As business functions become integrated, more business and marketing data is getting integrated.
Joined-up thinking requires joined-up data. And integration includes the “blending” of hard quantitative-based data with softer qualitative data collected using a variety of predominantly observational techniques.
Web site usability testing has been a core tool in Web site development for many organizations. Still, I’m constantly amazed by how many organizations don’t do any usability testing either during or after product development. While they spend large amounts of time and money on developing sites or functionality, they don’t test whether typical users can actually do what they wanted or expected to do.
Simple usability testing can tell you a lot about why things don’t work that you would never learn by staring at Web analytics reports. Because respondents typically talk through their experience in real time while on the site, you get to see why things may not be working and hear it straight from the horse’s mouth.
Usability testing techniques are evolving, and methodologies such as eye tracking are becoming standard features of most tests rather than expensive optional extras. Eye tracking shows where the user is looking. Combined with other data, such as a click map from a Web analytics system, eye tracking is useful for page-level optimization in merchandising and promotional work.
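To illustrate the kind of combination described above, here is a minimal sketch in Python. The coordinates, grid size, and threshold are all hypothetical assumptions for illustration; the idea is simply to bucket gaze fixations and clicks into the same page-level grid and flag areas that attract attention but no clicks.

```python
from collections import Counter

# Hypothetical sample data: (x, y) screen coordinates from an
# eye-tracking session and from a Web analytics click map.
fixations = [(120, 80), (130, 85), (125, 90), (400, 300), (410, 310)]
clicks = [(405, 305)]

CELL = 100  # bucket coordinates into 100x100-pixel grid cells

def to_cells(points):
    """Bucket raw coordinates into grid cells and count them."""
    return Counter((x // CELL, y // CELL) for x, y in points)

gaze = to_cells(fixations)
click = to_cells(clicks)

# Cells that draw plenty of attention but no clicks may signal
# content users notice yet cannot (or choose not to) act on.
for cell, views in gaze.items():
    if views >= 2 and click.get(cell, 0) == 0:
        print(f"cell {cell}: {views} fixations, no clicks")
```

In practice the inputs would come from the eye-tracking vendor’s export and the analytics system’s click-map API, and the grid cells would map onto actual page regions (banners, navigation, product tiles) rather than raw pixels.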
One criticism of usability testing is that labs can be an artificial environment for observing the user experience. As a result, we’ve seen more use of “ethnographic” research, in which customers are observed interacting with Web sites in their “natural habitat” (i.e., home or work).
For one piece of research we conducted for a retailer, we went to customers’ homes to see how they managed to use the Web site in their own environment. Pictures fed back to the client showed users balancing laptops on their knees on the sofa or standing in the kitchen with the laptop on a worktop.
We might like to think that customers are focused on a site. In reality, they may be in an environment full of noise and distractions, which gives a different perspective on the kind of experience they’re actually having.
Some newer and more innovative qualitative approaches to understanding the user experience include techniques such as electroencephalography (EEG) to try to measure emotional engagement. This approach uses brain-scanning techniques to monitor subconscious responses in the brain when users are subjected to different stimuli on a Web site. How do they react emotionally, for example, to different types of messaging, images, or layouts? It’s like eye tracking on steroids.
With all these innovations, though, the fundamentals remain the same: good user experiences can’t be built in a vacuum and without a deep insight into, and empathy with, the goals, aspirations, and expectations of our customers. The data, tools, and techniques just help us to get there.