Organizations must avoid adopting a one-tool-fits-all approach to understanding and measuring the online customer experience. Here are some tools that can help with quantitative reviews.
Organizations must be more customer-centric when developing their online channel. Part one addressed that point, outlining a simple framework to help organizations improve the quality of the customer experience.
Underpinning this framework is the need for a range of quantitative and qualitative measurement and analytical techniques. That's because customer insight is a key component of delivering improved customer experiences. Let's take a brief look at some tools in the customer experience toolkit.
In the same way that organizations can no longer afford a one-size-fits-all approach to their online channel, they must also recognize that they can't take a one-tool-fits-all approach to understanding and measuring the online customer experience. For far too long, businesses have relied solely on Web analytics tools to track and measure what's happening on their Web sites.
When the only tool you have is a hammer, every problem looks like a nail. Organizations sometimes get frustrated with Web analytics tools because they try to use them for tasks they aren't very good at. In terms of understanding the user experience at its simplest level, the questions you're trying to answer are:

- What happened?
- When did it happen?
- Who did it happen to?
- Why did it happen?

These questions must be answered, as they form the foundation of most analytical inquiries. It's just as important, for example, to understand why things don't happen.
Web analytics tools are great for answering the "what" and "when" questions, but less great for answering the "who" and "why" questions. To answer these questions, we need other tools -- some quantitative in nature and others more qualitative.
Some kind of survey or direct user experience feedback tool is really important. This is the true "voice of the customer." A basic approach is to be able to answer questions such as:

- Why did you visit the site today?
- Did you achieve what you set out to do?
- How would you rate your overall experience?
- Is there anything else you'd like to tell us?
The last two questions are pivotal. We aren't talking just about questions of site functionality, but about the whole experience -- or the user's perceptions of that experience and what it should be like. This may have nothing to do with the Web site's functionality.
The last question is where you may invite users to comment in their own words rather than just tick boxes or provide scores or ratings. This is a rich vein of information that gets to the heart of the customer experience. Text mining tools can help you sift through these written comments and pick out the main themes and patterns in what people are saying, and they can be good at it.
But the best text mining tool is the human brain. By quickly scanning through the various comments people leave, you can get a real sense of the issues. These may be nothing to do with the Web site itself, but may be around the product offer or availability, for example.
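As a rough illustration of the kind of theme-spotting a text mining tool automates, here is a minimal Python sketch. The theme keywords and sample comments are invented for illustration; a real tool would use far more sophisticated language processing than simple keyword matching.

```python
# Hypothetical sketch: tally recurring themes in open-text survey comments.
# The THEMES keyword lists and the sample comments are invented examples.
from collections import Counter

THEMES = {
    "price": ["price", "expensive", "cost", "offer"],
    "availability": ["out of stock", "unavailable", "sold out"],
    "navigation": ["find", "search", "menu", "confusing"],
}

def tag_themes(comment: str) -> set:
    """Return the set of themes whose keywords appear in a comment."""
    text = comment.lower()
    return {theme for theme, words in THEMES.items()
            if any(w in text for w in words)}

def theme_counts(comments: list) -> Counter:
    """Count how many comments touch each theme."""
    counts = Counter()
    for c in comments:
        counts.update(tag_themes(c))
    return counts

comments = [
    "Couldn't find the offer from the letter you sent me",
    "The product I wanted was out of stock",
    "Search was confusing and prices seemed expensive",
]
print(theme_counts(comments))
```

Even a crude tally like this surfaces issues, such as the mismatch between mailed offers and the Web site, that may have nothing to do with site functionality.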
One travel company I've worked with had a major problem with customer satisfaction whenever it sent out customer direct mail. Problem was, offers advertised in the letters weren't easy to find or weren't replicated on the Web site, mainly due to a lack of communication between two departments. Customers expected a seamless experience, but it wasn't being delivered.
Reading and taking notice of user comments is so useful that I set up scheduled reports for clients to drop the comments into their e-mail inboxes every Monday morning.
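A scheduled digest like that could be sketched as follows, assuming a cron job runs it each Monday. The addresses, the SMTP host, and the idea that comments arrive as plain strings are all assumptions for illustration.

```python
# Hypothetical sketch: a job (run weekly, e.g. by cron on Monday mornings)
# that emails the previous week's survey comments to a client.
# Addresses and host names are placeholders, not real endpoints.
import smtplib
from email.message import EmailMessage

def build_digest(comments: list, recipient: str) -> EmailMessage:
    """Assemble the weekly voice-of-the-customer email."""
    msg = EmailMessage()
    msg["Subject"] = f"Weekly voice-of-the-customer digest ({len(comments)} comments)"
    msg["From"] = "reports@example.com"
    msg["To"] = recipient
    body = "\n\n".join(f"- {c}" for c in comments) or "No comments this week."
    msg.set_content(body)
    return msg

def send_digest(comments: list, recipient: str, host: str = "localhost") -> None:
    """Send the digest via a local SMTP relay (placeholder host)."""
    msg = build_digest(comments, recipient)
    with smtplib.SMTP(host) as server:
        server.send_message(msg)
```

The point of the design is that the client never has to log in to a reporting tool: the raw customer voice lands in their inbox, ready for a quick human scan.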
Also, so-called customer experience management tools are useful for capturing and analyzing individual users' sessions. The capability to track and replay an individual's actual session on the Web site has been around for a while. Tealeaf is probably the most established provider in the market, but other players are emerging, albeit from different starting points.
The challenge with these types of tools can be the sheer mass of data captured and available for analysis. It can be like looking for a needle in a haystack. The trick is to have some way of uncovering potential customer experience issues and then diving in to the session replay data to see what actually has been going on.
A useful development is the integration of these systems with other measurement systems such as Web analytics tools or voice of the customer programs. This allows the analyst to start at a higher level and then drill down into the detail, for example, by isolating all the sessions where the overall satisfaction score was less than five and then reviewing a selection of them in detail to see what exactly seemed to be causing the problem.
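That drill-down can be sketched in a few lines of Python. The session records, the satisfaction field joined in from the survey, and the threshold of five are all illustrative assumptions; the real work happens in the replay tool afterwards.

```python
# Hypothetical sketch of the drill-down described above: isolate sessions
# where the joined-in survey satisfaction score fell below a threshold,
# then draw a manageable sample to review in the session replay tool.
# The session records below are invented for illustration.
import random

sessions = [
    {"session_id": "a1", "satisfaction": 9, "pages": 12},
    {"session_id": "b2", "satisfaction": 3, "pages": 4},
    {"session_id": "c3", "satisfaction": 4, "pages": 20},
    {"session_id": "d4", "satisfaction": 8, "pages": 7},
]

def low_satisfaction(sessions: list, threshold: int = 5) -> list:
    """Isolate sessions whose overall satisfaction score is below the threshold."""
    return [s for s in sessions if s["satisfaction"] < threshold]

def replay_sample(sessions: list, k: int = 2, seed: int = 42) -> list:
    """Draw a reproducible sample of flagged sessions for manual replay review."""
    flagged = low_satisfaction(sessions)
    return random.Random(seed).sample(flagged, min(k, len(flagged)))
```

Starting from the unhappy scores and sampling down to a handful of sessions is what turns the needle-in-a-haystack problem into a manageable review.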
Most of these techniques have been quantitative in nature or have started from a quantitative approach. Quantitative methods are necessary but rarely sufficient to understand the user experience.
Next time, we'll look at a selection of qualitative approaches to understanding and optimizing the user experience. Until then...
Neil Mason is SVP, Customer Engagement at iJento. He is responsible for providing iJento clients with the most valuable customer insights and business benefits from iJento's digital and multichannel customer intelligence solutions.
Neil has been at the forefront of marketing analytics for over 25 years. Prior to joining iJento, Neil was Consultancy Director at Foviance, the UK's leading user experience and analytics consultancy, heading up the user experience design, research, and digital analytics practices. For the last 12 years Neil has worked predominantly in digital channels, both as a marketer and as a consultant, combining a strong blend of commercial and technical understanding in the application of consumer insight to help major brands improve digital marketing performance. During this time he also served as a Director of the Web Analytics Association (now the Digital Analytics Association, or DAA) for two years and currently serves as a Director Emeritus of the DAA. Neil is also a frequent speaker at conferences and events.
Neil's expertise ranges from advanced analytical techniques such as segmentation, predictive analytics, and modelling through to quantitative and qualitative customer research. Neil has a BA in Engineering from Cambridge University and an MBA and a postgraduate diploma in business and economic forecasting.