Your Web analytics reports may be pumping out insights that are not exactly what they seem. Here are some factors to consider before acting on that information.
When working with Web analytics data, I sometimes think of it as an iceberg. On the surface, everything looks fine and we know what we are dealing with. However, most of the data is under the surface, and we're not always sure what dangers are lurking beneath. As tools such as Google Analytics and Yahoo Web Analytics have brought Web measurement and reporting to the masses, more people are exposed to the part of the iceberg above the waterline. Good user interfaces can make access to the data in a Web analytics system easy and intuitive, but they can also manufacture an impression of quality that may not be warranted. Be sure to look beyond the pretty graphs and reports and understand the quality of the data being collected and how to interpret it.
A good example was a pitch process we were involved in recently. We were given access to the prospective client's Web analytics tool and asked to come up with some thoughts and recommendations based on the data. Our recommendation, probably not the one they were expecting, was that they not use the data at all, because there were a number of fundamental flaws in the way the Web analytics system had been implemented. As we started to look through the reports, we quickly realized the data were close to meaningless; it was nearly impossible to extract any of the "insight" they were looking for. The root cause could be fixed relatively easily, but the experience made me think about how easy it is to fall into the trap of assuming that if the data looks right, it probably is right. In this case, it patently wasn't.
Of course, I have no problem with improvements in Web analytics interface design. The easier it is for people to access the data, the better. But perhaps Web analytics tools should also come with some kind of health warning (or possibly a wealth warning as well) that all may not be as it seems. Neat interfaces and visualization techniques can generate an illusion of accuracy and robustness that may not be there. Often, Web analytics tools are poorly implemented due to constraints on time, resources, and money. With poor implementations come poor data. And even the best-implemented Web analytics systems are vulnerable to the limitations of cookie-based data collection methodologies.
There's no getting away from the fact that what we are tracking with our Web analytics tools is not people but devices. So if the same person visits your Web site from two or more different devices, they will look like two or more different "people." On the other hand, if two or more people access the Web site using the same account, then they will look like the same "person." This potentially has a big impact on the way you might interpret the data. In some recent work we did, we were able to match cookie values against an account number and see how many accounts had multiple cookies associated with them. In this particular case, about 10 percent of accounts had two or more cookies associated with them, but they represented more than 20 percent of all the cookies or "visitors."
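The cookie-to-account reconciliation described above is straightforward to sketch. The snippet below uses made-up account and cookie IDs purely for illustration; it is not the actual analysis we ran. In practice you would join your login or account records against the analytics tool's visitor-cookie log to build the pairs.

```python
from collections import defaultdict

# Hypothetical (account_id, cookie_id) pairs, e.g. the result of joining
# a login table against the analytics tool's visitor-cookie log.
pairs = [
    ("acct1", "c1"),
    ("acct2", "c2"), ("acct2", "c3"),
    ("acct3", "c4"),
    ("acct4", "c5"), ("acct4", "c6"), ("acct4", "c7"),
]

# Group the cookies seen for each account.
cookies_by_account = defaultdict(set)
for account, cookie in pairs:
    cookies_by_account[account].add(cookie)

# Accounts that have been seen with two or more cookies ("devices").
multi = {a: c for a, c in cookies_by_account.items() if len(c) >= 2}

total_accounts = len(cookies_by_account)
total_cookies = sum(len(c) for c in cookies_by_account.values())

pct_accounts = 100 * len(multi) / total_accounts
pct_cookies = 100 * sum(len(c) for c in multi.values()) / total_cookies

print(f"{pct_accounts:.0f}% of accounts have 2+ cookies")
print(f"they represent {pct_cookies:.0f}% of all cookies ('visitors')")
```

With this toy data, half the accounts have multiple cookies, but they account for well over half of all the "visitors" the tool reports, which is the same shape of distortion as the 10 percent/20 percent split we saw.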
An example of how this might impact your data? Let's say you're in the kind of business where people spend time researching before buying or transacting. Sometimes you can see a peak in conversion rates from first-time visitors on certain days (often Mondays). You might think this is the result of some particular marketing activity, but it is more likely a feature of the buying behavior: people do their research on one machine over the weekend and then complete the transaction on a different machine when they get to work the following day. To the analytics tool, that second machine is a brand-new "first-time visitor" who converts almost immediately.
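A simple way to check for this pattern is to segment first-time-visitor conversion rates by day of week. The sketch below works from a hypothetical session log; in practice you would export the equivalent data from your analytics tool's first-visit segment.

```python
from collections import defaultdict
from datetime import datetime

# Hypothetical first-time-visitor sessions: (ISO date, converted?).
# In reality this would come from an analytics export.
sessions = [
    ("2015-09-04", False), ("2015-09-05", False), ("2015-09-06", False),
    ("2015-09-07", True),  ("2015-09-07", True),  ("2015-09-07", False),
    ("2015-09-08", False), ("2015-09-08", True),
]

totals = defaultdict(int)   # sessions per weekday
wins = defaultdict(int)     # conversions per weekday
for day, converted in sessions:
    weekday = datetime.strptime(day, "%Y-%m-%d").strftime("%A")
    totals[weekday] += 1
    wins[weekday] += converted

for weekday, n in totals.items():
    rate = 100 * wins[weekday] / n
    print(f"{weekday}: {rate:.0f}% first-time-visitor conversion rate")
```

A Monday spike in this breakdown, absent any Monday-specific marketing, is a hint that you may be looking at the weekend-research, weekday-purchase pattern rather than a genuine campaign effect.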
Web analytics data are not perfect, and in many cases these issues are less of a problem if you are more concerned with monitoring trends than with reading the absolute values. But it's worth bearing in mind where the data come from and how those reports are constructed when you admire the pretty interface!
Neil Mason is SVP, Customer Engagement at iJento. He is responsible for providing iJento clients with the most valuable customer insights and business benefits from iJento's digital and multichannel customer intelligence solutions.
Neil has been at the forefront of marketing analytics for over 25 years. Prior to joining iJento, Neil was Consultancy Director at Foviance, the UK's leading user experience and analytics consultancy, heading up the user experience design, research, and digital analytics practices. For the last 12 years Neil has worked predominantly in digital channels, both as a marketer and as a consultant, combining a strong blend of commercial and technical understanding in the application of consumer insight to help major brands improve digital marketing performance. During this time he also served for two years as a Director of the Web Analytics Association (now the Digital Analytics Association, or DAA) and currently serves as a Director Emeritus of the DAA. Neil is also a frequent speaker at conferences and events.
Neil's expertise ranges from advanced analytical techniques such as segmentation, predictive analytics, and modelling through to quantitative and qualitative customer research. Neil has a BA in Engineering from Cambridge University and an MBA and a postgraduate diploma in business and economic forecasting.