Bad data mean bad news for your analytics program -- and your business.
It was one of those moments. I was working on a client's data and began to suspect something was wrong. Not with the client's business, but with the data itself. The potential business implications were significant. As I dug deeper and deeper into the issue, I got that sinking feeling. Something was seriously wrong.
The business in question was looking to aggressively improve its digital channel's effectiveness, focusing on conversion optimization as traffic levels were quite buoyant. It had implemented a satisfaction tracking survey to understand visitor intent and satisfaction. It also had commissioned usability testing to understand the user experience in more detail and had started a testing and experimentation program. It was all the right stuff.
But there wasn't any strong evidence that conversion was actually improving, so the business needed a deeper dive into the data to find out what was happening. That's when the problem emerged. Without going into the gruesome details, I discovered the business's conversion rate was being underestimated and that the degree of underestimation had been getting worse over time. The historical data from the Web analytics tool was wrong on some key metrics; the one silver lining was that the true conversion rate was better than previously thought.
The really bad news was that the business had probably been focusing on the wrong problem. While all the activity on conversion optimization was good stuff, the revised data highlighted that other issues may have been more pressing. Worse, the data's credibility was seriously undermined and, to some extent, the team's credibility was compromised as well. For conversion optimization, it was taking one step forward and two steps back.
We marketers must get the right numbers right and keep them right. When it comes to marketing optimization, good quality data is a core component. Getting good quality data that allows better decision making is a key step on the journey. That might seem like an obvious statement, but it's not a process that should be underestimated, nor is it a one-off set-up event.
When a new system is implemented, there's inevitably a focus on the data it's generating. That information is then reconciled against other data sources. That's great, but those reviews must be repeated at regular intervals to ensure data integrity remains high. If this isn't a managed, ongoing process, the data's integrity may decline over time until something happens that causes the data to be questioned. Then it might be too late.
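As a minimal sketch of what such a recurring reconciliation check might look like, the snippet below compares an order count from a Web analytics tool against the count from a transactional database and flags any divergence beyond a tolerance. The figures and the 5 percent tolerance are illustrative assumptions, not a recommended standard; in practice you would pull the real numbers from each system.

```python
def reconcile(analytics_count: int, backend_count: int,
              tolerance: float = 0.05) -> bool:
    """Return True if the two sources agree within the given tolerance."""
    if backend_count == 0:
        return analytics_count == 0
    divergence = abs(analytics_count - backend_count) / backend_count
    return divergence <= tolerance

# Hypothetical example: the analytics tool reports 940 orders while the
# order database records 1,000. That 6% gap breaches a 5% tolerance,
# so the check flags the period for investigation.
print(reconcile(940, 1000))  # False: 6% divergence exceeds the tolerance
```

Run on a schedule, a check like this turns reconciliation from a one-off project into the managed, ongoing process described above.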
Managing data integrity is a messy job, but someone has to do it. Good processes will certainly help ensure that all pages get tagged, campaigns are tracked properly, and so on. Technology is available that can check for tags on the site and help with other tag management challenges. You also need a keen eye for trends and patterns in the data that may not be a true reflection of what's going on. I think this is a skill that can be learned.
A good marketing analyst can sense when something doesn't look right. In my experience, if something looks odd, it probably is odd and isn't real behavior. Sudden changes in trends, steps in the data, spikes, and dips are all potentially symptomatic of artificial impacts on data. If they can't be explained by real-world events, it's worth digging into the data to see if there's anything untoward happening, like changes to the tool's configuration, new site monitoring tools being put in place, changes to the hosting environment, and so on.
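The kind of screening described above can be partially automated. Here is a minimal sketch that flags days whose value deviates sharply from the recent trend, using a rolling mean and standard deviation; the sample series, window, and threshold are illustrative assumptions rather than a production alerting rule.

```python
from statistics import mean, stdev

def flag_anomalies(series, window=7, threshold=3.0):
    """Return indices of points sitting more than `threshold` standard
    deviations from the mean of the preceding `window` points."""
    flags = []
    for i in range(window, len(series)):
        recent = series[i - window:i]
        mu, sigma = mean(recent), stdev(recent)
        if sigma > 0 and abs(series[i] - mu) / sigma > threshold:
            flags.append(i)
    return flags

# Hypothetical daily conversions: steady around 100, then a sudden dip.
daily = [98, 102, 99, 101, 100, 97, 103, 100, 99, 101, 42]
print(flag_anomalies(daily))  # [10] -- the dip on the final day
```

A flag is only a prompt to investigate: the point is to ask whether a real-world event explains the step or spike, and if not, to dig into configuration changes, new monitoring tools, hosting changes, and the like.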
Getting good data integrity is not a one-off event. It's an ongoing process. Be wary of the potential impact that changes to your site or tracking environment will have on your data and plan accordingly. Take time to reconcile your data on a regular basis to see if there are any divergent trends. With these basic processes in place, you might avoid that sinking feeling at some point in the future.
Today's column originally ran on June 24, 2008.
Neil Mason is SVP, Customer Engagement at iJento. He is responsible for providing iJento clients with the most valuable customer insights and business benefits from iJento's digital and multichannel customer intelligence solutions.
Neil has been at the forefront of marketing analytics for over 25 years. Prior to joining iJento, Neil was Consultancy Director at Foviance, the UK's leading user experience and analytics consultancy, heading up the user experience design, research, and digital analytics practices. For the last 12 years Neil has worked predominantly in digital channels both as a marketer and as a consultant, combining a strong blend of commercial and technical understanding in the application of consumer insight to help major brands improve digital marketing performance. During this time he also served as a Director of the Web Analytics Association (now the Digital Analytics Association, or DAA) for two years and currently serves as a Director Emeritus of the DAA. Neil is also a frequent speaker at conferences and events.
Neil's expertise ranges from advanced analytical techniques such as segmentation, predictive analytics, and modelling through to quantitative and qualitative customer research. Neil has a BA in Engineering from Cambridge University and an MBA and a postgraduate diploma in business and economic forecasting.