Big Data: Inspect What You Expect

With the seemingly unstoppable growth of advertising technology and the rise of automated media-buying systems, we have developed an incredible reliance on data. Nobody is more excited than I am about the interest in, and power granted to, the massive amounts of data available to marketers today. However, we need to start paying the same level of attention to the quality of our data sources and collection systems.

This is especially crucial in today’s multi-agency environments, where we are not always in control of the collection systems – instead, we often have to trust the quality of the data without being able to verify the sources. This blind trust is a dangerous practice: it can create false insights, not to mention harm client-agency relationships and brand success. If we continue to leverage data and analytics to direct our media investments, we need to become more educated and informed about the quality and sources of our data.

Take search as an example. Brands are investing millions in search, but when it comes to correct tagging of URLs and websites, most brands fall short. I can’t count the number of times we have looked at search data and discovered that the URLs could not handle paid search tags, or that analytics tags were missing or duplicated. Marketers optimize against conversion data coming from the analytics platform, but how often does the agency that buys the media also control the tagging? This raises the question: why do so many brands invest millions in media, yet still shy away from investing in data quality and control?
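Checks like this can be automated before a campaign launches. As a rough sketch of what a pre-launch tag audit might look like (the tag patterns, function names, and thresholds here are illustrative assumptions, not any vendor's actual API), a script can count analytics snippets in each landing page's HTML so that missing or duplicate tags are flagged before media dollars flow:

```python
import re

# Illustrative patterns for two common Google Analytics loaders;
# adapt these to whatever analytics platform the brand actually uses.
TAG_PATTERNS = {
    "gtag.js": re.compile(r"googletagmanager\.com/gtag/js\?id=([\w-]+)"),
    "analytics.js": re.compile(r"google-analytics\.com/analytics\.js"),
}

def audit_page(html: str) -> dict:
    """Count occurrences of each known tag so that 0 (missing)
    or >1 (duplicate) stands out immediately."""
    return {name: len(pat.findall(html)) for name, pat in TAG_PATTERNS.items()}

def flag_issues(counts: dict) -> list:
    """Translate raw counts into human-readable audit findings."""
    issues = []
    for name, n in counts.items():
        if n > 1:
            issues.append(f"duplicate {name} tag ({n} copies)")
    if all(n == 0 for n in counts.values()):
        issues.append("no analytics tag found")
    return issues
```

Run against every landing page in a campaign, this kind of check would have caught the double-tagging scenario described below long before it distorted a report.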

I have seen many brand-agency relationships suffer from issues caused by bad data. In one instance, the client was double tagging some of their landing pages, resulting in inflated page views and visits. When the brand launched a new site with improved tagging, traffic and engagement dropped dramatically.

For weeks, we tried to track down the cause of the problem. Multiple (competing) agencies got involved and in the end, another agency found the cause of the issue: multiple trackers were being placed on the same page. The upset client had to go back to his leadership and admit that the high level of traffic they thought our work was producing was actually heavily inflated, and the ROI was not as good as he had been taking credit for. Nobody did anything intentionally evil, but it took a while to recover from the error, and search and content spend have still not returned to normal levels.

Obviously, this is an extreme case, but even a single missing or broken tag on one page can cause damaging issues. A big problem with “Big Data” is that most people only look at the aggregated summary in a report, which is shortsighted: if data is missing from a single landing page, the gap may not stand out in the aggregate, but it still skews the overall results.
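A toy illustration of why aggregates hide these gaps (the page names and visit counts here are entirely hypothetical): an untagged landing page reports zero visits, yet the site-wide total still looks perfectly plausible, so only a per-page check surfaces the problem.

```python
# Hypothetical per-page visit counts from an analytics report.
visits = {
    "/home": 52_000,
    "/product": 31_000,
    "/campaign-lp": 0,   # tag missing: real traffic here is invisible
    "/blog": 17_000,
}

# The aggregate alone raises no alarm...
total = sum(visits.values())

# ...but a per-page audit immediately surfaces the untracked page.
untracked = [page for page, v in visits.items() if v == 0]
```

The point is not the arithmetic but the habit: audit at the level where data is created (the page), not only where it is summarized (the report).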

We recently launched a new approach with our clients to help them fight this problem: we no longer spend on digital media until we have verified correct tagging, filters and analytics configuration. For an agency compensated as a percentage of media spend, it’s not an easy choice, but in the end, we believe that we need to be an advocate for our clients and ensure that they are investing as efficiently as possible. We believe that this long-term strategy will benefit our clients’ bottom lines, as well as our client-agency relationships.

I believe that brands should take a percentage of every working dollar and invest it into data tagging and quality. Only when we start at the moment the data is being created, and work our way back from there, will we create truly actionable insights that will help inform our brands’ strategies and create true ROI.
