When the Numbers Don’t Match…

Folks send me questions pretty frequently, and they give me a great picture of both what’s going on in the industry on the ground and how I’ve failed to address some of those goings-on.

Last week, I received an email from reader Nicole Cochran asking why sites report rather different click-through numbers than her client’s site receives. We addressed some of this issue a few weeks ago in “That Sneaking Impression” and “That Sneaking Impression: Part II,” but we failed to dwell on the click-through.

Nicole writes, “We are currently running a campaign for a client. We did not use a third-party ad server, so all tracking is coming through the ad servers of each individual site. Our client is also tracking click-throughs on their end using the tracking URLs we set up for each creative execution by site. Our first week’s tracking report revealed that the sites were reporting double the number of click-throughs that the client reported. I understand how served impressions can get completely fouled up, but isn’t a click a click?”

If you other readers haven’t run into this situation, it’s only because you employ a banner server (and therefore, you don’t realize that your numbers are incorrect), or you simply haven’t been keeping track of client site traffic. Nicole’s issue is endemic to the industry.

I can identify three reasons why these numbers don’t match up:

  1. Some people click on the banner at your media site but stop the page load or click elsewhere before your client’s page can load. This has to account for a very small portion of the discrepancy. It is behaviorally rare and technically unlikely, unless your client’s site is really quite slow. However, I do hear this as the most common excuse cited by sites. And well they might cite it, as it suggests that the discrepancy is valid and that their medium is delivering all the contracted impressions and clicks.

  2. The user double-clicks on banners, and the media site is either dumb enough or dishonest enough to count both clicks. This is extremely common behaviorally, and I’m not aware of any standard counting policy at sites. I, of course, assume the worst. I suspect that most of Nicole’s discrepancy comes from these simple counting methodology issues. Other such issues include counting search engine robot traffic (which can mean many, many added clicks) and counting traffic from the site, client, and agency themselves. Some sites (particularly the ones that offer “optimization” packages, where they observe performance to winnow down to the optimal banner or media placement) tend not to include all this superfluous traffic, because it hinders their accuracy in optimization.
  3. The site is baldly defrauding buyers. This, fortunately, is the least likely scenario. I’ve come across it in the distant past, back before the West was won, but not at all recently. You might be surprised at which organizations tended to harbor the most dishonest reps, as they weren’t necessarily the small shops.
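To make the counting-methodology issues above concrete, here is a minimal sketch of how a site that wanted honest numbers might filter its click log. The log format, field names, and five-second deduplication window are all my own assumptions for illustration, not any site’s actual practice:

```python
from datetime import datetime, timedelta

# Hypothetical click records: (user_id, timestamp, user_agent)
clicks = [
    ("u1", datetime(2000, 1, 10, 9, 0, 0), "Mozilla/4.0"),
    ("u1", datetime(2000, 1, 10, 9, 0, 1), "Mozilla/4.0"),     # double-click
    ("u2", datetime(2000, 1, 10, 9, 5, 0), "ExampleBot/1.0"),  # robot traffic
    ("u3", datetime(2000, 1, 10, 9, 6, 0), "Mozilla/4.0"),
]

ROBOT_MARKERS = ("bot", "crawler", "spider")   # crude user-agent filter
DEDUP_WINDOW = timedelta(seconds=5)            # repeat clicks within 5s count once

def count_valid_clicks(clicks):
    """Count clicks, ignoring robots and rapid repeat clicks by the same user."""
    last_counted = {}  # user_id -> timestamp of last counted click
    total = 0
    for user, ts, agent in sorted(clicks, key=lambda c: c[1]):
        if any(marker in agent.lower() for marker in ROBOT_MARKERS):
            continue  # drop robot traffic
        prev = last_counted.get(user)
        if prev is not None and ts - prev < DEDUP_WINDOW:
            continue  # drop the double-click
        last_counted[user] = ts
        total += 1
    return total

print(count_valid_clicks(clicks))  # raw log holds 4 clicks; 2 survive filtering
```

A site that skips either filter reports inflated numbers; a site that applies both, as its optimization packages require, reports something much closer to what the client’s tracking URLs will show.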

So what’s a poor media buyer to do? Let’s approach this question in two ways. First, when purchasing media, you want to hold the sites’ feet to the fire. Performance-based media deals give them an incentive to deliver quality media, but real performance is measured by the traffic arriving at your client’s site, not theirs. This means you should try to base the performance criteria on your client’s numbers. Unfortunately, not many sites are willing to do this, but it’s worth asking.

That’s why we have to pursue accuracy in reporting in another way. We frequently purchase media with inaccurate performance reporting, but we have to collect the proper numbers separately in order to make intelligent decisions on creative and media performance. This means getting your client’s click numbers by media buy and by creative so you can make smart choices.

Here’s a real-world example of why this is important. People new to the web tend to double-click on links and banners; experienced users tend to single-click. So a site with a greater proportion of newbies looking about is going to report, magically, a higher click rate. But that rate will be inflated. And if you reward the newbie site with more media dollars as a result, you’ll be throwing those dollars away. If you instead compared clicks from your client’s own data, you’d see the true relative performance.
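The comparison above can be sketched in a few lines. The placement names, click tallies, and impression counts here are invented for illustration; the point is only the arithmetic of putting the sites’ numbers next to the client’s:

```python
# Hypothetical weekly tallies, keyed by media placement
site_reported   = {"newbie-portal": 800, "veteran-news": 500}  # sites' click reports
client_measured = {"newbie-portal": 400, "veteran-news": 450}  # client's tracking URLs
impressions     = {"newbie-portal": 40000, "veteran-news": 30000}

for placement in site_reported:
    reported_ctr = site_reported[placement] / impressions[placement]
    true_ctr = client_measured[placement] / impressions[placement]
    ratio = site_reported[placement] / client_measured[placement]
    print(f"{placement}: reported CTR {reported_ctr:.1%}, "
          f"client CTR {true_ctr:.1%}, "
          f"site counted {ratio:.1f}x the client's clicks")
```

On the sites’ own numbers, the newbie-heavy placement looks like the clear winner (2.0% versus 1.7%); on the client’s numbers, the other placement actually performs better (1.5% versus 1.0%). That reversal is exactly the reason to judge performance from your client’s data.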

Eventually, enough buyers will be savvy enough to reward the sites with the greater true performance and all sites will use better reporting methodologies. But, for now, we live in a media environment in which we buyers are along for the ride in a race to the bottom of reporting methodology. No matter the method, the higher numbers win. We just need to be patient, and, in the meantime, employ our own numbers to gauge performance.
