Just in the last couple of months, we have dealt with some tracking questions and issues. Throughout my career, these types of issues have persisted among agencies, media vendors, ad servers, Webmasters, and clients. One thing’s clear: the terms and procedures around tracking have not been standardized. As a result, there’s lots of confusion in the marketplace around tracking and analytics.
This two-part column will examine five common tracking and reporting issues and offer sensible approaches and explanations to deal with them. The tracking issues are:
- Why online reports aren’t 100 percent accurate.
- Why view-through conversions are not conversions.
- How double counting occurs in different systems.
- Why numbers from different tracking systems don’t always have to match.
- How to compare first-click conversions versus last-click conversions.
In this column, I will dig deep into the first two items and follow up next time with the other three.
Why Online Reports Are Not 100 Percent Accurate
Everyone should understand that online reports are not 100 percent accurate. And in reality, they don’t need to be to facilitate campaign optimization. Most tracking systems are based on cookies. Some percentage of people do things like crank up their security settings and occasionally clear out their cookies. Some actions and pixel requests misfire or malfunction and don’t get counted. As much as we want technology to be perfect, it isn’t.
As a result, not all actions or sales that occur from a click are captured. This is OK because the tracking isn’t for accounting or financial purposes. Instead, it’s designed to give us near-real-time directional data on where to invest our ad dollars: it’s for fact-based decision making. Before this level of tracking data was available, people had to wait days or months to analyze campaign results. Retail guru John Wanamaker famously quipped, “I know half my advertising is wasted. I just don’t know which half.” With online tracking, we can get that directional information, focus our advertising investments on what works, and minimize waste. That’s the point, and we can do it without being 100 percent accurate! (But don’t think I don’t care that it’s not 100 percent accurate; I’m simply stating how it is right now.)
Why View-Through Conversions Are Not Conversions
Early on, I was very supportive of view-based conversions. I still am — if their true purpose and use is accurately presented to the client.
View-through (or view-based) conversions are measured via cookies: when a banner is served to a browser, a cookie is set, even if the person never clicks that banner. If the person later reaches the client site some other way, such as through a search query or a direct hit, and converts, the tracking system reports that conversion as a view-based conversion. The idea, as I understood it, was that people who were exposed to online ads converted at a higher rate than site visitors who had not been exposed to ads. I support that notion and the desire to prove that banner impressions lift conversions; it shows the value of online ad impressions from an influence standpoint.
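To make that attribution logic concrete, here is a minimal sketch in Python. The function name, the fields, and the 30-day lookback window are all my own illustrative assumptions, not any vendor’s actual system:

```python
from datetime import datetime, timedelta

VIEW_WINDOW = timedelta(days=30)  # hypothetical lookback window for view-through credit

def attribute_conversion(conversion_time, impression_cookie, click_cookie):
    """Classify a conversion as click-based, view-based, or unattributed.

    impression_cookie / click_cookie are the timestamps (or None) of the
    ad-server cookies found on the converting browser.
    """
    if click_cookie is not None:
        # The person actually clicked the banner at some point.
        return "click-based"
    if impression_cookie is not None and conversion_time - impression_cookie <= VIEW_WINDOW:
        # The banner was merely served to the browser; the person may never
        # have seen it, yet the network still claims this conversion.
        return "view-based"
    return "unattributed"

# A user who was served a banner two weeks ago, never clicked it,
# and later converts after arriving via a search ad:
now = datetime(2009, 3, 1)
print(attribute_conversion(now, impression_cookie=now - timedelta(days=14), click_cookie=None))
# -> view-based
```

The key point the sketch makes visible: the view-based branch fires on nothing more than a served-ad cookie, with no click anywhere in the chain.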
Somewhere along the way, however, the use and definition of view-based conversions got fuzzy. These conversions were reported alongside click-based conversions and attributed to the media source with the same weight as click-based conversions.
For example, many banner networks have added view-based conversions to their metrics, and some present them as if the network were somehow responsible for those conversions simply because an ad it served set a cookie on the viewer’s browser. That doesn’t even mean the person saw the ad (e.g., if it rendered below the fold of the browser page). Furthermore, if you’re running run-of-network (RON) ads across the Web, chances are a huge percentage of Web users will be exposed to your banners and get a view-based cookie set on their browsers. When they later come to your site via a direct hit, a paid search ad, or any other way and convert, the network adds a conversion or action to its total and takes credit for it. This leads to a couple of very disturbing, distorting data points.
First, it grossly inflates the network’s reported conversions and ROI. I’ve taken over many accounts where a client had raved about a network’s conversion/ROI success. As soon as I showed them how to drill down on the conversions and strip out the view-based ones, the whole story changed. It wasn’t good news, but it was critical to show the real picture. Agencies, sites, and networks shouldn’t take credit for view-based conversions the same way they take credit for click-based conversions (where the banner was actually clicked on).
Second, when you count view-based conversions the same as click-based conversions, you can end up double-counting conversions when you bring all your data together into a consolidated dashboard. (I’ll explore this next time.)
Remember: don’t count view-based conversions as conversions! View-based conversions are an indicator of banner advertising’s influence on a consumer’s propensity to convert. They shouldn’t be part of your total conversions; real conversions should have some sort of source click attached to them.
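If your reporting tool can export each conversion with its attribution type, separating the two is simple filtering. Here is an illustrative Python sketch; the record layout and field names are made up for the example:

```python
# Hypothetical export of conversions tagged with their attribution type.
conversions = [
    {"source": "banner_network", "type": "click-based", "revenue": 120.0},
    {"source": "banner_network", "type": "view-based",  "revenue": 80.0},
    {"source": "paid_search",    "type": "click-based", "revenue": 200.0},
]

# Only click-based conversions count toward the real totals; view-based
# ones are kept aside as an influence indicator, not added to the sum.
real = [c for c in conversions if c["type"] == "click-based"]
total_revenue = sum(c["revenue"] for c in real)

print(len(real), total_revenue)
# -> 2 320.0
```

In this toy data, counting the view-based row would have inflated the network’s credited revenue by a third, which is exactly the distortion described above.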
Be sure to read my next column, where we’ll continue this important discussion. And of course, reach out to me with your comments!
Meet Harry at Search Engine Strategies New York March 23-27 at the Hilton New York. The only major search marketing conference and expo on the East Coast, SES New York will be packed with more than 70 sessions, including a ClickZ track, plus networking events, parties, training days, and more than 150 exhibitors.