Whose numbers do you trust? Tom believes that question may soon be a bit easier to answer.
Two weeks ago, I wrote a piece on the Interactive Advertising Bureau's (IAB's) effort to address the issue of spiders' and robots' effects on reported traffic. Working with ABC Interactive (ABCi), the IAB is making a list of spiders and robots to make it easier to filter this activity from traffic reports. At the time, I made a bit of a gaffe when I explained that this list was sorted by IP address. Thankfully, ABCi was quick to call me on it, and I had a more in-depth conversation with the organization about how this database will help the industry.
The database uses user-agent strings, not IP addresses as I had originally claimed. User-agent strings are what identify browser clients to Web servers. For example, a user surfing the Web with Netscape 4.0 might send a user-agent string containing "Mozilla/4.02." Spiders, particularly the ones originating from the search engines, usually identify themselves through the user-agent string. A spider from AltaVista might include "Altavista" in its user-agent string, making it easily identifiable. This makes the IP address matching issues I mentioned in the prior article somewhat moot.
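To make the idea concrete, here is a minimal sketch of how a subscriber might apply such a database to a standard Web server log. The spider signatures and log lines below are illustrative assumptions, not the actual contents or format of the ABCi database:

```python
import re

# Hypothetical user-agent substrings; in practice these would come
# from the ABCi database, whose real entries are not shown here.
SPIDER_SIGNATURES = ["altavista", "googlebot", "slurp"]

def is_spider(user_agent: str) -> bool:
    """Return True if the user-agent matches a known spider signature."""
    ua = user_agent.lower()
    return any(sig in ua for sig in SPIDER_SIGNATURES)

def filter_log(lines):
    """Yield only the log lines whose user-agent is not a known spider."""
    # In the common "combined" log format, the user-agent is the
    # last double-quoted field on the line.
    ua_pattern = re.compile(r'"([^"]*)"\s*$')
    for line in lines:
        match = ua_pattern.search(line)
        if match and is_spider(match.group(1)):
            continue  # drop spider traffic from the report
        yield line
```

Run over a day's access log, this would strip the known commercial spiders before page views are tallied, which is exactly the after-the-fact report cleanup the database is meant to support.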
In speaking with a representative from ABCi last week, I also got some insight into how the organizations envision the database being used. Their tool is meant to pick up on what they call commercial spiders -- the ones that are in regular use by search engines and other online information services and that make a material impact on traffic reports. The database is not meant to be a catchall for all types of spiders and robots. ABCi was careful to make a distinction between commercial spiders and personal ones. An example of a personal spider might be the bot written by a computer science student, as in the example I gave in my article.
The database wasn't designed to be used as a real-time filtration system. Subscribers can use the data it contains to filter traffic reports after the fact, but it may not be suitable for real-time filtration of things such as advertising activity reports. That raises a question: Ad servers have been filtering spiders by observing their behavior for years. Wouldn't it be better to filter spiders at the server level the same way?
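Behavioral filtering of the kind ad servers do can be as simple as watching request rates. The sketch below flags any client that exceeds a hit-rate threshold inside a sliding time window; the threshold and window values are illustrative assumptions, not an industry standard:

```python
from collections import defaultdict

def flag_by_behavior(requests, max_hits=30, window=60.0):
    """Flag client IPs exceeding max_hits requests per `window` seconds.

    `requests` is an iterable of (timestamp, client_ip) pairs in time
    order. The defaults are made-up illustrative values; real ad
    servers tune such thresholds to their own traffic.
    """
    hits = defaultdict(list)   # client_ip -> recent request timestamps
    flagged = set()
    for ts, ip in requests:
        # Keep only the timestamps still inside the sliding window.
        recent = [t for t in hits[ip] if ts - t <= window]
        recent.append(ts)
        hits[ip] = recent
        if len(recent) > max_hits:
            flagged.add(ip)    # faster than any human browsing pattern
    return flagged
```

A heuristic like this catches unknown or one-off spiders that no signature list will ever contain, which is why it complements rather than replaces a database of known user-agents.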
Perhaps getting rid of these spurious numbers might best be achieved by a combination of solutions. Behavioral filtration can take care of the "little guys," while the ABCi database filters the ones that are known for causing the big problems.
What we know for certain is that cached ad activity, diverse counting methodologies, and the presence of spiders and robots are the leading causes of discrepancies between advertiser-side and publisher-side ad activity reports. Many agencies are able to synch their activity reports with those of publishers to within 10 percent by addressing the first two issues. Based on my past experience with spiders and robots, I can say that this mechanical activity may be responsible for a good chunk of that remaining discrepancy.
If advertisers and publishers can get their reports synched up fairly well, we will have made significant progress toward addressing the counting methodology problem that has plagued our industry since ad serving first appeared on the scene.
Tom Hespos heads up the interactive media department at Mezzina Brown & Partners. He has been involved in online media buying since the commercial explosion of the Web and has worked at such firms as Young & Rubicam, K2 Design, NOVO Interactive/Blue Marble ACG, and his own independent consulting practice, Underscore Inc. For more information, please visit the Mezzina Brown Web site. He can be reached at email@example.com.