Whose numbers do you trust? Tom believes that question may soon be a bit easier to answer.
Two weeks ago, I wrote a piece on the Interactive Advertising Bureau's (IAB's) effort to address the issue of spiders' and robots' effects on reported traffic. Working with ABC Interactive (ABCi), the IAB is compiling a list of spiders and robots so that this activity can more easily be filtered from traffic reports. At the time, I made a bit of a gaffe when I explained that this list was sorted by IP address. Thankfully, ABCi was quick to call me on it, and I had a more in-depth conversation with the organization about how this database will help the industry.
The database uses user-agent strings, not IP addresses as I had originally claimed. User-agent strings are what identify browser clients to Web servers. For example, a user surfing the Web with Netscape 4.0 sends a user-agent string containing "Mozilla/4.02." Spiders, particularly the ones originating from the search engines, usually identify themselves through the user-agent string. A spider from AltaVista might include "AltaVista" in its user-agent string, making it easily identifiable. This makes the IP address matching issues I mentioned in the prior article somewhat moot.
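To make the idea concrete, here is a minimal sketch of how a traffic report might be filtered against a list of known spider user-agent substrings. The list entries and log format below are hypothetical illustrations, not the actual contents of the ABCi database:

```python
# Hypothetical list of substrings identifying known commercial spiders.
# The real ABCi database would supply these entries.
KNOWN_SPIDER_SUBSTRINGS = ["altavista", "googlebot", "slurp"]

def is_known_spider(user_agent: str) -> bool:
    """Return True if the user-agent string matches a known spider entry."""
    ua = user_agent.lower()
    return any(s in ua for s in KNOWN_SPIDER_SUBSTRINGS)

def filter_log(entries):
    """Keep only log entries whose user-agent is not a known spider."""
    return [e for e in entries if not is_known_spider(e["user_agent"])]

# Example log entries (hypothetical format).
log = [
    {"ip": "10.0.0.1", "user_agent": "Mozilla/4.02 [en] (WinNT; I)"},
    {"ip": "10.0.0.2", "user_agent": "AltaVista Intranet V2.0"},
]
human_traffic = filter_log(log)  # the AltaVista entry is dropped
```

Because the match is on the user-agent string rather than the IP address, a spider is caught no matter which machine it happens to crawl from.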
In speaking with a representative from ABCi last week, I also got some insight into how the organizations envision the database being used. Their tool is meant to pick up on what they call commercial spiders -- the ones that are in regular use by search engines and other online information services and that make a material impact on traffic reports. The database is not meant to be a catchall for all types of spiders and robots. ABCi was careful to make a distinction between commercial spiders and personal ones. An example of a personal spider might be the bot written by a computer science student, as in the example I gave in my article.
The database wasn't designed to be used as a real-time filtration system. Subscribers can use the data to filter traffic reports after the fact, but it may not be suitable for real-time filtration of things such as advertising activity reports. This raises a question: Ad servers have been filtering spiders for years by observing their behavior. Would it not be better to do the filtering at the server level the same way?
Perhaps getting rid of these spurious numbers might best be achieved by a combination of solutions. Behavioral filtration can take care of the "little guys," while the ABCi database filters the ones that are known for causing the big problems.
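A sketch of that combined approach might look like the following. The spider list entries, the log format, and the request-rate threshold are all hypothetical assumptions for illustration; real behavioral heuristics are considerably more sophisticated:

```python
from collections import Counter

# Hypothetical known-spider list (what the ABCi database would supply).
KNOWN_SPIDER_SUBSTRINGS = ["altavista", "googlebot"]

# Hypothetical behavioral cutoff: more requests per minute than any
# plausible human visitor would generate.
REQUESTS_PER_MINUTE_THRESHOLD = 120

def is_listed_spider(user_agent: str) -> bool:
    """List-based check: match the user-agent against known spiders."""
    ua = user_agent.lower()
    return any(s in ua for s in KNOWN_SPIDER_SUBSTRINGS)

def behavioral_suspects(entries, threshold=REQUESTS_PER_MINUTE_THRESHOLD):
    """Behavioral check: flag IPs exceeding the per-minute request rate."""
    counts = Counter((e["ip"], e["minute"]) for e in entries)
    return {ip for (ip, _minute), n in counts.items() if n > threshold}

def filter_traffic(entries):
    """Drop entries caught by either the list or the behavioral check."""
    suspects = behavioral_suspects(entries)
    return [
        e for e in entries
        if not is_listed_spider(e["user_agent"]) and e["ip"] not in suspects
    ]
```

The list-based check catches the well-known commercial spiders even when they behave politely, while the behavioral check catches the anonymous "little guys" the list will never contain.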
What we know for certain is that cached ad activity, diverse counting methodologies, and the presence of spiders and robots are the leading causes of discrepancies between advertiser-side and publisher-side ad activity reports. Many agencies are able to synch their activity reports with those of publishers to within 10 percent by addressing the first two issues. Based on my past experience with spiders and robots, I can say that this mechanical activity may be responsible for a good chunk of that remaining discrepancy.
If advertisers and publishers can get their reports synched up fairly well, we will have made significant progress toward addressing the counting methodology problem that has plagued our industry since ad serving first appeared on the scene.
Tom Hespos heads up the interactive media department at Mezzina Brown & Partners. He has been involved in online media buying since the commercial explosion of the Web and has worked at such firms as Young & Rubicam, K2 Design, NOVO Interactive/Blue Marble ACG, and his own independent consulting practice, Underscore Inc. For more information, please visit the Mezzina Brown Web site. He can be reached at email@example.com.