In late February 2008, Google's stock dropped 4 percent in one day after comScore released data suggesting that growth in the search giant's paid search clicks was decelerating. comScore claimed Google's click growth had dropped from 37 percent in October '07 to 0.3 percent in January '08. Soon after, comScore clarified that the drop-off wasn't a sign of weakening, but rather the result of Google's click quality programs. When Google released its 2008 first-quarter earnings, it blew away all talk of weakness: strong click growth of 20 percent year-over-year and revenues of $5.2 billion, up 42 percent year-over-year.
This isn't the first time the accuracy of a ratings company's data has been called into question. For top publishers that live or die by numbers, ratings companies' methodologies often don't measure up. In September 2007, Spanish newspaper "El Pais" sued Nielsen Online for downgrading the number of unique visitors the site received. "El Pais" claimed the downgrade would result in a $1.4 million loss in ad sales because media buyers rely on panel-rating companies to make media buying decisions.
Problem is, no single measurement system is without flaws. The grey areas caused by relying on panel ratings, raw server data, tagged pages, downloaded software, and even public opinion almost ensure that wise media buyers will have only a fuzzy loyalty to the numbers.
Currently, ratings companies gather metrics, click rates, and traffic in several ways. Nielsen NetRatings, comScore, and Compete rely on panel-based measurement, extrapolating numbers from a "representative panel" of users. But just how accurate is that representative pool for any given buy? Hitwise doesn't use panels; it instead analyzes anonymous data provided by ISPs. But how representative are the participating ISPs? A movement toward more "open" ratings systems has brought companies like Quantcast into the arena, whose method combines directly measured audience data (publishers place tracking code on their pages and share the results) with panel-based estimates. For non-participating sites, though, Quantcast's estimates can be extremely off base. Companies like Alexa rely on numbers from people who've downloaded and use their toolbar -- who in most cases tend to be online marketing people, which skews the data.
In addition, publishers often employ their own analytics programs or server logs to provide advertisers with data on unique visitors and page views. The smaller the publisher, the more you'll want to probe to find out exactly which analytics program it uses to measure traffic. Countless other articles have been written analyzing the discrepancies among the data these various measurement systems provide. What a mess when you think about it!
On top of all this inconsistency, bear in mind that many measurement tools are expensive. Large agencies might buy licenses to more than one system, but few others can afford to.
Considerations for Media Buyers
Is any one measurement system better than another? How does a savvy buyer mitigate the discrepancies? How can these tools -- warts and all -- be best utilized by media buyers to influence purchasing decisions? What new measurement tools are on the horizon due to the Web 2.0 phenomenon?
The media buyer needs a place to start assessing a site -- a ratings tool -- but must keep in perspective that none of these measurements is perfect. We've seen roughly a 10 percent discrepancy among various ratings systems for the same sites. Media buyers must rely on comparative research and trained judgment. Breaking the addiction to numbers can also be liberating. Many niche sites don't even register on the ratings companies' radars, yet they might be a better fit for the strategy than a larger site's "channel" buy. You may pay more to reach a smaller audience, but achieve better brand impact or a higher conversion rate because the product or service is more relevant to that audience.
Most media buyers rely on an array of toolsets to help them research and assess sites. In addition to ratings tools, old-fashioned search ranks way up there, as does reading media news and fielding solicitations from diligent media reps. Create your own media database so you can record the smaller, less popular sites you find. Keep your eye out for social ratings tools, like those found on Balihoo or Technorati. And, of course, rely on your own historical experience with a site and a particular kind of advertiser.
Experience counts for a lot in this business.
A ClickZ expert columnist since 2005, Hollis Thomases (@hollisthomases) is president and founder of Maryland-based WebAdvantage.net, an online marketing company that provides results-centric, strategic Internet marketing services, including online media planning, SEO, PPC campaign management, social media marketing, and Internet consulting. Author of Twitter Marketing: An Hour a Day and an award-winning entrepreneur, Hollis is the Maryland 2007 SBA Small Business Person of the Year. Hollis speaks extensively on online marketing, having presented for ClickZ, the American Marketing Association, SES, The Newsletter and Electronic Publishers Association, The Kelsey Group, and the Vocus Worldwide User Forum. WebAdvantage.net's client list has included Nokia USA, Nature Made Vitamins, Johns Hopkins University, ENDO Pharmaceuticals, K'NEX Construction Toys, and Visit Baltimore. The agency was recognized as a "Small Giant" by the Greater Baltimore Tech Council and was chosen as a "Best Place for Business Women to Work" by "Smart Woman Magazine."
March 19, 2014