If IAB and InterWatch both measure the same thing, then why are they so radically different?
In one corner is the Internet Advertising Bureau, whose quarterly study deliberately omits publisher-specific data to protect publisher confidentiality. In the other corner is the monthly InterWatch report (previously AdSpend), which contains very granular data. The IAB says $351.3 million was spent on online advertising in Q1 1998. InterWatch says $194.5 million.
Still, with a $157 million difference between the two, which is more accurate? And how can you use either given that big a variance? It comes down to two issues: methodology and what you need.
Macro vs. Micro
The IAB commissioned Price Waterhouse Coopers to conduct an ongoing study of internet advertising revenues, with the goal of providing an accurate barometer of internet advertising growth. Competitive Media Reporting conducts monthly research to determine media spending and revenue, covering specific publishers and advertisers.
They are different!
An analysis of each study (aggregated quarterly) since the first quarter of 1996 shows some interesting results. The two studies are, in fact, VERY different.
InterWatch has become a tool we must use because there are no better alternatives. Originally named AdSpend, it was the brainchild of James Kennedy at Caddis International in December 1995. Caddis sold AdSpend to Jupiter Communications in May 1996. Then Jupiter sold AdSpend to Competitive Media Reporting in January 1997 (where it is known as the InterWatch report). Its methodology appears to remain consistent, for the most part.
The IAB study has rested on a steadier foundation since its creation in 1996.
Interestingly, as the industry has grown, so has the disparity between the two reports. For example, InterWatch reported the 1996 ad industry at a size that was 83 percent of that reported by the IAB study. In 1997, however, they reported an industry figure at 60 percent of the IAB study.
And the gap has continued to widen. In the latest reported quarter, Q1 1998, InterWatch's figure was only 55 percent of the IAB's.
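For concreteness, that 55 percent figure falls straight out of the two Q1 1998 totals cited at the top of this article; a trivial check:

```python
# InterWatch vs. IAB totals for Q1 1998, in millions of dollars
# (the two figures cited at the start of the article).
iab_total = 351.3
interwatch_total = 194.5

ratio = interwatch_total / iab_total
print(f"InterWatch reports {ratio:.0%} of the IAB figure")
# prints "InterWatch reports 55% of the IAB figure"
```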
Collection Methodologies Differ
Price Waterhouse Coopers sticks to good ol’ mail surveys for its study. Surveys may be a little primitive, but they sure are effective.
The IAB study, then, is based primarily on historical figures reported directly by publishers (keep in mind, IAB members make up much of the surveyed audience, so some figures could be inflated). InterWatch prefers a multi-tiered approach, building its numbers from the ground up. InterWatch makes estimates based on banner impressions and rate cards, then, much like Price Waterhouse Coopers, uses public information to check and adjust the data. As a final step, InterWatch conducts personal interviews with publishers.
|        | IAB | InterWatch |
|--------|-----|------------|
| Step 1 | Quantitative survey mailing | Obtains rate cards and other public data |
| Step 2 | Identifies non-participating companies and applies a conservative estimate based on public sources | Automatically and manually collects site data; multiplies ad banners by rate cards |
| Step 3 | | Uses public information to check and possibly adjust rate card rates closer to street value |
| Step 4 | | Conducts personal interviews with web publishers and obtains audit statements |

Source: IAB, InterWatch
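InterWatch's bottom-up arithmetic (steps 2 and 3 above) can be sketched roughly as follows. All figures, site names, and the street-price discount here are hypothetical illustrations, not numbers from either report:

```python
# Hypothetical sketch of a bottom-up spend estimate in the InterWatch
# style: counted impressions times rate-card CPM, then adjusted toward
# street prices (ads rarely sell at full rate card).
def estimate_site_spend(impressions, rate_card_cpm, street_factor=1.0):
    """Spend = (impressions / 1000) * CPM, scaled by a street-price
    adjustment factor (e.g. 0.8 for a 20% discount off rate card)."""
    return impressions / 1000 * rate_card_cpm * street_factor

# Illustrative sites and numbers only.
sites = [
    {"name": "example-portal", "impressions": 40_000_000, "cpm": 25.0},
    {"name": "example-news",   "impressions": 12_000_000, "cpm": 35.0},
]

total = sum(
    estimate_site_spend(s["impressions"], s["cpm"], street_factor=0.8)
    for s in sites
)
print(f"Estimated monthly spend: ${total:,.0f}")
# prints "Estimated monthly spend: $1,136,000"
```

The final interview step (step 4) then serves as a sanity check on estimates like these, which is why counting standards matter so much to the method's credibility.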
Whereas the IAB study is based on historical, factual data, its collection methodology leaves some room for human error and intervention in reporting accurate, objective numbers. InterWatch reports more independent figures by not relying primarily on publisher-reported numbers. But on the surface, InterWatch's estimating methodology seems an impossible task: counting every single impression when counting standards have not been established.
‘Nuts and Bolts’ Features Differ
The reports differ greatly in their presentation. The IAB aggregates its data, reporting only down to the level of industry classifications. InterWatch reports on specific advertisers and publishers.
But one similarity exists: both treat barter as minor. InterWatch calls barters "uncommon," while the IAB reports that just 2 percent of ads are bartered.
|  | IAB | InterWatch |
|---|-----|------------|
| Sites Measured | 1200 (generate more than $5000 in online ad revenue per month) | 330 (claims to account for 90% of all spending; sites selected based on ability to attract national advertisers) |
| International | Presumably | No |
| Sites Reported | No | Yes |
| Advertisers Reported | No | Yes |
| Trend Analysis | Yes | No |
| Industry Classifications | Yes | Yes |
| Conducted by | Price Waterhouse Coopers | Competitive Media Reporting |
| Ad Forms Measured | All forms: web ads, banners, email, online services, push | Standard banners and interstitials, keyword advertising, and sponsorships |
| Approximate delivery time of report from period of measurement | 4 months | 8 months |

Source: IAB, InterWatch
|
The biggest differences that may affect overall reported numbers are in sites measured and ad forms measured.
The IAB measures 1200 sites, or 200 companies. InterWatch measures 330 sites. And InterWatch measures mostly banner-style transactions, while the IAB measures pretty much every ad form on the internet.
Other differences to note include geography measured and delivery of the report. InterWatch does not measure international ad activity. Although the IAB report doesn't say explicitly, I would guess it does measure international activity.
And finally, the biggest sticking point for all InterWatch customers: delivery. InterWatch is notorious for delivering reports six to eight months late. By the time you get the report, the data lacks the timeliness to be useful or actionable. All this in internet time!
The Upshot
So back to the original question: Which is more accurate? Well, neither or both, depending on your perspective.
Each report has its own distinct advantages. Both are right in their own respects; they just take different routes to the same end point. In fact, I like both and use both. And there are ways to use them together for your own rough, non-statistical estimates.
My advice: Stick to the IAB report as a macro-level, industry barometer. Use the InterWatch report for micro-level estimates of specific advertisers and publishers.
The best way to make them work together is this: calculate a fudge factor. As a rule of thumb, I would be inclined to multiply InterWatch data by 150 percent to approximate an industry-level figure (since the InterWatch/IAB ratio has fallen from 83 to 55 percent over two years).
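A minimal sketch of that fudge-factor adjustment, using the Q1 1998 figures cited earlier (the 150 percent multiplier is a rule of thumb averaged over the two-year trend, not a fitted value):

```python
# Rule-of-thumb reconciliation: scale InterWatch's bottom-up figure
# by 150% to approximate the IAB's macro-level total.
FUDGE_FACTOR = 1.5  # rule of thumb from the text, not a fitted value

interwatch_q1_1998 = 194.5  # $ millions, per the InterWatch report
adjusted = interwatch_q1_1998 * FUDGE_FACTOR
print(f"Adjusted estimate: ${adjusted:.2f}M (IAB reported $351.30M)")
# prints "Adjusted estimate: $291.75M (IAB reported $351.30M)"
```

Note that for Q1 1998 specifically, matching the IAB total exactly would take a factor closer to 1.8 (1 / 0.55); the 1.5 multiplier splits the difference across the whole 83-to-55-percent slide.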
All in all, media reporting and competitive-spending measurement on the 'net are still evolving and developing, much like the medium itself.