Why Traffic and Analytics Matter to Media Buyers

With Google’s release of Google Analytics (formerly Urchin) in mid-November as a free Web analytics tool came a maelstrom of controversy and industry analysis, much of which has already been discussed within the ClickZ Network (here and here, for example). Some say this will hurt the Web analytics industry; others say it will only make it better for all industry players; still others focus on what I call the ugh factor. That is, if your site uses Google Analytics, the same company from which you probably buy advertising (Google AdWords) now has access to all your site’s data, including sales or conversion data.

Whatever “don’t be evil” mantra Google says it operates by, this data exposure still raises a red flag in my mind.

As a media buyer, though, what interests me is Web analytics and the measurement of site and Web traffic. How much and how well do media-buying agencies take advantage of the information in such databases? How can we use this information to develop and implement strategy on an ongoing basis?

First, let’s talk Web analytics solutions. Web analytics is hugely important to CRM (define) folks, but we ad agencies should also be mining data from Web analytics tools. Dennis Mortensen, COO of the Web analytics company Indextools, says, “The dirty secret of the Web analytics industry is that 80 percent of customers only use 20 percent of the features. Nevertheless, agencies are more demanding and generally get more use from this functionality.”

If we want to build a new strategy, for example, we should be examining what’s going on with traffic the site already gets. What’s the current level of traffic to the site, or the site section to which we want to drive traffic? From what referrers does that traffic come? Can we get media research direction from this information? What’s the site’s average conversion rate? Should this rate act as a benchmark against which we’ll measure the performance of our campaign?

During an ad campaign, use Web analytics to compare the campaign’s performance against site averages, audit conversion data provided by a third-party ad server, and gauge the impact of the ad campaign on the rest of the site.
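As a rough sketch of the benchmarking step described above, the arithmetic is simple: pull a site-wide conversion rate from the analytics tool, pull the campaign's rate from the third-party ad server, and compute the lift. All figures and names below are invented for illustration, not from any real analytics product.

```python
# Hypothetical sketch: benchmark a campaign's conversion rate against
# the site average. The numbers are illustrative only.

def conversion_rate(conversions, visits):
    """Conversions per visit, as a fraction of total visits."""
    return conversions / visits if visits else 0.0

# Site-wide baseline (e.g., last quarter, from the Web analytics tool)
site_rate = conversion_rate(conversions=1_200, visits=60_000)

# Campaign traffic (e.g., from the third-party ad server report)
campaign_rate = conversion_rate(conversions=180, visits=6_000)

# Lift of the campaign over the site benchmark
lift = (campaign_rate - site_rate) / site_rate

print(f"Site average:  {site_rate:.1%}")   # 2.0%
print(f"Campaign:      {campaign_rate:.1%}")  # 3.0%
print(f"Lift vs. site: {lift:+.0%}")       # +50%
```

The same comparison, run against individual site sections rather than the whole site, is one way to gauge the campaign's impact on the rest of the site.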

The more robust analytics tools (Omniture, WebTrends, WebSideStory, Indextools, and ClickTracks) all offer excellent features agencies can take advantage of. Among them are filtering reports by client funnels, multivariate testing, integrated PPC (define) bid management, and agency-defined commission markups. Enhancements come out with each new version; WebSideStory is beta-testing a shopping search feed management system, for example.

In addition to reviewing Web analytics information, agencies should also draw on general Internet traffic analysis from companies such as Nielsen//NetRatings, comScore, and Hitwise. All provide a bigger picture than on-site traffic tools can. These companies reveal Web-wide user traffic patterns and trends, so an agency can, for example, analyze increases and decreases in traffic to an advertiser’s competitor, examine specific site referral traffic, and see how overall site traffic is affected by search and advertising campaigns beyond direct click-throughs. Mining this data can identify sites to consider for a media plan.

These companies can provide upstream data (sites visited immediately before a user goes to a specific site) and downstream data (sites visited immediately after). Clickstream data analysis can also reveal expected trends in seasonal traffic.
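The upstream/downstream idea is easy to see in miniature. Below is a minimal sketch, assuming panel clickstream data is available as ordered lists of sites per session; the site names and sessions are invented for illustration.

```python
# Hypothetical sketch: tally upstream and downstream sites for a target
# domain from panel clickstream data. Data is invented for illustration.

from collections import Counter

sessions = [
    ["news.example", "advertiser.example", "shop.example"],
    ["search.example", "advertiser.example", "shop.example"],
    ["search.example", "blog.example"],
]

TARGET = "advertiser.example"

upstream = Counter()    # sites visited immediately before TARGET
downstream = Counter()  # sites visited immediately after TARGET

for path in sessions:
    # Walk each consecutive pair of sites in the session
    for prev_site, next_site in zip(path, path[1:]):
        if next_site == TARGET:
            upstream[prev_site] += 1
        if prev_site == TARGET:
            downstream[next_site] += 1

print("Upstream:", upstream.most_common())
print("Downstream:", downstream.most_common())
```

At panel scale, the same tallies over months of sessions are what surface the seasonal trends mentioned above.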

Because large agencies are doing more with search, the latest developments in traffic measurement companies’ toolsets revolve around search analysis. Seems like my ad/search agency convergence prediction is being validated.

Then you have Alexa. I used to take Alexa data with a grain of salt because, as its own site states, “Alexa could not exist without the participation of the Alexa Toolbar community.” The Alexa Toolbar is an application that, once installed, sends data back to Alexa about sites its users visit. Alexa’s user base tends to be online marketers and technology early adopters, so its data can be considered skewed.

Since it was launched in 1996, Alexa’s user base has reached enough of a critical mass (over 10 million downloads) to make its information worth looking at. Check out Alexa’s useful features, including Related Links, Movers & Shakers, Top Sites, Site Contact, Sites Linking In, Traffic Rank, and Traffic Trend Graphs.

My ideal future tool: an ad server plus analytics plus Web traffic data source.