Social Media Benchmarks: Realities and Myths

Benchmarks to avoid and others to embrace.

We’ve talked about the ROI for social marketing and the new metrics, like social intensity, that should be used to measure how modern consumers respond to advertising and marketing. But many clients still ask about benchmarks: “What are good CTRs (click-through rates), CPCs (costs per click), CPMs (costs per thousand impressions), and so on, so I know how my programs stack up?”

Well, there’s good reason those benchmarks are hard to find. Lacking a reliable source, I ran my own analysis over the last three years and came up with many eye-opening results. Let’s look at those benchmarks and observations, and also do some myth busting for advertisers and their agencies.

Time on Site Is a Bad Metric

Time on site is still often referred to as if it’s a useful metric of a program’s success. However, this metric is quite possibly one of the worst indicators of “engagement” because so many factors can increase or decrease it without having anything to do with a user’s interest.

For example, if someone opens a page in a Web browser, then takes a 30-minute phone call before coming back to the page, that “time on site” is artificially inflated. Or if someone can’t find something and spends several frustrating minutes hunting around the site, the extra time doesn’t indicate greater interest. In fact, the person may get increasingly annoyed at the site and never come back.

Other actions, such as adding video to a site, also increase the average time on site simply because users are spending time finding a video or watching it. It doesn’t mean they’re any more or less inclined to complete an action desired by the advertiser, like make a purchase.

Other factors artificially decrease time on site. For example, a person who finds what they’re looking for immediately through site search leaves quickly and completely satisfied; the short visit signals success, not disinterest.

And finally, if someone stays on only one page (e.g., an all-Flash Web site), many analytics packages can’t report time on site at all, because there is no second pageview that can be used to calculate the end time.
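To see why single-page visits break the metric, here’s a minimal sketch of how many analytics packages derive time on site: the duration is simply the gap between the first and last pageview timestamps, so a one-page visit has no end point at all. The `time_on_site` helper and the timestamps below are hypothetical illustrations, not any particular vendor’s implementation.

```python
from datetime import datetime

def time_on_site(pageview_timestamps):
    """Naive time-on-site: gap between the first and last pageview.

    Mirrors the common approach: each pageview is a timestamped hit,
    and visit duration is last hit minus first hit.
    """
    if len(pageview_timestamps) < 2:
        return None  # single-page visit: no second hit, so no end time
    ordered = sorted(pageview_timestamps)
    return (ordered[-1] - ordered[0]).total_seconds()

# A three-page visit yields a duration...
print(time_on_site([
    datetime(2009, 6, 1, 9, 0, 0),
    datetime(2009, 6, 1, 9, 2, 30),
    datetime(2009, 6, 1, 9, 41, 0),   # a long phone call inflates this
]))  # 2460.0 seconds

# ...while a one-page (e.g., all-Flash) visit yields nothing at all.
print(time_on_site([datetime(2009, 6, 1, 9, 0, 0)]))  # None
```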

Time on site was a useful construct in the early days of the Internet when the majority of the sites were content sites, such as newspapers. When someone spent more time on the site, it implied they were reading content that they found useful and interesting. This is no longer an accurate or useful indicator of interest.

Takeaway: there are no benchmarks for this metric. Drop it entirely as a success metric.

Search, Display, and Contextual Ads

Three types of ads turned up three different results:

  • Search ads yielded click rates in the 1 percent range (1 click in 100 impressions served).
  • Display ads came in at the 0.1 percent range (1 click in 1,000 impressions).
  • Content/contextual ads, the ones matched to a page based on the words in the surrounding content, came in at the 0.01 percent range (1 in 10,000).

These order-of-magnitude estimates come from well-optimized ads in each format, averaged over long-running campaigns; one campaign has run continuously since 2007. The benchmarks illustrate the relative efficacy of the three ad types.
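The arithmetic behind these figures is simply clicks divided by impressions served. The volumes in this sketch are made up; only the ratios matter, and they reproduce the order-of-magnitude benchmarks above.

```python
def ctr(clicks, impressions):
    """Click-through rate: clicks divided by impressions served."""
    return clicks / impressions

# Illustrative volumes only; the ratios match the benchmarks above
# (1 in 100, 1 in 1,000, 1 in 10,000).
print(f"Search:     {ctr(1_000, 100_000):.4%}")     # ~1%
print(f"Display:    {ctr(1_000, 1_000_000):.4%}")   # ~0.1%
print(f"Contextual: {ctr(1_000, 10_000_000):.4%}")  # ~0.01%
```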

Search ads are arguably more relevant and efficient because the advertiser doesn’t pay until the user clicks. Furthermore, the user is more likely to click on an ad served in response to the keyword or words they’re searching on — it’s timely and related to what they’re looking for.

Contrast this to display ads or content ads that are served next to content that the user is reading, using, or consuming. The Web site visitor is there for the content and is usually pre-conditioned to avoid looking at the top of the page and right-hand margin because ads are known to appear in those areas.

Even contextual ads served up right in the middle of the content — e.g. double-underlined words — are usually more annoying than helpful. Some would argue the clicks on the ad units may be from people trying to close the annoying ad blocking them from reading the content they were after in the first place.

Banner ads on social networks yield even worse CTRs. People are on social networks to socialize with friends. And while that activity generates an ungodly number of page views, the users simply aren’t looking at the ads.

The Scourge of Social Ads

Compared to banner ads served on social networks, social ads in their current forms border on inappropriate, offensive, or at the very least misleading. Writer and speaker David Berkowitz has had a well-documented run-in with social ads that he wrote about here.

One time, my picture was used next to a Visa Business Network ad, prompting a friend of mine to sign up for it, thinking I endorsed it. (I joined the group to see what it was about, but never went back to use it.) We both later quit the group because he felt duped and I felt used.

Other questionable practices of social ads include intermingling with people’s statuses or content streams. Brian Morrissey writes, “In the stream, ads masquerade as content,” and adds, “Here come the brand social marketing bribes.”

Despite the potential for duping people into clicking or signing up for something, social ads’ CTRs remain a rounding error away from zero. So some clever application and widget makers are resorting to citing only the vast reach such campaigns provide, hoping that big advertisers will still buy the notion of reach and frequency the way they do when buying TV spots.

Advertisers are smarter than that and will also stay far away from deliberate attempts at duping potential customers just to get clicks.

Benchmarks: Reality Versus Myths

CPC is a useful benchmark for comparing various types of advertising. Again, using order-of-magnitude comparisons, my analysis showed: PayPerPost marketing yielded $1 to $2 CPCs; AdWords marketing yielded $0.50 CPCs; and StumbleUpon ads yielded $0.05 CPCs. (See social media benchmarks for more detail.)
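The calculation is the same across channels: total spend divided by clicks delivered, which is what makes CPC an apples-to-apples comparison. The spend and click volumes below are hypothetical, chosen only to reproduce the order-of-magnitude CPCs above.

```python
def cpc(spend, clicks):
    """Cost per click: total spend divided by clicks delivered."""
    return spend / clicks

# Hypothetical spend ($) and click volumes, shaped to match the
# order-of-magnitude CPCs reported above.
channels = {
    "PayPerPost":  (1_500, 1_000),   # ~$1.50 CPC
    "AdWords":     (500,   1_000),   # ~$0.50 CPC
    "StumbleUpon": (50,    1_000),   # ~$0.05 CPC
}
for name, (spend, clicks) in channels.items():
    print(f"{name:<12} ${cpc(spend, clicks):.2f} per click")
```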

The quality of visitors arriving on your site varies widely by ad type, which makes it a useful construct for comparing the efficacy and cost efficiency of different ad types or ad channels. For example, visitors from paid ads (banner ads, rich media ads, home page takeovers on massive portal sites) mostly bounce (i.e., leave right away); those who stay, stay a very short time (e.g., five seconds); and they view no more than the one page they landed on.

However, visitors who arrived from organic search have half the bounce rate, stay twice as long, and view three times more pages on average than the paid-ad group. Visitors from paid search resemble the paid-ad group in terms of low “quality.”
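A rough sketch of how such a quality comparison might be computed from a visit log, grouping bounce rate, pages per visit, and time on site by traffic source. The records and numbers are invented for illustration; they are merely shaped to mirror the pattern described above.

```python
from collections import defaultdict

# Each visit record: (traffic_source, pages_viewed, seconds_on_site).
# Hypothetical data shaped to mirror the pattern described above.
visits = [
    ("paid_banner", 1, 5), ("paid_banner", 1, 4), ("paid_banner", 2, 20),
    ("organic_search", 1, 30), ("organic_search", 3, 120), ("organic_search", 4, 90),
]

by_source = defaultdict(list)
for source, pages, seconds in visits:
    by_source[source].append((pages, seconds))

for source, records in by_source.items():
    bounce_rate = sum(1 for pages, _ in records if pages == 1) / len(records)
    avg_pages = sum(pages for pages, _ in records) / len(records)
    avg_time = sum(seconds for _, seconds in records) / len(records)
    print(f"{source:<15} bounce {bounce_rate:.0%}  pages {avg_pages:.1f}  time {avg_time:.0f}s")
```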

Finally, the myth of “view throughs” deserves to be debunked. Because CTRs are so dismally low, ad networks want to claim credit for type-ins (people going to an advertiser’s site by typing the URL instead of clicking on an ad).

This is called a “view through,” and ad networks want credit for these simply for having shown the ad somewhere on their network. It works like this: the user happens across any site in the network, a banner is served, and the user receives a cookie. Then, for an arbitrary time period negotiated with the advertiser, the network claims credit for any visit the user makes to the advertiser’s Web site, even if the user typed in the address manually.
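Here is a sketch of how that attribution logic plays out, under the assumptions just described: a cookie is set when the banner is served, and any visit inside the negotiated lookback window is credited to the network, with the actual referral path never consulted. The function and variable names are illustrative, not any network’s real implementation.

```python
from datetime import datetime, timedelta

VIEW_THROUGH_WINDOW = timedelta(days=30)  # arbitrary, negotiated with the advertiser

def network_claims_credit(impression_cookie_time, site_visit_time, visit_source):
    """Typical view-through logic: credit any visit within the lookback
    window after an impression cookie was set, regardless of how the
    visitor actually arrived."""
    saw_ad_recently = (
        impression_cookie_time is not None
        and timedelta(0) <= site_visit_time - impression_cookie_time <= VIEW_THROUGH_WINDOW
    )
    return saw_ad_recently  # note: visit_source is never consulted

# The visitor typed the URL in directly, yet the network still takes credit.
print(network_claims_credit(
    impression_cookie_time=datetime(2009, 6, 1),
    site_visit_time=datetime(2009, 6, 20),
    visit_source="typed_in_url",
))  # True
```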

Bottom line: A view through is a fictitious metric. If the click or visit isn’t directly attributable, it shouldn’t be attributed.

Recommendations: How Should You Spend Your Money?

  • Spend it on search ads, display ads, and content ads — in that order.
  • Compare apples to apples by calculating CPC across ad types and ad channels.
  • Stop using misleading or irrelevant metrics such as time on site, view-through rate, etc.

Start using new metrics that indicate user interest and intent, such as the following (a rough sketch of how to compute them appears after the list):

  • On-site search. Are your visitors looking for something specific on your site?
  • Time spent with a piece of content, not just on the site in general. Are visitors engaging with a specific article, video, or tool?
  • Site bounce rate (lower is better). Are people on your site finding what they were looking for?
  • Percentage of repeat visits. Do users find your site useful or valuable enough for them to come back?
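A rough sketch of how several of these might be computed from a simple visit log of (visitor ID, whether site search was used, pages viewed). The log and field layout are hypothetical; real analytics data would be richer, but the calculations are this simple.

```python
# Hypothetical visit log: (visitor_id, used_site_search, pages_viewed)
visits = [
    ("a", True, 4), ("a", False, 2), ("b", True, 1),
    ("c", False, 1), ("c", False, 3), ("d", True, 5),
]

visitors = {vid for vid, _, _ in visits}
search_rate = sum(1 for _, used, _ in visits if used) / len(visits)
bounce_rate = sum(1 for _, _, pages in visits if pages == 1) / len(visits)
repeat_rate = sum(
    1 for vid in visitors
    if sum(1 for v, _, _ in visits if v == vid) > 1
) / len(visitors)

print(f"on-site search usage: {search_rate:.0%}")
print(f"bounce rate:          {bounce_rate:.0%}")
print(f"repeat visitors:      {repeat_rate:.0%}")
```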
