
How Heatmaps Can Help

Poynter's eyetracking research could help validate interactive advertising.

Last December, The Poynter Institute observed nearly four dozen individuals as they navigated a number of Web sites, tracking what it calls “user eyeflow” on those sites. Poynter was primarily interested in how users interact with news sites, but it also looked at how they interact with ads before they click on (or away from) them.

What I like about Poynter’s observations is that they can help kick-start an industry conversation about another type of validation for interactive advertising. I applaud the effort and encourage you to familiarize yourself with the work.

The study mocked up news Web sites and article pages that included advertising ranging from the standard 468 x 60 to in-content 300 x 250s, skyscrapers, pop-ups, and text links. Forty-six participants navigated the sites while researchers monitored their eyeflow on the pages. Researchers then aggregated where on each page users spent time looking to create heatmaps that show the concentration of attention. The heatmaps plot what Poynter calls heatspots: aggregate representations of all participants’ eye fixations on a page. Check out this heatmap. It’s really cool stuff.
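
To make that aggregation concrete, here’s a minimal sketch of how pooled fixations could be rolled up into a heatmap grid. It’s an illustration only, not Poynter’s actual methodology; the grid cell size, data shapes, and function names are all assumptions of mine.

```python
# Illustrative sketch only -- not Poynter's methodology.
# Assumes fixations are (x, y, duration_ms) tuples in page coordinates,
# pooled across all participants.
from collections import defaultdict

CELL = 20  # hypothetical grid cell size, in pixels

def build_heatmap(fixations):
    """Bin each fixation's dwell time into the grid cell it falls in."""
    grid = defaultdict(float)
    for x, y, duration_ms in fixations:
        grid[(x // CELL, y // CELL)] += duration_ms
    return grid

# Example: a few fixations pooled across participants
fixations = [(120, 80, 300), (125, 85, 450), (600, 400, 200)]
heatmap = build_heatmap(fixations)
hottest = max(heatmap, key=heatmap.get)
print(hottest, heatmap[hottest])  # densest cell and its total dwell time
```

The cells that accumulate the most dwell time are, in effect, the heatspots.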

From the heatmaps, Poynter was able to observe (for this study at least) that some ads draw a user’s attention more than others. Attention was defined by two factors: the user’s eye being drawn to the ad unit and time spent viewing the ad.
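
As a rough illustration of those two factors, the sketch below computes, from hypothetical per-participant fixation logs, the share of participants whose gaze landed inside an ad unit and the average time those viewers spent on it. The data shapes, names, and coordinates are my assumptions, not how Poynter measured attention.

```python
def ad_attention(participants, ad_box):
    """participants: list of per-person fixation lists [(x, y, duration_ms), ...]
    ad_box: (left, top, right, bottom) bounds of the ad unit, in pixels."""
    left, top, right, bottom = ad_box
    viewers, total_dwell = 0, 0.0
    for fixations in participants:
        # Dwell time this person spent inside the ad's bounds
        dwell = sum(d for x, y, d in fixations
                    if left <= x <= right and top <= y <= bottom)
        if dwell > 0:
            viewers += 1
            total_dwell += dwell
    seen_pct = 100.0 * viewers / len(participants)
    avg_dwell_ms = total_dwell / viewers if viewers else 0.0
    return seen_pct, avg_dwell_ms

# Example: three participants, a 300 x 250 unit placed at (500, 200)
logs = [[(550, 300, 400)], [(100, 100, 200)], [(620, 380, 150), (90, 40, 100)]]
print(ad_attention(logs, (500, 200, 800, 450)))  # (% who saw the ad, avg dwell in ms)
```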

Did I mention the heatmaps are really cool?

A few specific observations:

  • People tune out certain ad types, going so far as to avoid them. Heatmaps indicate the 468 x 60 is all but ignored, as many of you probably suspected. This type of testing may actually prove it.
  • Borders around ads appear to keep attention out of the ad unit. Although some publishers require ad borders, these may need to disappear to make ads more effective.
  • Ads that blend into content work. Text ads work especially well. If it looks like content, people look at it.
  • Size matters. In general, larger ads performed well, but that doesn’t mean the largest ads performed best. Skyscrapers performed well, with 44 percent of the audience seeing them; 38 percent saw half-page ads.
  • In-content 300 x 250 ads performed even better than skyscrapers and half-page ads, with 56 percent seeing them.
  • Page placement plays a clear role in an ad’s effectiveness. Consistent with the typical user navigation pattern, ads in the upper left corner were seen more often.
  • Pop-ups were seen by 70 percent of those exposed to them, yet they were generally closed very quickly (within three seconds) or ignored.

The study was more anecdotal than methodical in nature, and there are plenty of points one could take issue with. Creativity’s role in ad effectiveness, for instance, is one area that requires a more disciplined testing methodology; there were conflicting observations regarding color and animation.

At my agency, creative is very important. I’d venture to say great creative plays a much more important role in campaign performance than this study suggests.

Many of the observations are fairly intuitive. Yet if we can substantiate “observations” and turn them into “findings,” we stand to become a little smarter. This kind of testing platform could really drive the media placement and creative approach we take as advertisers, agencies, and publishers.

Poynter’s study should facilitate conversation in the industry about how to use the approach to further validate interactive advertising effectiveness. Most advertisers, agencies, and publishers are armed with case studies illustrating the medium’s effectiveness from a direct response perspective.

With the help of the Interactive Advertising Bureau (IAB) and the Online Publishers Association (OPA), the industry has done a fantastic job promoting studies tracking interactive advertising’s branding effectiveness. Eyeflow tracking and heatmaps would be another great (and sexy) way to validate effectiveness.

Going one step further, it would be nice to use this same testing platform as a measure of effectiveness across multiple media. Let’s do a cross-media eyeflow/heatmap test like this one, with a branding study layered on top.

As advertisers demand increased accountability, this approach stands to become another way to prove performance and to make us smarter interactive marketers.

Check out The Poynter Institute’s observations and heatmaps. Let me know what you think.
