How Heatmaps Can Help

Poynter's eyetracking research could help validate interactive advertising.

Last December, The Poynter Institute observed nearly four dozen individuals as they navigated a number of Web sites, tracking what it calls “user eyeflow” on those sites. Poynter was primarily looking at how users interacted with news sites, but it also looked at how users interacted with ads before they clicked on (or away from) them.

What I like about Poynter’s observations is that they can help kick-start an industry conversation about another type of validation for interactive advertising. I applaud the effort and encourage you to familiarize yourself with Poynter’s observations.

The study mocked up news Web sites and article pages that included advertising ranging from the standard 468 x 60 banner to in-content 300 x 250s, skyscrapers, pop-ups, and text links. Forty-six participants navigated the sites while researchers monitored their eyeflow on the pages. The researchers then aggregated where on each page users spent time looking to create heatmaps showing the concentration of users’ attention. The heatmaps plot what Poynter calls heatspots: aggregate representations of all participants’ eye fixations on a page. Check out this heatmap. It’s really cool stuff.
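
For the technically curious, here’s a rough sketch of how fixation logs could be rolled up into a heatmap like this. The (x, y, duration) log format and the 10-pixel grid are my own illustrative assumptions, not Poynter’s actual method:

```python
# A rough sketch of turning fixation logs into a heatmap. The
# (x, y, duration_ms) log format and the 10-pixel grid are
# illustrative assumptions, not Poynter's actual method.
from collections import defaultdict

CELL = 10  # grid cell size in pixels (assumed)

def build_heatmap(fixations):
    """Pool (x, y, duration_ms) fixations from all participants into
    a grid of total viewing time per cell."""
    heat = defaultdict(float)
    for x, y, duration_ms in fixations:
        heat[(x // CELL, y // CELL)] += duration_ms
    return heat

# Fixations pooled across participants viewing the same page layout;
# the hottest cells are the "heatspots."
fixations = [(120, 80, 350), (124, 85, 420), (600, 90, 180)]
heatmap = build_heatmap(fixations)
```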

From the heatmaps, Poynter was able to observe (for this study at least) that some ads draw a user’s attention more than others. Attention was defined by two factors: the user’s eye being drawn to the ad unit and time spent viewing the ad.
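
Those two factors are easy to picture in code. Here’s a minimal sketch, again using an assumed fixation log and an assumed (left, top, width, height) rectangle for the ad unit:

```python
# A minimal sketch of the two attention factors: did the eye land on
# the ad unit at all, and how long did it stay there. The fixation
# log and the ad rectangle format are assumptions for illustration.
def ad_attention(fixations, ad_box):
    left, top, w, h = ad_box
    dwell_ms = sum(
        d for x, y, d in fixations
        if left <= x < left + w and top <= y < top + h
    )
    return dwell_ms > 0, dwell_ms  # (ad seen?, time spent viewing)

# Example: a 300 x 250 in-content unit placed at (400, 300).
seen, dwell = ad_attention([(450, 380, 500), (50, 60, 200)],
                           (400, 300, 300, 250))
```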

Did I mention the heatmaps are really cool?

A few specific observations:

  • People tune out certain ad types, going so far as to avoid them. Heatmaps indicate the 468 x 60 is all but ignored, probably as many of you suspected. This type of testing may actually prove it.
  • Borders around ads appear to keep attention out of the ad unit. Although some publishers require ad borders, these may need to disappear to make ads more effective.
  • Ads that blend into content work. Text ads work especially well. If it looks like content, people look at it.
  • Size matters. In general, larger ads performed well, though that doesn’t mean the largest ads performed best. Skyscrapers performed well, with 44 percent of the audience seeing them; 38 percent saw half-page ads.
  • In-content 300 x 250 ads performed even better than skyscrapers and half-page ads, with 56 percent seeing them.
  • Page placement plays a clear role in an ad’s effectiveness. In keeping with typical user navigation patterns, ads in the upper left corner were seen more often.
  • Pop-ups were seen by 70 percent of those exposed to them, yet they were generally either closed very quickly (within three seconds) or ignored.

The study was more anecdotal than methodical, and there are plenty of points one could take issue with. Creativity’s role in ad effectiveness, for instance, seemed to be an area requiring a more disciplined testing methodology, and there were conflicting observations regarding color and animation.

At my agency, creative is very important. I’d venture to say great creative plays a much more important role in campaign performance than this study suggests.

Many of the observations are fairly intuitive. Yet if we can substantiate “observations” and turn them into “findings,” we stand to become a little smarter. This kind of testing platform could really drive the media placement and creative approach we take as advertisers, agencies, and publishers.

Poynter’s study should facilitate conversation in the industry about how to use the approach to further validate interactive advertising effectiveness. Most advertisers, agencies, and publishers are armed with case studies illustrating the medium’s effectiveness from a direct response perspective.

With the help of the Interactive Advertising Bureau (IAB) and the Online Publishers Association (OPA), the industry has done a fantastic job promoting studies tracking interactive advertising’s branding effectiveness. Eyeflow tracking and heatmaps would be another great (and sexy) way to validate effectiveness.

Going one step further, it would be nice to use this same testing platform as a measure of effectiveness across multiple media. Let’s do a cross-media eyeflow/heatmap test like this one, with a branding study layered on top.

As advertisers demand increased accountability, this approach stands to become another way to prove performance and to make us smarter interactive marketers.

Check out The Poynter Institute’s observations and heatmaps. Let me know what you think.
