The growing culture of marketing experimentation

Marketers are turning to new methods to analyze the effectiveness of their campaigns, and they are getting more accurate results more quickly with incrementality measurement.

30-second summary:

  • The influence of data and analytics on marketing has exploded in recent years, but traditional marketing measurement tools fail to provide an accurate and complete picture of what is effective and what isn't.
  • To get that accurate picture, marketers are turning to experimentation to measure their results.
  • Experimentation and incrementality measurement provide a clearer and more accurate picture of what works and how effective it is.
  • Retailer Soft Surroundings turned to experimentation to optimize its Facebook ad budget.
  • A culture of experimentation requires an organizational commitment to constantly test assumptions and validate long-held beliefs against carefully designed tests.

The role of data, measurement and analytics in marketing has been growing in influence and impact for the past two decades. More recently, experimentation has re-emerged as an important marketing measurement tool.

This re-emergence is being driven in part by privacy concerns, which have made it increasingly difficult to collect cohesive data and apply attribution algorithms at the user level.

More broadly, it is a product of marketers' desire to actively test hypotheses in-market, using experimentation to build best-of-breed marketing programs.

Building this discipline requires only two things: (1) recognizing that any measurement or analytics program is incomplete without active experimentation and (2) understanding how to build and deploy testing against key hypotheses.

In practice, it takes commitment not to take answers from Google Analytics, vendor reporting, lift studies, media mix modeling or multi-touch attribution for granted.

Whether those methods agree or, more importantly, disagree, organizations should treat those results as a starting point for developing key questions and hypotheses for experimentation programs.

Most marketers start with the same questions

Marketers consistently wonder about the same things:

  • Are lower funnel tactics like branded search and retargeting producing incremental conversions?
  • Are there synergies between marketing vehicles?
  • Is marketing more effectively deployed against existing customers or new customer acquisition?
  • Which prospecting tactics are truly growing my new acquisitions?
  • Which customer marketing tactics add value, and do I need all of them?

These are foundational questions for any brand, and there are effective ways to build out an experimental learning agenda against them. When first building out an experimental program, start by implementing incrementality testing at the channel level for key marketing tactics.

Based on spend, strategic priority or current performance assessments, brands can prioritize designing and implementing incrementality tests for key marketing channels such as retargeting, social, paid search or direct mail/catalog.

Limitations of lower funnel attribution

Each marketing channel requires a carefully designed approach to ensure that audiences are segmented into test and control groups. Marketing must then be withheld from the control group to serve as a baseline for measuring incrementality within the test group.
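For illustration, here is a minimal Python sketch of how channel-level incrementality might be computed from such a holdout test. The conversion counts, group sizes and significance check are hypothetical stand-ins, not data from any brand mentioned here.

```python
import math

def incrementality(test_conversions, test_size, control_conversions, control_size):
    """Estimate incremental lift from a holdout experiment.

    The control group had the channel's marketing withheld, so its
    conversion rate serves as the baseline.
    """
    test_rate = test_conversions / test_size
    control_rate = control_conversions / control_size
    lift = test_rate - control_rate              # absolute incremental conversion rate
    incremental = lift * test_size               # conversions the channel actually added
    # Two-proportion z-test to check the lift isn't noise.
    pooled = (test_conversions + control_conversions) / (test_size + control_size)
    se = math.sqrt(pooled * (1 - pooled) * (1 / test_size + 1 / control_size))
    z = lift / se if se else float("nan")
    return incremental, lift / control_rate, z

# Hypothetical numbers: 2% of exposed users convert vs 1.5% of the holdout.
inc, rel_lift, z = incrementality(2_000, 100_000, 1_500, 100_000)
print(f"incremental conversions: {inc:.0f}, relative lift: {rel_lift:.1%}, z: {z:.2f}")
```

If the relative lift is small and statistically indistinguishable from zero, the channel is mostly claiming credit for conversions that would have happened anyway.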

Often these channel-level tests confirm long-held suspicions, providing the in-market data needed to reallocate budgets.

Many marketers suspect that lower funnel retargeting programs are “stealing” credit by attaching themselves to converting pathways, and incrementality testing quantifies this effect and informs optimal budget levels for these programs.

Soft Surroundings, a fashion and beauty retailer, recently deployed retargeting incrementality testing with the hypothesis that it was overspending on retargeting.

Extensive testing revealed that was indeed the case, allowing the team to immediately reduce the retargeting budget by 30% and allocate over $120,000 per month to better performing tactics without seeing any meaningful drop-off in retargeting or site conversion.

Getting a clearer picture of Facebook prospecting

Sometimes incrementality testing uncovers opportunities that other measurement approaches have not yet identified.

I recently worked with a retailer that measured Facebook advertising with a combination of Google Analytics and Facebook-reported metrics restricted to a one-day click-through and a one-day view-through window.

Incrementality testing discovered that while these methods were reasonably accurate for valuing Facebook as a whole, they significantly underreported the impact of Facebook prospecting tactics.

Active incrementality testing showed that view-through conversions not tracked by Google Analytics and outside of their one-day view-through window drove incremental and previously uncredited conversions. Based on this updated view, the company reallocated funding to prospecting.
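As a sketch of how that window effect could be quantified, the Python below counts how many conversions would be credited under different view-through windows, given per-user impression and conversion timestamps. The log structure and numbers are assumptions for illustration only.

```python
from datetime import datetime, timedelta

# Hypothetical logs: last Facebook impression and eventual conversion per user.
impressions = {"u1": datetime(2020, 9, 1, 10), "u2": datetime(2020, 9, 1, 12),
               "u3": datetime(2020, 9, 2, 9)}
conversions = {"u1": datetime(2020, 9, 1, 20),   # same day   -> credited either way
               "u2": datetime(2020, 9, 4, 15),   # 3 days out -> missed by a 1-day window
               "u3": datetime(2020, 9, 8, 11)}   # 6 days out -> missed by a 1-day window

def credited(window_days):
    """Count conversions falling within `window_days` of the last impression."""
    return sum(1 for u, conv in conversions.items()
               if u in impressions
               and timedelta(0) <= conv - impressions[u] <= timedelta(days=window_days))

for days in (1, 7):
    print(f"{days}-day view-through window credits {credited(days)} of {len(conversions)} conversions")
```

The gap between the two counts is exactly the kind of uncredited prospecting impact that incrementality testing surfaced here.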

Experimental learning doesn’t end with these channel-level experiments. Often these set the foundation by identifying opportunities for more sophisticated experimental design.

For Soft Surroundings, incrementality testing identified that Facebook prospecting was driving incremental new customers at a cost well below internal acquisition targets.

Rather than increase spend across the board and wait and see, Soft Surroundings used an experiment to explore scale at the ad set level for all Facebook prospecting ad sets performing below acquisition targets.

The scale test was designed to simulate elevated spend, frequency and audience penetration for each of these ad sets against a smaller subset of their prospecting audiences.

Over a few weeks, they scaled to 1.5x, 2x, 3x and more at the ad set level to effectively draw out the diminishing returns curve for each set.
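One way to draw out that curve from the test readings is to fit a concave response function, for example a power curve (conversions = a * spend^b with b < 1), to the results at each spend multiplier. The sketch below does this with illustrative numbers, not Soft Surroundings' actual data.

```python
import numpy as np

# Illustrative ad-set results at each tested spend multiplier (1x baseline = $10k).
spend       = np.array([10_000, 15_000, 20_000, 30_000])   # 1x, 1.5x, 2x, 3x
conversions = np.array([200, 270, 325, 410])

# Fit conversions = a * spend^b by linear regression in log space.
b, log_a = np.polyfit(np.log(spend), np.log(conversions), 1)
a = np.exp(log_a)
print(f"fitted exponent b = {b:.2f}  (b < 1 means diminishing returns)")

# Predict cost per acquisition at proposed budgets to see how efficiency degrades.
for budget in (10_000, 20_000, 40_000):
    pred = a * budget ** b
    print(f"spend ${budget:,}: ~{pred:.0f} conversions, CPA ${budget / pred:.2f}")
```

A fitted curve like this lets the team pick the largest budget increase that keeps the predicted cost per acquisition within target, rather than stepping spend up blindly.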

Based on scale testing conducted in September, Soft Surroundings was able to increase Facebook prospecting budgets by 61% in October while seeing only a 22% increase in cost per acquisition, well within targets.

Without active test design, the same budget increases may have taken many months to achieve while the team deployed smaller budget increases and waited to observe their impact on acquisition cost.

AARP’s incrementality experimentation

In another example, AARP used incrementality findings at the channel level to actively test whether branded paid search budgets would be better spent on paid social.

This radical shift was viewed with skepticism outside the marketing organization, and the team wanted to verify that its channel-level observations held as they shifted to paid social.

A carefully designed geo-based experiment confirmed the hypothesis but also uncovered additional findings. There was more upside on paid social, but it was best captured with mild (rather than major) cuts to paid search due to synergies between the channels.

Geos with mild cuts to paid search and large increases in paid social showed double-digit topline acquisition growth while geos with major cuts to paid search saw a 19% efficiency decrease in paid social channels.
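A simple way to read out a geo experiment like this is to compare acquisition growth in each treatment cell against control geos, a difference-in-differences style comparison. The sketch below uses invented cell assignments and numbers purely for illustration, not AARP's data.

```python
# Hypothetical geo-test readout: each cell pairs a paid search change with a
# paid social change; values are acquisitions before and during the test.
cells = {
    "control (no change)":              {"pre": 1_000, "post": 1_010},
    "mild search cut + big social up":  {"pre": 1_000, "post": 1_120},
    "major search cut + big social up": {"pre": 1_000, "post": 1_030},
}

base = cells["control (no change)"]
base_growth = base["post"] / base["pre"] - 1   # organic drift in untouched geos

for name, c in cells.items():
    growth = c["post"] / c["pre"] - 1
    # Difference-in-differences: treatment growth net of the control cell's drift.
    lift = growth - base_growth
    print(f"{name}: growth {growth:+.1%}, lift vs control {lift:+.1%}")
```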

The net takeaway was that there were some cost savings to be had in paid search, but the major upside opportunity was in securing incremental budgets for paid social.

There is no copy-paste formula to build out a culture of experimentation nor is there a single learning agenda that is a fit for every brand. A culture of experimentation requires an organizational commitment to constantly test assumptions and validate long-held beliefs against carefully designed tests.

An impactful learning agenda requires input from executives, marketers and data scientists and must be revisited regularly to continue driving value, but the upside is real.

Applying a test, learn, grow philosophy to marketing is the fastest way to drive change that meaningfully affects the bottom line.

Nick Stoltz is chief operating officer at Measured. He is an experienced growth hacker who creates success for growing companies, teams and technology products through a combination of leadership, strategic vision, marketing, operations, product commercialization, and business development.
