There are two possible responses to a marketing performance report - "Oh Yeah?" and "So What?" - so how should marketers react?
Any report, analysis, test, or other investigation of marketing performance exists (or should exist) to cause a response - some action taken or not taken based on the results. The set of possible actions is limitless.
For example, in response to a three-month drop in click-through rates, should the business refresh creative, minimize the number of competing calls to action in its messaging, or run more segmented tests to determine which segment is least engaged? Each of these actions, or any combination of them, would be reasonable.
Even though the possible responses are unlimited, in practice I've seen them fall into one of two types. Recognizing which type you're considering, which type is justified given the results and your current situation, and what outcome each type will likely produce will help you manage that unlimited set of options and make a clear decision on next steps.
"Oh Yeah?" - The First Type of Response
You've received the results, but they're difficult to accept.
Say that you recently ran a test to optimize for engagement, and the results are telling you that your control version outperformed your test version, indicating that your control campaign activities achieve better engagement. Your response is, "Oh yeah?" In other words, you're wondering whether the control campaign actually resulted in better engagement or whether something else is going on.
If your key metric in this test is click-through rate, the "Oh Yeah?" response would be to use a different key metric to measure engagement (such as conversion rate) and rerun the test.
Another "Oh Yeah?" response would be to hypothesize that the lower click-through rate doesn't indicate lower engagement, but rather that the test version was not properly formatted for use on mobile devices. You'll run another test making sure that both versions are equally optimized for mobile.
Yet another "Oh Yeah?" response would be to say that the differences in click-through rate do indicate that the control version generates more engagement, but that the test groups were not properly designed, so the group that received the control was biased to click-through more. You would run another test making sure to properly create the control and test groups.
An "Oh Yeah?" response is saying that more information is needed before the hypothesis can be accepted.
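Before rerunning anything, one quick way to ground an "Oh Yeah?" response is to check whether the click-through gap could simply be random noise. Here's a minimal sketch of a two-proportion z-test in Python; the send and click counts are hypothetical, purely for illustration:

```python
from math import sqrt

def ctr_z_test(clicks_a, sends_a, clicks_b, sends_b):
    """Two-proportion z-test on click-through rates.

    Returns the z statistic; |z| > 1.96 suggests the CTR
    difference is unlikely to be chance (95 percent confidence).
    """
    p_a = clicks_a / sends_a          # control CTR
    p_b = clicks_b / sends_b          # test CTR
    p_pool = (clicks_a + clicks_b) / (sends_a + sends_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / sends_a + 1 / sends_b))
    return (p_a - p_b) / se

# Hypothetical numbers: control earned 420 clicks on 10,000 sends,
# the test version 380 clicks on 10,000 sends.
z = ctr_z_test(420, 10_000, 380, 10_000)
print(round(z, 2))  # below 1.96, so the gap may just be noise
```

If the statistic falls short of the significance threshold, "Oh Yeah?" is exactly the right reaction: the data doesn't yet justify accepting either version as the winner.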
"So What?" - The Second Type of Response
You accept that the control group performed better than the test group. Now what will you do?
Let's assume the control version had just one call to action while the test version had three. If the control version generated a higher click-through rate, so what? What will you do next?
One "So What?" response would be to run another test, this time with both the control and test version containing only one call to action, but testing the subject line treatment to see which subject line approach generated more overall conversions. You've accepted that one call to action gets more engagement than three, and now you're trying to find other ways of driving performance.
Another "So What?" response would be to pit one call to action against two, five, or 10. Notice that this is distinct from the "Oh Yeah?" response; even though you are still using one version with one call to action and another version with a different number, you're not questioning the results of the original test. You've accepted those. Now you're trying something different.
Yet another "So What?" response would be to run the two versions against each other again, this time optimizing not for engagement but for disengagement. You've accepted that the control version performs better on engagement. Now you're trying to see which one generates less disengagement.
Clarifying your feelings about the results can help you decide on next steps. Do you feel you need to know more before you can accept the hypothesis? That's an "Oh Yeah?" response. Do you accept the results, and now you're looking for what to do next? That's a "So What?" response.
As one of StrongView's in-house marketing strategists, Justin Williams helps email marketers develop and implement strategic lifecycle marketing campaigns that are continually optimized to increase engagement and revenue. For the past five years, Justin has applied his expertise in email marketing, social media, web design, and other interactive marketing disciplines across a variety of industries, including retail, finance, media, and technology. In addition to founding his own consulting company, Justin has built go-to-market strategies for early-stage startups and worked with brands like Cisco, Qualcomm, and Geeknet. Justin holds a BA in cognitive science from the University of California at San Diego.