There are two possible responses to a marketing performance report - "Oh Yeah?" and "So What?" - so how should marketers react?
Any report, analysis, test, or other investigation of marketing performance exists (or should exist) to cause a response - some action taken or not taken based on the results. The set of possible actions is limitless.
For example, in response to a three-month drop in click-through rates, should the business refresh creative, minimize the number of competing calls to action in their messaging, or run more segmented tests to determine which group is the most unengaged? Each of these actions, or any combination thereof, would be reasonable.
Even though the possible responses are unlimited, in practice I've seen them fall into one of two types. Recognizing which type of response you're considering, which type is justified given the results and your current situation, and what result each type will likely produce will help you manage the unlimited set of possibilities and make a clear decision on next steps.
"Oh Yeah?" - The First Type of Response
You've received the results, but they're difficult to accept.
Say that you recently ran a test to optimize for engagement, and the results tell you that your control version outperformed your test version, indicating that your control campaign activities achieve better engagement. Your response is, "Oh yeah?" In other words, you're wondering whether the control campaign actually resulted in better engagement or whether something else is going on.
If your key metric in this test is click-through rate, the "Oh Yeah?" response would be to use a different key metric to measure engagement (such as conversion rate) and rerun the test.
Another "Oh Yeah?" response would be to hypothesize that the lower click-through rate doesn't indicate lower engagement, but rather that the test version was not properly formatted for use on mobile devices. You'll run another test making sure that both versions are equally optimized for mobile.
Yet another "Oh Yeah?" response would be to say that the differences in click-through rate do indicate that the control version generates more engagement, but that the test groups were not properly designed, so the group that received the control was biased to click-through more. You would run another test making sure to properly create the control and test groups.
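Guarding against biased groups mostly comes down to random assignment. As a minimal sketch (the helper name and email addresses are hypothetical, and this assumes a simple shuffle-and-split into two equal halves):

```python
import random

def split_audience(recipients, seed=42):
    """Randomly assign recipients to control and test groups.

    Shuffling before splitting means neither group is systematically
    biased (e.g., toward more engaged subscribers), so a difference in
    click-through rate reflects the treatment, not the group design.
    """
    rng = random.Random(seed)  # fixed seed makes the split reproducible
    shuffled = recipients[:]   # copy so the caller's list is untouched
    rng.shuffle(shuffled)
    mid = len(shuffled) // 2
    return shuffled[:mid], shuffled[mid:]

# Hypothetical audience of 1,000 subscribers
audience = [f"user{i}@example.com" for i in range(1000)]
control, test = split_audience(audience)
```

The fixed seed is a convenience for rerunning the same split; in production you would typically also stratify on known engagement segments, which this sketch omits.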
An "Oh Yeah?" response is saying that more information is needed before the hypothesis can be accepted.
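One concrete way to decide whether more information is needed is to check whether the gap in click-through rates could plausibly be noise. A minimal sketch with illustrative numbers, assuming a standard two-proportion z-test (the function name and counts are hypothetical):

```python
from math import sqrt
from statistics import NormalDist

def ctr_z_test(clicks_a, sends_a, clicks_b, sends_b):
    """Two-proportion z-test on click-through rates.

    Returns the z statistic and two-sided p-value for the null
    hypothesis that both versions have the same underlying CTR.
    """
    p_a = clicks_a / sends_a
    p_b = clicks_b / sends_b
    # Pooled CTR under the null hypothesis of no difference
    pooled = (clicks_a + clicks_b) / (sends_a + sends_b)
    se = sqrt(pooled * (1 - pooled) * (1 / sends_a + 1 / sends_b))
    z = (p_a - p_b) / se
    p_value = 2 * (1 - NormalDist().cdf(abs(z)))
    return z, p_value

# Hypothetical results: control got 220 clicks on 10,000 sends,
# test got 180 clicks on 10,000 sends.
z, p = ctr_z_test(220, 10_000, 180, 10_000)
```

A large p-value would be a legitimate "Oh Yeah?" in itself: the test simply hasn't produced enough evidence to accept either version as the winner.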
"So What?" - The Second Type of Response
You accept that the control group performed better than the test group. Now what will you do?
Let's assume the control version had just one call to action while the test version had three. If the control version generated a higher click-through rate, so what? What will you do next?
One "So What?" response would be to run another test, this time with both the control and test versions containing only one call to action, but varying the subject line to see which approach generated more overall conversions. You've accepted that one call to action gets more engagement than three, and now you're trying to find other ways of driving performance.
Another "So What?" response would be to pit one call to action against two, five, or 10. Notice that this is distinct from the "Oh Yeah?" response; even though you are still using one version with one call to action and another version with a different number, you're not questioning the results of the original test. You've accepted those. Now you're trying something different.
Yet another "So What?" response would be to run the two versions against each other again, this time optimizing not for engagement, but rather for disengagement. You've accepted that the control version performs better in engagement. Now you're trying to see which one gets less disengagement.
Clarifying your feelings about the results can help you decide on next steps. Do you feel you need to know more before you can accept the hypothesis? That's an "Oh Yeah?" response. Do you accept the results, and now you're looking for what to do next? That's a "So What?" response.
As one of StrongView's in-house marketing strategists, Justin Williams helps email marketers develop and implement strategic lifecycle marketing campaigns that are continually optimized to increase engagement and revenue. For the past five years, Justin has applied his expertise in email marketing, social media, web design, and other interactive marketing disciplines across a variety of industries, including retail, finance, media, and technology. In addition to founding his own consulting company, Justin has built go-to-market strategies for early-stage startups and worked with brands like Cisco, Qualcomm, and Geeknet. Justin holds a BA in cognitive science from the University of California at San Diego.