The next time a test is inconclusive and/or doesn't seem to make sense, do a retest.
In my last column I wrote about 15 types of tests you could do with an email campaign. Subject line, as you may recall, wasn't at the top of my list of things to test. I believe it's more frequently tested because it's perceived to be the easiest to execute.
That said, there are times when it makes sense to do a subject line test - or retest.
Earlier this year I wrote about a subject line test I did with a client. The results were inconclusive. We decided to stick with the control but added it to our schedule for a retest.
Whenever a test result is inconclusive or seems to defy logic, it's never a bad idea to tee up a retest for sometime in the future. This was the case here. You might get the same result. But you might get something different.
As promised, here's a brief overview of the retest and the latest results.
Background: we performed this test with an email newsletter; the goal is to drive traffic to a website. Since these readers are already paying subscribers, the website content is available to them at no additional charge. So we're not looking to generate new revenue from this, but we do want to drive people to use the content so that they'll be more likely to renew their subscriptions.
Once again, we scheduled the test over multiple sends to help ensure we'd get statistically significant results (it's a small group; there are only about 40,000 subscribers total). Once again we pitted the control subject line ("Subscriber Newsletter") against subject lines that included a key theme in the first 25 characters, the subject line's "prime real estate."
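For readers who want to check significance on their own tests: a common approach for comparing two subject lines' click rates is a two-proportion z-test. The sketch below uses entirely hypothetical numbers (the column doesn't publish the raw counts); it's a minimal illustration, not the analysis method the author necessarily used.

```python
import math

def two_proportion_z_test(clicks_a, sends_a, clicks_b, sends_b):
    """Two-proportion z-test comparing click rates of two subject line arms."""
    p_a = clicks_a / sends_a
    p_b = clicks_b / sends_b
    # Pooled proportion under the null hypothesis of no difference
    p_pool = (clicks_a + clicks_b) / (sends_a + sends_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / sends_a + 1 / sends_b))
    return (p_a - p_b) / se

# Hypothetical split: 20,000 sends per arm, 640 vs. 560 unique clicks
z = two_proportion_z_test(640, 20000, 560, 20000)
# |z| > 1.96 roughly corresponds to significance at the 95% confidence level
print(round(z, 2), abs(z) > 1.96)
```

With a list this size, spreading the test over multiple sends (as the column describes) is what accumulates enough clicks for the difference to clear the significance bar.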
Note that the control is static: it is, and has been for years, the same for every send. The test is a subject line formula, meaning the actual words change from issue to issue but the formula stays the same. Until this test, I'd never seen a static subject line like this outperform a themed subject line for an email newsletter.
So how'd we do?
This time we had a clear winner.
The test bested the control in all four click-through metrics (see my earlier column for descriptions and how each was calculated).
We were especially excited to see the CTOR figures, which show that the test subject line did a better job at engaging readers and enticing them to explore and click on links in the email once they opened.
Just to be safe, we'll be doing a back-test (with "Subscriber Newsletter" as the test) sometime in the future to confirm our new results.
Moral of the story: the next time a test is inconclusive and/or doesn't seem to make sense, do a retest.
Until next time,
Jeanne Jennings is a recognized expert in the email marketing industry and managing director of digital marketing for Digital Prism Advisors. She has more than 20 years of experience in the email and online marketing and product development world. Jeanne's direct-response approach to digital strategy, tactics, and creative direction helps organizations make their online marketing initiatives more effective and more profitable. Digital Prism Advisors helps established businesses unlock significant growth and revenue opportunities in the digital marketplace; our clients learn to develop and implement successful digital strategies, leveraging data and technology to better meet bottom line goals. Want to learn more? Check out Jeanne's blog and Digital Prism Advisors.