Sometimes it's difficult to figure out what to test to improve performance. But sometimes the hard part comes after the test, in determining the winner. Case in point: a recent subject line test I undertook for a client.
The publication is an email newsletter, part of a paid subscription to a website. Only paying subscribers receive the newsletter; its purpose is to make readers aware of content that might be of interest to them on the site and motivate them to click through and view it.
So the goal of this newsletter is to get people to click through; we're not looking for revenue to be generated directly from the email. There's a feeling that people who click through on a regular basis are more likely to renew their subscriptions. It's logical, although I've not seen any hard data to support it for this product. So while the goal of the newsletter isn't to generate revenue, the belief is that by getting people to engage with the newsletter we are increasing the likelihood that they will renew.
This round of testing involved the subject line. The control subject line was "Subscriber Newsletter." The formula we used to generate the test subject lines was to include a key theme unique to each issue of the newsletter in the first 25 characters of the subject line. The hypothesis was that by featuring a key theme prominently in the subject line we would increase opens and, as a result, clicks to the site.
Notice that I said "test subject lines" - plural. Because of the list size (roughly 20,000 total), we needed to test over two or three sends in order to get results that were statistically significant. Then we compiled the overall results to determine the winner of the test.
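The article doesn't say how statistical significance was checked; a common way to compare open or click rates between two subject lines is a two-proportion z-test. Here's a minimal sketch, using hypothetical numbers (not the client's data):

```python
import math

def two_proportion_z(successes_a, n_a, successes_b, n_b):
    """Z statistic for comparing two rates (e.g., open rates), pooled variance."""
    p_a, p_b = successes_a / n_a, successes_b / n_b
    p_pool = (successes_a + successes_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    return (p_a - p_b) / se

# Hypothetical example: 10,000 recipients per cell, 2,100 vs. 1,950 opens
z = two_proportion_z(2100, 10_000, 1950, 10_000)
significant = abs(z) > 1.96  # roughly 95% confidence, two-tailed
print(round(z, 2), significant)
```

With a small list, individual sends often don't clear the 1.96 bar on their own, which is why results from multiple sends get pooled before declaring a winner.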
So how'd we do? Here's where the debate begins…
The control subject line generated higher open and unique click-through rates. But the test subject line bested the control in total click-through rate as well as both unique and total click-to-open rates.
Let me quickly explain the difference between the four click-through metrics we're looking at:

- Unique click-through rate (CTR): unique clicks divided by emails delivered
- Total CTR: total clicks (counting repeat clicks by the same reader) divided by emails delivered
- Unique click-to-open rate (CTOR): unique clicks divided by opens
- Total CTOR: total clicks divided by opens
It's standard operating procedure to focus on the unique CTR and CTOR metrics, but in some instances the total CTR and CTOR should also be considered. That's the case with this email product. The management team focuses heavily on the amount of traffic driven to the site each month from the newsletter. Since each click generates a visit, the total number of clicks equals the total number of visits (visits, not unique visitors), so we watch the total metrics as well.
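The arithmetic behind these four metrics can be sketched in a few lines of Python. The numbers below are made up for illustration; they are not the client's figures:

```python
def email_metrics(delivered, opens, unique_clicks, total_clicks):
    """Return the four click metrics as percentages, using the
    common industry definitions: CTR = clicks / delivered,
    CTOR = clicks / opens."""
    return {
        "unique_ctr": 100.0 * unique_clicks / delivered,
        "total_ctr": 100.0 * total_clicks / delivered,
        "unique_ctor": 100.0 * unique_clicks / opens,
        "total_ctor": 100.0 * total_clicks / opens,
    }

# Hypothetical send: 10,000 delivered, 2,000 opens,
# 300 unique clickers producing 450 total clicks
m = email_metrics(delivered=10_000, opens=2_000,
                  unique_clicks=300, total_clicks=450)
print(m)  # unique CTR 3.0, total CTR 4.5, unique CTOR 15.0, total CTOR 22.5
```

Note how the same send can look quite different depending on the denominator - which is exactly why the two camps in this story read the results differently.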
So, which subject line - control or test - would you call the winner? Let's run down what the metrics are telling us:

- The control won on open rate and unique CTR.
- The test won on total CTR, unique CTOR, and total CTOR.
One more data point, looking at raw numbers rather than rates: we mailed slightly fewer of the emails with the test subject lines (-1 percent), but those versions generated slightly more total clicks (+2 percent) than the control.
So, which would you say won: the control or the test subject line?
It comes down to a key question: are you looking to maximize website visits or unique website visitors? Do we want slightly more people, even if they are less engaged (click less), or slightly fewer people who are much more engaged (click more)?
For those within the business unit focused on maximizing website traffic, the raw data on total clicks was their key indicator - more total clicks even though fewer emails were sent. They saw the test subject line as the winner.
For those on the team focused on maximizing unique website visitors, the unique CTR told the story; they sided with the control as the winner.
So what would you do?
In the end, we stuck with the control subject line - and are now making plans for another round of subject line testing to see if we can get it all: more people opening and generating more unique and total clicks, from a subject line that is more descriptive than just "Subscriber Newsletter."
I'll keep you posted.
Until next time,
Jeanne Jennings is a 20-year veteran of the online/email marketing industry, having started her career with CompuServe in the late 1980s. As Vice President of Global Strategic Services for Alchemy Worx, Jennings helps organizations become more effective and more profitable online. Previously Jennings ran her own email marketing consultancy with a focus on strategy; clients included AARP, Hasbro, Scholastic, Verizon and Weight Watchers International. Want to learn more? Check out her blog.