Case Study: Subject Line Testing, but Which One Won?

Sometimes it’s difficult to figure out what to test to improve performance. But sometimes the hard part comes after the test, in determining the winner. Case in point: a recent subject line test I undertook for a client.

The publication is an email newsletter, part of a paid subscription to a website. Only paying subscribers receive the newsletter; its purpose is to make readers aware of content that might be of interest to them on the site and motivate them to click through and view it.

So the goal of this newsletter is to get people to click through; we’re not looking for revenue to be generated directly from the email. There’s a feeling that people who click through on a regular basis are more likely to renew their subscriptions. It’s logical, although I’ve not seen any hard data to support it for this product. So while the goal of the newsletter isn’t to generate revenue, the belief is that by getting people to engage with the newsletter we are increasing the likelihood that they will renew.

This round of testing involved the subject line. The control subject line was “Subscriber Newsletter.” The formula we used to generate the test subject lines was to include a key theme unique to each issue of the newsletter in the first 25 characters of the subject line. The hypothesis is that by featuring a key theme prominently in the subject line we’ll increase opens and, as a result, clicks to the site.
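
To make the formula concrete, here's a minimal sketch of how a themed subject line might be assembled and checked against the 25-character rule. The theme and the subject line format are hypothetical and not taken from the actual test:

```python
def themed_subject(theme: str, limit: int = 25) -> str:
    """Lead with the issue's key theme; the theme must fit
    within the first `limit` characters of the subject line."""
    if len(theme) > limit:
        raise ValueError(f"theme is {len(theme)} characters; it must fit in the first {limit}")
    return f"{theme} | Subscriber Newsletter"

# Hypothetical issue theme, not one used in the test:
print(themed_subject("New Tax Rules for 2013"))  # 22 characters, so it fits the formula
```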

Notice that I said “test subject lines,” with “lines” being plural. Because of the list size (roughly 20,000 total), we need to test over two or three sends to get results that are statistically significant. Then we compile the overall results to determine the winner of the test.
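
The post doesn't say how statistical significance is determined; as one possible approach, a simple two-proportion z-test on open (or click) counts illustrates why a list of roughly 20,000, split between a control and a test cell, often needs results pooled across several sends. All the counts below are invented for illustration:

```python
from math import sqrt

def two_proportion_z(successes_a, n_a, successes_b, n_b):
    """z statistic for the difference between two rates, e.g. the
    open rates of the test and control subject lines."""
    p_a, p_b = successes_a / n_a, successes_b / n_b
    p_pool = (successes_a + successes_b) / (n_a + n_b)  # pooled rate under the null
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    return (p_a - p_b) / se

# One send, ~10,000 emails per cell, 31% vs. 30% open rate:
print(two_proportion_z(3100, 10_000, 3000, 10_000))  # ~1.5, short of the usual 1.96 cutoff

# The same gap pooled over three sends of the same size clears the bar:
print(two_proportion_z(9300, 30_000, 9000, 30_000))  # ~2.7
```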

So how’d we do? Here’s where the debate begins…

[Chart: open, click-through, and click-to-open rates for the control and test subject lines]

The control subject line generated higher open and unique click-through rates. But the test subject line bested the control in total click-through rate as well as both unique and total click-to-open rates.

Let me quickly explain the difference between the four click-through metrics we’re looking at:

  • Unique click-through rate (unique CTR). Unique means that a maximum of one click per email is counted. So if a person clicks on three links in the email, she is still counted as just one click. Unique clicks are divided by the quantity of email assumed delivered (that is, the quantity sent minus bounces). This is the most common click-through metric used to analyze performance.
  • Total click-through rate (total CTR). Total means that all clicks are included in the calculation, so the person who clicked on three links would be counted as three clicks. Once again, the total number of clicks is divided by the quantity of email assumed delivered.
  • Unique click-to-open rate (unique CTOR). Here the clicks are divided by the number of opens rather than the quantity of email assumed delivered. As above, the unique CTOR includes only one click per email; it’s also based on a maximum of one open per email. This is the second most common click-through metric used to analyze email performance.
  • Total click-to-open rate (total CTOR). Calculated just as the unique CTOR is, except that total clicks (so potentially more than one click per email) are used. Opens remain one per email.
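
To make those four definitions concrete, here's a minimal sketch of the calculations. The counts are invented for illustration and are not the client's numbers:

```python
def email_metrics(sent, bounces, unique_opens, unique_clicks, total_clicks):
    """Compute the four click metrics described above from raw counts."""
    delivered = sent - bounces                      # quantity assumed delivered
    return {
        "unique CTR":  unique_clicks / delivered,   # at most one click per email
        "total CTR":   total_clicks / delivered,    # every click counts
        "unique CTOR": unique_clicks / unique_opens,
        "total CTOR":  total_clicks / unique_opens, # opens stay at one per email
    }

# Hypothetical send of 10,000 emails:
print(email_metrics(sent=10_000, bounces=200,
                    unique_opens=3_000, unique_clicks=450, total_clicks=700))
# unique CTR ≈ 4.6%, total CTR ≈ 7.1%, unique CTOR = 15.0%, total CTOR ≈ 23.3%
```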

It’s standard operating procedure to focus on the unique CTR and CTOR metrics, but in some instances the total CTR and CTOR should also be considered. That’s the case with this email product. The management team focuses heavily on the amount of traffic driven to the site each month from the newsletter. Every click is a visit, so the total number of clicks equals the total number of visits (visits, not unique visitors), and therefore we look at these metrics as well.

So which subject line, the control or the test, would you call the winner? Let’s run down what the metrics are telling us:

  • The control subject line caused more people to open the email (open rate).
  • The control subject line caused more people to click on at least one link in the email (unique CTR).
  • The test subject line caused people to click on more links in the email than the control (total CTR), suggesting they explored the email more fully because of the themed subject line.
  • Of those who opened the email based on the test subject line, more clicked on at least one link in the email (unique CTOR), suggesting that they were more intrigued by the content and more interested in engaging with it.
  • Of those who opened the email based on the test subject line, the average number of links clicked per open was higher than for the control (total CTOR), suggesting that they spent more time with the content and engaged with it more deeply.

One more data point, looking at raw numbers, not metrics: we mailed slightly fewer of the emails with the test subject lines (-1 percent), but those versions generated slightly more total clicks (+2 percent) than the control.
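
Those two raw deltas line up with the total CTR result above; a quick back-of-the-envelope calculation (using only the rounded percentages quoted here, not the actual counts) shows why fewer sends plus more total clicks means a higher total CTR for the test:

```python
sent_ratio = 0.99    # test cells received roughly 1% fewer emails than the control
clicks_ratio = 1.02  # ...but generated roughly 2% more total clicks
ctr_lift = clicks_ratio / sent_ratio - 1
print(f"{ctr_lift:.1%}")  # ~3.0% higher total CTR for the test subject lines
```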

So, which would you say won: the control or the test subject line?

It comes down to a key question: are you looking to maximize website visits or unique website visitors? Do we want slightly more people, even if they are less engaged (click less), or slightly fewer people who are much more engaged (click more)?

For those within the business unit focused on maximizing website traffic, the raw data on total clicks was their key indicator – more total clicks even though fewer emails were sent. They saw the test subject line as the winner.

For those on the team focused on maximizing unique website visitors, the unique CTR told the story; they sided with the control as the winner.

So what would you do?

In the end, we stuck with the control subject line – and are now making plans for another round of subject line testing to see if we can get it all: more people opening and generating more unique and total clicks, from a subject line that is more descriptive than just “Subscriber Newsletter.”

I’ll keep you posted.

Until next time,

Jeanne
