Lather, Rinse, and Repeat: Learning From Email Marketing Testing

I once worked with a marketer who believed that testing was a waste of time. “What could we possibly do differently when we learn that test cell A responded to a higher offer than test cell B?!” he would argue. I can hear you all now. Some of you are groaning: of course testing is hard, and if it isn’t done well (it is a sophisticated task), you won’t get much out of it, the effort is wasted, and you’re left with the perception that testing is worthless. Others may be cheering: someone finally said it out loud! Is it true in your experience that most email testing results are not that helpful?

I believe in testing and optimization, and integrated marketing technology makes it possible for all of us to be more successful in our efforts. Over the past few years, a large European retailer tried some basic approaches and, while the road was quite twisted and filled with potholes, they did learn and improve their program. So perhaps there is some hope for the wary, and weary, among us.

This retailer, like all of us, wanted to generate more revenue with email marketing. They knew there were two ways to do this: grow the file and improve response rates. At first, they thought they could simply buy a list and start broadcasting email messages. They tested that on 5,000 records and found the response rate was less than a third of a percent (0.28 percent), while the spam complaint rate was higher than 4 percent, landing them on two blocklists and dropping their inbox placement rate to 45 percent. (A spam complaint rate of half a percent (0.5 percent) will start to get you blocked at major mailbox providers all over the world.) So that wasn’t very successful. And it took six months to repair the sender reputation and get removed from the blocklists.
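To make those percentages concrete, here is the arithmetic implied by the figures in the paragraph above (no additional data, just the rates applied to the 5,000-record test):

```python
# Rates reported for the purchased-list test above, applied to the test size.
list_size = 5_000
response_rate = 0.0028    # 0.28 percent
complaint_rate = 0.04     # "higher than 4 percent"; 4 percent used as the floor

responders = round(list_size * response_rate)    # about 14 responders
complainers = round(list_size * complaint_rate)  # about 200 spam complaints

# Roughly 14 people complained for every one who responded, which is
# why the sender reputation collapsed so quickly.
complaints_per_response = complainers / responders
```

Seen this way, the purchased list produced about fourteen complaints for every response, so the blocklistings were not bad luck; they were the predictable outcome of the math.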

Next, they tried to encourage existing customers to buy more. They tested doubling frequency for half the audience, sending promotions four times a week. Revenues popped (up 22 percent!), but spam complaints also climbed (up to 0.5 percent). They quickly stopped, because the sender reputation risk was too high.

Running out of ideas, they tried to encourage customers to forward email promotions to other people. They offered a 2:1 coupon to a test cell that allowed two people (you and a friend) to save up to 50 percent each on a spring outfit. The concept was well received, and lots of people invited friends to take it up with them. That specific promotion did well and they were pleased. However, the retailer automatically added all of the “friends” to the email file and started to send them two promotions a week, even if the friend had not made a purchase or actively opted in to email from this brand. Not only did spam complaints go up again, but a few long-time customers sent flame (angry) messages to customer service and even to the president of the company. The brand damage caused the email marketing team real embarrassment and internal challenges.

Embracing those lessons, the retailer now takes a much more customer-centric approach. They honor permission carefully, take time to customize content, and constantly test things like subject lines in order to learn about interests. Here are some strategies they optimize now:

  • Nurture the active segment. Identify those people who enjoy daily promotions and do not complain. This has boosted revenues from a core group of loyal enthusiasts. It’s become a testing playground, where offer tests run on an active audience yield very actionable results.
  • Respect the inactive segment. Similarly, scale back frequency for subscribers who are not active (no opens, clicks, or conversions in the past six months). These low-engagement customers now get fewer email messages, but are sent “win back” campaigns every other month, each with special offers designed to spark an action. The win back campaigns re-engage about 2 percent of this file every time. When someone in this cell becomes “active,” they move into the other audience profile for a different messaging stream.
  • Go where the people are. Use social media to invite opt-in email subscription requests. Tracking these subscribers to see if they respond differently has resulted in an audience profile of active “influencers.” These are people who share email offers and talk about their purchases in social media. Encouraging and rewarding that behavior improves the connection with this group and provides new reach.
  • Ask everywhere, prominently. They test the email sign-up form on the website, varying the offers and moving it around the pages. In one test, by creating an interactive experience with pop-ups and custom landing pages, they reduced the time from email sign-up to first purchase by 21 days.
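The active/inactive segmentation rules in the first two bullets can be sketched in code. The retailer’s actual implementation isn’t described in the article, so the function and field names below are hypothetical; only the six-month engagement window and the two-stream outcome come from the text.

```python
from datetime import datetime, timedelta

# "No opens, clicks, or conversions in the past six months" marks a
# subscriber as inactive; six months approximated as 180 days.
INACTIVITY_WINDOW = timedelta(days=180)

def classify_subscriber(last_open, last_click, last_conversion, now=None):
    """Return 'active' or 'inactive' for a subscriber.

    Each argument is the datetime of the subscriber's most recent event
    of that type, or None if it has never happened. Hypothetical helper;
    the retailer's real rules may differ.
    """
    now = now or datetime.utcnow()
    events = [e for e in (last_open, last_click, last_conversion) if e is not None]
    if events and now - max(events) <= INACTIVITY_WINDOW:
        return "active"    # full promotional stream, offer-testing playground
    return "inactive"      # reduced frequency plus bimonthly win-back campaigns
```

When a win-back campaign re-engages someone (about 2 percent of the inactive file per send, per the article), re-running this check moves them back into the active messaging stream.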

All this innovation and improved revenue from an “old, staid” channel like email? Yep. A great example of how a test-driven optimization mindset makes all the difference.

What have you been testing? Can you apply some of these ideas for higher response and revenues? Let us know in the comments section below.
