
Testing: Lost Revenue That Is Priceless

April 5, 2010

Testing may seem pointless at times. But it's a long-term investment in your e-mail marketing program.

"The problem with testing is that I lose revenue. Whichever group received the message that lost the test generates less revenue than the group that received the message which won the test. So every time I test, I am not maximizing ROI."

So said an attendee at a half-day workshop I recently led on e-mail marketing. She isn't alone. Many marketers are under a lot of pressure to maximize sales in this tough economy. And when you test, you'll have a winner -- and a loser, with lagging performance.

She's obviously experienced something I've seen with clients. Upper management looks at the results of a test and, rather than being excited about the lift generated by the winning segment, gets frustrated by the additional money, traffic, or other measure of success not garnered from the losing group.

There's a feeling that everyone "knew" which version (usually the test) was better. And as such, why did we bother to test? What a waste of time -- sending the more effective (test) version to the entire list would have maximized revenue.

But here's the thing: you never really know what's going to work before you test it with a live audience. Hindsight is 20/20. While 90 percent or more of the tests I do for clients result in a lift in ROI, there are some, here and there, that don't. Nobody's perfect.

Some of those failed tests were things I thought, based on past experience, would improve results, but didn't. A few were things I felt certain would increase performance, because I'd had success using them with other clients. But for whatever reason, they just didn't work.

Some of the tests that worked amazingly well were things that I expected would boost response just a bit. Sometimes the simplest things can produce the greatest results -- there are times when I've projected a 3 percent lift and seen a 100 percent increase in response.

But the better the winning version does, the worse the losing version looks and the more money you've "left on the table." It's a double-edged sword.

Full disclosure: testing is one of my favorite things. I'm a numbers person; I like to have quantifiable proof that something worked. These figures play well in presentations, like the workshop I led in Las Vegas, as well as in proposals to attract new clients.

But it's more than that -- a well-implemented A/B split or multivariate test provides solid support that the change being made is the right one. It also weeds out external factors that could inaccurately influence your results from one send to another. Deliverability issues are the big one here (if one send has higher deliverability than the other, it can impact results), but even things like the day of the week and time of day can skew results.
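At its core, an A/B split is just a random partition of your list, which is what keeps those external factors balanced between the two groups. A minimal sketch of that idea (the function name and example addresses are illustrative, not from this column or any particular e-mail platform):

```python
import random

def ab_split(subscribers, seed=42):
    """Randomly split a subscriber list into two equal-size test groups.

    A random split (rather than, say, alphabetical or by signup date)
    helps balance engagement level, domain mix, and tenure across the
    groups, so the creative being tested is the main variable.
    """
    shuffled = subscribers[:]  # copy so the original list is untouched
    random.Random(seed).shuffle(shuffled)
    midpoint = len(shuffled) // 2
    return shuffled[:midpoint], shuffled[midpoint:]

group_a, group_b = ab_split([f"user{i}@example.com" for i in range(10)])
```

Because both groups are drawn at random from the same list and mailed at the same time, differences in response can be attributed to the versions themselves rather than to deliverability or timing.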

As the holidays approach, timing becomes even more critical. People's e-mail habits change around the holidays: they may be in the office less (B2B) and checking e-mail either more or less frequently (B2C) -- so e-mail results during this period may be better or worse than during more "normal" times.

If you send a "new" version of an e-mail during this time, your results will be influenced by the holiday -- and the lift or drop you see compared to previous sends may not hold up in the future. This makes A/B split testing even more important this time of year.

The key with testing is to look at it as a long-term investment in your e-mail marketing program. It's not just about the additional monies that were generated from the winning version with the single test send -- it's about the future. It's about the ongoing lift in revenue that will be recognized as the winning version becomes the control and is rolled out to the entire list.

It's also about gathering good data on what has the most impact on your list. Gathering statistically significant metrics from A/B split tests can give you a feel for which types of testing yield the best results. Then you can use this information to build a testing plan for 2010 and beyond.
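One common way to check whether an A/B result is statistically significant -- not the only method, and not one the column prescribes -- is a two-proportion z-test on the response rates. A sketch, with illustrative numbers:

```python
import math

def ab_z_score(responses_a, sent_a, responses_b, sent_b):
    """Two-proportion z-test for an A/B e-mail split.

    Returns the z-score for the difference in response rates;
    |z| > 1.96 corresponds to roughly 95 percent confidence that
    the difference is not just chance.
    """
    rate_a = responses_a / sent_a
    rate_b = responses_b / sent_b
    pooled = (responses_a + responses_b) / (sent_a + sent_b)
    std_err = math.sqrt(pooled * (1 - pooled) * (1 / sent_a + 1 / sent_b))
    return (rate_b - rate_a) / std_err

# Hypothetical test: 120 responses from 5,000 (control) vs. 165 from 5,000 (test)
z = ab_z_score(responses_a=120, sent_a=5000, responses_b=165, sent_b=5000)
significant = abs(z) > 1.96
```

Running the numbers before declaring a winner is what turns a single send into "good data": a lift that clears the significance bar is one you can expect to hold up when the winning version becomes the control.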

Until next time,



Jeanne Jennings

Jeanne Jennings is a recognized expert in the email marketing industry and managing director of digital marketing for Digital Prism Advisors. She has more than 20 years of experience in the email and online marketing and product development world. Jeanne's direct-response approach to digital strategy, tactics, and creative direction helps organizations make their online marketing initiatives more effective and more profitable. Digital Prism Advisors helps established businesses unlock significant growth and revenue opportunities in the digital marketplace; our clients learn to develop and implement successful digital strategies, leveraging data and technology to better meet bottom line goals. Want to learn more? Check out Jeanne's blog and Digital Prism Advisors.




