Testing may seem pointless at times. But it's a long-term investment in your e-mail marketing program.
"The problem with testing is that I lose revenue. Whichever group received the message that lost the test generates less revenue than the group that received the message which won the test. So every time I test, I am not maximizing ROI."
So said an attendee at a half-day workshop I recently led on e-mail marketing. She isn't alone. Many marketers are under a lot of pressure to maximize sales in this tough economy. And when you test, you'll have a winner -- and a loser, with lagging performance.
She's obviously experienced something I've seen with clients. Upper management looks at the results of a test and, rather than being excited about the lift generated by the winning segment, gets frustrated by the additional money, traffic, or other measure of success not garnered from the losing group.
There's a feeling that everyone "knew" which version (usually the test) was better. And as such, why did we bother to test? What a waste of time -- sending the more effective (test) version to the entire list would have maximized revenue.
But here's the thing: you never really know what's going to work before you test it with a live audience. Hindsight is 20/20. While 90 percent or more of the tests I do for clients result in a lift in ROI, there are some, here and there, that don't. Nobody's perfect.
Some of those failed tests were things I thought, based on past experience, would improve results, but didn't. A few were things I felt certain would increase performance, because I'd had success using them with other clients. But for whatever reason, they just didn't work.
Some of the tests that worked amazingly well were things that I expected would boost response just a bit. Sometimes the simplest things can produce the greatest results -- there are times when I've projected a 3 percent lift and seen a 100 percent increase in response.
But the better the winning version does, the worse the losing version looks and the more money you've "left on the table." It's a double-edged sword.
Full disclosure: testing is one of my favorite things. I'm a numbers person; I like to have quantifiable proof that something worked. These figures play well in presentations, like the workshop I led in Las Vegas, as well as in proposals to attract new clients.
But it's more than that -- a well-implemented A/B split or multivariate test provides solid support that the change being made is the right one. It also weeds out external factors that could inaccurately influence your results from one send to another. Deliverability issues are the big one here (if one send has higher deliverability than the other it can impact results), but even things like the day of the week and time of day can skew results.
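The "weeding out external factors" part comes down to random assignment: if each recipient has an equal chance of landing in either group, then day-of-week, time-of-day, and deliverability effects hit both groups alike. A minimal sketch of that split (the function name and fixed seed are my own illustration, not any particular ESP's feature):

```python
import random

def ab_split(subscribers, seed=42):
    """Randomly split a subscriber list into two comparable halves.

    Because assignment is random, external factors -- deliverability,
    day of week, time of day -- affect both groups equally, so any
    difference in response can be attributed to the version sent.
    """
    pool = list(subscribers)
    random.Random(seed).shuffle(pool)  # fixed seed makes the split repeatable
    mid = len(pool) // 2
    return pool[:mid], pool[mid:]

# e.g. split a 10,000-name list into two groups of 5,000
group_a, group_b = ab_split(range(10000))
```

The fixed seed is just so the same list always splits the same way for auditing; in production you'd typically let the split vary per campaign.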
As the holidays approach, timing becomes even more critical. E-mail habits change around the holidays: B2B recipients may be in the office less, and B2C recipients may be checking e-mail more or less frequently -- so e-mail results during this period may be better or worse than during more "normal" times.
If you send a "new" version of an e-mail during this time, your results will be influenced by the holiday -- and the lift or drop you see compared to previous sends may not hold up in the future. This makes A/B split testing even more important this time of year.
The key with testing is to look at it as a long-term investment in your e-mail marketing program. It's not just about the additional monies that were generated from the winning version with the single test send -- it's about the future. It's about the ongoing lift in revenue that will be recognized as the winning version becomes the control and is rolled out to the entire list.
It's also about gathering good data on what has the most impact on your list. Gathering statistically significant metrics from A/B split tests can give you a feel for which types of testing yield the best results. Then you can use this information to build a testing plan for 2010 and beyond.
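"Statistically significant" here means the lift is big enough, relative to the sample sizes, that it's unlikely to be random noise. One common way to check that for conversion rates is a two-proportion z-test; here's a minimal sketch (the function name and the example numbers are mine, purely for illustration):

```python
from math import sqrt, erf

def ab_significance(conv_a, n_a, conv_b, n_b):
    """Two-proportion z-test: is version B's lift real or just noise?

    Returns the z-score and two-tailed p-value; a p-value below 0.05
    is the conventional bar for statistical significance.
    """
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)                 # pooled rate
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))   # standard error
    z = (p_b - p_a) / se
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))    # two-tailed
    return z, p_value

# hypothetical example: control converts 200 of 5,000 sends,
# test converts 260 of 5,000 sends
z, p = ab_significance(200, 5000, 260, 5000)
```

If the p-value clears your bar, you can roll the winner out as the new control with some confidence; if it doesn't, the honest answer is "inconclusive -- test again with a bigger sample."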
Until next time,
Jeanne Jennings is a 20-year veteran of the online/email marketing industry, having started her career with CompuServe in the late 1980s. As Vice President of Global Strategic Services for Alchemy Worx, Jennings helps organizations become more effective and more profitable online. Previously Jennings ran her own email marketing consultancy with a focus on strategy; clients included AARP, Hasbro, Scholastic, Verizon and Weight Watchers International. Want to learn more? Check out her blog.