Case studies can be very dangerous things.
Last week Brad, a reader of this column with whom I've become friendly, asked me for some advice. He'd run a test campaign that produced a huge discrepancy he couldn't explain. His conversion rate, normally over 4.6 percent, had come in at just 0.47 percent, costing him roughly 90 percent of his sales. Understandably, he was distressed and wanted to know why. Together, we discovered the variable he'd overlooked, and in his most recent test the conversion rate is back to normal.
Brad was so excited he insisted I share this case study with all of you. I hesitated to do so but promised him I'd share the data with you and let you reach your own conclusions. My concern is that what's valid for Brad's business isn't valid for everyone's. Brad's product has a target market with very particular characteristics. Plus, his marketing is response driven, not branding driven. The buying process for his products, while not entirely unique, isn't universally applicable, either.
Case studies can be very informative, but they can also do a lot of damage. Be very careful what you copy. Many companies try to emulate more successful competitors without really knowing what they're copying, because they're unaware of all the variables. What you need to do is uncover the key factors that influence your target audience to buy your particular products, then utilize this information to help you refine your sales process.
We constantly get requests for averages and benchmarks on conversion rates. We always advise against comparing your own conversion rates with "norms" or "averages." These are drawn from sites that don't have the same traffic or product as yours, and they may differ in any number of ways. We've identified over 1,100 variables in our own work alone. This is why I cringe when I hear the phrase "best practices" tossed around as if it were gospel.
To illustrate my point, I called a friend I knew could further confuse the issue. I told Sam Decker, senior manager of Dell Consumer eBusiness, what Brad did. Dell is one of the top e-commerce sites and is known for its innovative approach to measuring, testing, and optimizing. Sam is responsible for sales on the consumer Web site, so if anybody would know what to learn from Brad's case study, it would be Sam, right? Interestingly enough, Dell's own case studies prove that making Brad's "mistake" would actually help Dell's online sales.
How can it be that two case studies contradict each other so blatantly? The answer is that no business is linear. There are many facets to consider in designing an effective online strategy to maximize your conversion rate. Your conversion rate is only a reflection of your marketing and sales effectiveness and your customers' satisfaction. It depends! It always depends! If you're looking for one canned, simple solution, you're bound to be either bankrupt or very disappointed. I know I've just aggravated a bunch of binary thinkers, but the world is mostly gray; it seldom displays true black or white.
I'm going to make this simpler for you than it was for me. I've given you two screen shots: Test A and Test B. I'd like you to let me know what the offending variable is. (And, yes, there is more than one change on the page. Can you find them all?)
Here's a quick refresher on making improvements to increase conversion rates. There are thousands of improvements, big and small, you can make to improve your conversion rate. Nevertheless, you should focus on changing one thing at a time. If you change too many things at once, you may see a net increase in sales, yet a change with a negative impact may have diluted a change with a positive impact. Make one change, confirm it gives you an X percent increase in sales, then make another change. If you start losing sales after the second change, just undo it and try something else, always moving your conversion rate up.
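Before crediting (or blaming) a single change, it helps to check that the difference in conversion rate is larger than random noise would produce. Here's a minimal sketch of a standard two-proportion z-test in Python; the visitor counts are hypothetical, invented only to illustrate numbers on the scale of Brad's 4.6 percent versus 0.47 percent rates:

```python
import math

def conversion_test(visitors_a, sales_a, visitors_b, sales_b):
    """Two-proportion z-test: did the one change we made actually shift
    the conversion rate, or could the difference be chance?"""
    p_a = sales_a / visitors_a
    p_b = sales_b / visitors_b
    # Pooled conversion rate under the null hypothesis of no difference
    pooled = (sales_a + sales_b) / (visitors_a + visitors_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / visitors_a + 1 / visitors_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the standard normal distribution
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return p_a, p_b, z, p_value

# Hypothetical traffic: 10,000 visitors per version, rates like Brad's
p_a, p_b, z, p = conversion_test(10_000, 460, 10_000, 47)
print(f"A: {p_a:.2%}  B: {p_b:.2%}  z = {z:.1f}  p = {p:.4f}")
```

A drop that large is significant at any sample size you're likely to run, but for the small single-variable changes this column recommends, a quick check like this tells you whether to keep a change or undo it.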
Bryan Eisenberg is coauthor of the Wall Street Journal, Amazon, BusinessWeek, and New York Times bestselling books "Call to Action," "Waiting For Your Cat to Bark?," and "Always Be Testing." Bryan is a professional marketing speaker and has keynoted conferences globally such as SES, Shop.org, Direct Marketing Association, MarketingSherpa, Econsultancy, Webcom, SEM Konferansen Norway, the Canadian Marketing Association, and others. In 2010, Bryan was named a winner of the Direct Marketing Educational Foundation's Rising Stars Awards, which recognizes the most talented professionals 40 years of age or younger in the field of direct/interactive marketing. He is also cofounder and chairman emeritus of the Web Analytics Association. Bryan serves as an advisory board member of SES Conference & Expo, the eMetrics Marketing Optimization Summit, and several venture capital backed companies. He works with his coauthor and brother Jeffrey Eisenberg. You can find them at BryanEisenberg.com.