How to Improve A/B Testing

April 29, 2005

The science of A/B testing.

A/B testing is a proven means of increasing conversion rates, but it's not as simple as it appears. It's more than simply testing two or more versions of Web pages, banners, search ads, or whatever persuasive element you can imagine. It also has limitations.

A/B testing, unlike the intuitive creative process, must be treated like a true scientific experiment. Many clients come to us frustrated by the results of their tests. Almost without fail, we discover their tests don't conform to sound experimental design. Most just heave two versions of an element at the wall to see what sticks.

Unless you're one of those learn-the-hard-way types, benefit from our experience. Here's what we've learned from seven years of trial and error.

One Change at a Time

When beginning an A/B test, establish a control and baseline for whatever you're testing (e.g., Web page, landing page, banner ad). Put your best creative effort online first. Monitor its performance over an established period.

Once a baseline is established, start optimizing the page by making one change at a time. Changing more than one variable makes it impossible to determine which change actually made the difference.
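
To make "monitor its performance" concrete: one common way to judge whether a single change actually beat the baseline is a two-proportion z-test on the observed conversion counts. This is a minimal sketch, and the visitor and conversion numbers below are hypothetical:

    from math import sqrt

    def two_proportion_z(conv_a, n_a, conv_b, n_b):
        """Z-statistic for the difference between two conversion rates."""
        p_a, p_b = conv_a / n_a, conv_b / n_b
        pooled = (conv_a + conv_b) / (n_a + n_b)  # pooled rate under the null
        se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
        return (p_b - p_a) / se

    # Hypothetical counts: control (A) vs. a one-change variant (B).
    z = two_proportion_z(conv_a=150, n_a=5000, conv_b=190, n_b=5000)
    print(f"z = {z:.2f}")  # |z| > 1.96 is significant at the 95 percent level

If the statistic doesn't clear the significance bar, the honest conclusion is "no detectable difference yet," not "the variant won."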

A client wanted to change the way the box for their lead-generation form appeared. They thought they were testing just the form box. I pointed out that changing the background color, headline, copy, and box position amounted to multiple variables, and should therefore be tested over multiple rounds.

This is a limitation of A/B testing; conversion improvements can only be made incrementally.

Clicks Don't Exist in a Vacuum

Customers participate in scenarios on your site or in your marketing efforts whether you plan them or not. Each page or touch point exists in the context of what visitors saw on a previous page. When conducting an A/B test, consider all scenario elements. Often, a change to another element affects test page results.

A page with high exits or a call-to-action form with high abandonment may not actually be problems. Consider the page visitors viewed prior to the test page. How many of us have clicked a "Free iPod" banner, only to find a landing page detailing the painful process of getting a free iPod? We could optimize and A/B test the landing page ad infinitum, but the real problem is the misleading banner.

Recently, to increase an email open rate, we tested a new "free offer" subject line. The email body remained the same. Though the open rate went up, conversion rate and total conversions dropped. Instead of returning to the original subject line, we modified the email copy. Test copy conversion strongly outperformed the original.

The new subject line changed the email's context. Until we changed the body copy to echo the subject line, we were losing momentum. Users in the first test of the new subject line were simply not seeing the free offer in the original body copy.
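
The arithmetic behind that result is worth spelling out: total conversions are the product of every step in the funnel, so a higher open rate can still lose if the post-open conversion rate falls further. A quick sketch with hypothetical numbers:

    # Hypothetical funnel for the email test described above.
    recipients = 100_000

    def total_conversions(open_rate, post_open_conv_rate):
        return recipients * open_rate * post_open_conv_rate

    old = total_conversions(open_rate=0.15, post_open_conv_rate=0.040)  # 600
    new = total_conversions(open_rate=0.22, post_open_conv_rate=0.020)  # 440

    print(int(old), int(new))  # the "better" subject line converts fewer people

The new subject line won its step of the funnel and lost the scenario, which is exactly why the body copy had to change too.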

In complex selling scenarios, many sites successfully funnel high volumes of traffic to a call-to-action form, yet test different versions of the form with little success. Usually, this indicates prospects don't have enough information or resolve to complete the form. The problem isn't the form; it's the scenario visitors participate in.

Small Changes, Big Differences

Never assume a seemingly inconspicuous change will bring inconspicuous results. The day of the week, a font, or a color-palette shift can dramatically affect test results. Just moving a few buttons around on a checkout page can mean steep drops in conversions.

There are really only 20 design principles that can affect conversion. Don't take any of them lightly.

Testing Two Very Different Creatives

Many marketers conduct an A/B test on two significantly different offers or two significantly different creative efforts.

If effort A returns a conversion rate of 3.0 percent and effort B returns a conversion rate of 2.5 percent, many will abandon effort B for the seemingly more efficient A. This could be a colossal mistake.

That's not a true A/B test. There's no way to determine whether an optimized effort B would outperform an optimized effort A. Think of them as two different tests rather than one.

Often, we've found the two efforts speak effectively to two very different audience segments. Both may be worthy of further optimization, and their cumulative conversions would go well beyond effort A alone.
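
A toy illustration of that segment effect, with made-up numbers: suppose traffic splits evenly between two segments and each creative resonates with a different one. Served to everyone, A looks better; served per segment, the combination beats either alone:

    # Hypothetical per-segment conversion rates for the two creatives.
    rates = {
        "A": {"segment_x": 0.050, "segment_y": 0.010},  # blended: 3.0%
        "B": {"segment_x": 0.010, "segment_y": 0.040},  # blended: 2.5%
    }
    mix = {"segment_x": 0.5, "segment_y": 0.5}  # share of total traffic

    def blended(creative):
        return sum(rates[creative][s] * mix[s] for s in mix)

    print(f"A alone:  {blended('A'):.1%}")  # 3.0%
    print(f"B alone:  {blended('B'):.1%}")  # 2.5%

    # Route each segment to whichever creative converts it best.
    targeted = sum(max(rates[c][s] for c in rates) * mix[s] for s in mix)
    print(f"Targeted: {targeted:.1%}")  # 4.5%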

A/B Testing Is Real Work

A true, effective A/B test is not for the impatient. You must be committed to true scientific application. Next time you move into the A/B laboratory, remember that A/B testing is science, not art.

Meet Bryan at Search Engine Strategies in Toronto, Canada, May 4-5, 2005.


