How to Improve A/B Testing

April 29, 2005

The science of A/B testing.

A/B testing is a proven means of increasing conversion rates. But it's not as simple as it appears. It's more than simply pitting two or more versions of Web pages, banners, search ads, or whatever persuasive element you can imagine against each other. It also has limitations.

A/B testing, unlike the intuitive creative process, must be treated like a true scientific experiment. Many clients come to us frustrated by the results of their tests. Almost without fail, we discover their tests don't conform to sound experimental design. Most just heave two versions of an element at the wall to see what sticks.

Unless you're one of those learn-the-hard-way types, benefit from our experience. Here's what we've learned from seven years of trial and error.

One Change at a Time

When beginning an A/B test, establish a control and baseline for whatever you're testing (e.g., Web page, landing page, banner ad). Put your best creative effort online first. Monitor its performance over an established period.

Once a baseline is established, start optimizing the page by making one change at a time. More than one makes it impossible to determine which change actually made a difference.
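
For illustration, here's a minimal sketch, not from the original column, of one way to decide whether a single-change variant genuinely beats the baseline: a standard two-proportion z-test. All visitor and conversion counts below are assumed.

```python
from math import sqrt, erf

def one_sided_p_value(conv_a, n_a, conv_b, n_b):
    """One-sided p-value that variant B's conversion rate exceeds
    baseline A's, via a pooled two-proportion z-test."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)               # pooled rate under the null
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    return 1 - 0.5 * (1 + erf(z / sqrt(2)))                # P(Z > z), standard normal

# Baseline: 300 conversions from 10,000 visitors (3.0%).
# Variant with ONE change: 345 conversions from 10,000 visitors (3.45%).
p = one_sided_p_value(300, 10_000, 345, 10_000)
print(f"one-sided p-value: {p:.4f}")  # ~0.036; below 0.05, so the change likely helped
```

Because only one variable changed, a significant result can be attributed to that change alone.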

A client wanted to change the way the box for their lead-generation form appeared. They thought they were testing just the form box. I pointed out that changing the background color, headline, copy, and box position amounted to changing multiple variables at once, and should therefore be spread across multiple tests.

This is a limitation of A/B testing; conversion improvements can only be made incrementally.

Clicks Don't Exist in a Vacuum

Customers participate in scenarios on your site or in your marketing efforts whether you plan them or not. Each page or touch point exists in the context of what visitors saw on the previous page. When conducting an A/B test, consider all the elements of the scenario. Often, a change to some other element affects the test page's results.

A page with high exits or a call-to-action form with high abandonment may not actually be problems. Consider the page visitors viewed prior to the test page. How many of us have clicked a "Free iPod" banner, only to find a landing page detailing the painful process of getting a free iPod? We could optimize and A/B test the landing page ad infinitum, but the real problem is the misleading banner.

Recently, to increase an email's open rate, we tested a new "free offer" subject line. The email body remained the same. Though the open rate went up, the conversion rate and total conversions dropped. Instead of returning to the original subject line, we modified the email copy. The test copy's conversions strongly outperformed the original's.

The new subject line changed the email's context. Until we changed the body copy to echo the subject line, we were losing momentum. Users in the first test of the new subject line were simply not seeing the free offer in the original body copy.
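
To make the arithmetic concrete, here's a hypothetical worked example; the column doesn't report the actual figures, so every number below is assumed. It shows how a catchier subject line can raise opens yet lower total conversions when the body copy doesn't follow through.

```python
emails_sent = 10_000

# Original subject line: fewer opens, but the body matches expectations.
opens_old = emails_sent * 0.20        # 20% open rate -> 2,000 opens
conversions_old = opens_old * 0.05    # 5% of openers convert -> 100 conversions

# "Free offer" subject line over the unchanged body: more opens, broken context.
opens_new = emails_sent * 0.30        # 30% open rate -> 3,000 opens
conversions_new = opens_new * 0.025   # 2.5% of openers convert -> 75 conversions

print(conversions_old, conversions_new)  # 100.0 75.0: open rate up, conversions down
```

The metric you optimize (opens) and the metric you care about (conversions) sit at different points in the scenario, so always measure a test's effect downstream.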

In complex selling scenarios, many sites successfully funnel high volumes of traffic to a call-to-action form. Yet they test different versions of the form with little success. Usually, this indicates prospects don't have enough information or resolve to complete the form. The problem isn't the form; it's the scenario visitors participate in.

Small Changes, Big Differences

Never assume a seemingly inconspicuous change will bring inconspicuous results. The day of the week, a font, or a color palette shift can dramatically affect test results. Just moving a few buttons around on a checkout page can cause steep drops in conversions.

There are really only 20 design principles that can affect conversion. Don't take any of them lightly.

Testing Two Very Different Creatives

Many marketers run an "A/B test" on two significantly different offers or two significantly different creative efforts.

If effort A returns a conversion rate of 3.0 percent and effort B returns a conversion rate of 2.5 percent, many will abandon effort B for the seemingly more efficient A. This could be a colossal mistake.

That's not a true A/B test. There's no way to determine whether an optimized effort B would outperform an optimized effort A. Think of these as two different tests rather than one.

Often, we've found the two efforts speak effectively to two very different audience segments. Both may be worthy of further optimization, and their cumulative conversions would go well beyond effort A alone.
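
As a purely hypothetical illustration (all rates and traffic splits below are assumed), the arithmetic might look like this: two separately optimized efforts, each shown to the segment it resonates with, can out-convert the apparent winner shown to everyone.

```python
visitors = 20_000

# Effort A alone, shown to all traffic at its blended 3.0% rate:
a_alone = visitors * 0.030             # 600 conversions

# After further optimization and segment targeting (assumed rates):
segment_a = 10_000 * 0.045             # effort A on its segment -> 450
segment_b = 10_000 * 0.040             # effort B on its segment -> 400

print(a_alone, segment_a + segment_b)  # 600.0 vs. 850.0 conversions
```

Abandoning effort B on the blended numbers alone would leave those extra conversions on the table.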

A/B Testing Is Real Work

True, effective A/B testing is not for the impatient. You must be committed to a truly scientific approach. Next time you step into the A/B laboratory, remember: A/B testing is science, not art.

Meet Bryan at Search Engine Strategies in Toronto, Canada, May 4-5, 2005.


ABOUT THE AUTHOR

Bryan Eisenberg

Bryan Eisenberg is co-founder and chief marketing officer (CMO) of IdealSpot. He is co-author of the Wall Street Journal, Amazon, BusinessWeek, and New York Times best-selling books Call to Action, Waiting For Your Cat to Bark?, Always Be Testing, and Buyer Legends. Bryan is a keynote speaker who, for the past 10 years, has keynoted conferences around the world, including Gultaggen, Shop.org, the Direct Marketing Association, MarketingSherpa, Econsultancy, Webcom, and the Canadian Marketing Association. Bryan was named a winner of the Marketing Edge's Rising Stars Awards, recognized by eConsultancy members as one of the top 10 User Experience Gurus, selected as one of the inaugural iMedia Top 25 Marketers, and has been recognized as among the most influential in PPC, Social Selling, and OmniChannel Retail. He serves as an advisory board member of several venture-capital-backed companies, including Sightly, UserTesting, Monetate, ChatID, Nomi, and BazaarVoice. He works with his co-author and brother, Jeffrey Eisenberg. You can find them at BryanEisenberg.com.
