I have heard many people opine that landing page testing is a silver bullet: you pick some page elements to test, collect your data, and all of a sudden you have a better-performing landing page. In reality, not all of your test plan ideas are going to make a positive impact on your conversion rate. Unfortunately, you don't know ahead of time which of your variable values will be successful. If you did, you would not need testing in the first place.
A certain kind of mental reframing is required to give proper perspective to landing page optimization:
Each Landing Page Test Has a Cost
You must expend time and effort to set up the test, monitor data collection, and analyze the results. Even if your test is successful, you still have to consider the alternative marketing activities you could have been engaged in instead of testing, and the opportunity cost of not doing them. In other words, if you have bigger opportunities for improving your program's profitability, attend to those first.
A Test May Not Yield Any Positive Results
It is possible that the tuning elements you have selected will not increase conversions. A few may actually make things worse and lower your conversion rate. At the end of the test, your original baseline recipe may remain the champion, having bested all challengers. This is not a problem. You cannot judge your testing program on the outcome of a single test. You will guess wrong a significant percentage of the time when selecting alternative variable values to test, but this should not deter you from trying. Testing is an ongoing activity; until you have exhausted all of your meaningful ideas, you should keep trying.
However, it is likely that you will see a law of diminishing returns if you continue to tune the same page over and over. Chances are you will get your biggest gains during early tests, because at that point your landing page is in its worst shape and your ideas are most numerous and original. During subsequent tests, you will probably be tinkering with smaller refinements that are not as likely to produce dramatic conversion improvements. So you have to soberly evaluate whether another test on the same page is warranted.
Performance May Drop During a Test But Still Lead to Positive Results
It is possible that the mix of alternative recipes you are testing will perform worse than the baseline recipe. Often some of your variable values are worse than their baseline counterparts while others are better, creating a sampled blend of recipes with lower overall conversion. As a result, you will often see a significant drop in revenue early in your test.
Don't panic or abort your test. Have the self-discipline to collect data with high statistical confidence.
If some variable values or recipes continue to underperform, you can eliminate them from your testing mix. Eventually, after several experiments (or follow-up test runs), you should be able to cut out the poor performers and focus on what's working the best. This may get you into positive territory (where your final challenger recipes perform better than the original baseline).
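The "high statistical confidence" mentioned above is usually checked with a two-proportion z-test between the baseline and a challenger recipe. A minimal sketch, using entirely hypothetical visitor and conversion counts (the function name and numbers are illustrative, not from the original article):

```python
import math

def z_test_conversions(conv_a, n_a, conv_b, n_b):
    """Two-proportion z-test: baseline (a) vs. challenger (b) conversion rates."""
    p_a = conv_a / n_a                      # baseline conversion rate
    p_b = conv_b / n_b                      # challenger conversion rate
    p_pool = (conv_a + conv_b) / (n_a + n_b)  # pooled rate under the null hypothesis
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    return (p_b - p_a) / se                 # z-score of the observed difference

# Hypothetical numbers: baseline converts 200 of 10,000 visitors (2.0%),
# challenger converts 260 of 10,000 (2.6%).
z = z_test_conversions(200, 10_000, 260, 10_000)
print(round(z, 2))  # → 2.83
```

A |z| above roughly 1.96 corresponds to about 95 percent confidence (two-tailed), so this hypothetical challenger's lift would be unlikely to be noise. Ending the test before the z-score clears your threshold is exactly the "panic and abort" mistake the article warns against.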
In one particular series of landing page tests that my firm ran, we started out with all of the possible recipes that we were considering. The resulting mix initially performed 19 percent worse than the baseline. Had we ended the program at this point, we would have concluded that we did not find anything better than the baseline. However, over several additional test runs we zeroed in on successively better-performing recipes, and were eventually able to identify a challenger recipe that performed 27 percent better than the baseline in the final head-to-head test.
However, all is not rosy simply because of the positive outcome. The shaded portion of the graph below the "0%" line is proportional to the revenue lost during the data collection period, and the shaded portion above that line represents the extra revenue collected during the test as a result of improved performance. As you can see, the lost revenue is greater than the extra revenue, indicating a net loss of revenue during the test. But this is okay, because the new and improved landing page will presumably continue to outperform the original for a very long time. So the company will recoup the difference and then accrue additional revenue going forward.
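The recoup logic above is simple arithmetic: divide the net revenue shortfall accumulated during the test by the extra revenue per period that the winning recipe generates. A small sketch with entirely hypothetical dollar figures (only the 27 percent lift comes from the case study above):

```python
# Hypothetical figures for illustration only.
baseline_weekly_revenue = 50_000.0  # assumed revenue per week before the test
net_loss_during_test = 30_000.0     # assumed shortfall accumulated while testing
lift = 0.27                         # 27% improvement from the winning recipe

extra_weekly_revenue = baseline_weekly_revenue * lift          # 13,500/week
weeks_to_recoup = net_loss_during_test / extra_weekly_revenue  # ~2.2 weeks
print(round(weeks_to_recoup, 1))  # → 2.2
```

After the break-even point, every additional week of the improved page is pure gain, which is why the short-term dip is worth tolerating.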
There's No Free Lunch
You must be willing to suffer short-term pain during landing page testing in order to attain the long-term gain of improved performance and higher conversions.
Tim Ash is CEO of SiteTuners.com, a landing page optimization firm that offers conversion consulting, full-service guaranteed-improvement tests, and software tools to improve conversion rates. SiteTuners' AttentionWizard.com visual attention prediction tool can be used on a landing page screenshot or mock-up to quickly identify major conversion issues. He has worked with Google, Facebook, American Express, CBS, Sony Music, Universal Studios, Verizon Wireless, Texas Instruments, and Coach.
Tim is a highly regarded presenter at SES, eMetrics, PPC Summit, Affiliate Summit, PubCon, Affiliate Conference, and LeadsCon. He is the chairperson of ConversionConference.com, the first conference focused on improving online conversions. A columnist for several publications including ClickZ, he's host of the weekly Landing Page Optimization show and podcast on WebmasterRadio.fm. His columns can be found in the Search Engine Watch archive.
He received his B.S. and M.S. during his Ph.D. studies at UC San Diego. Tim is the author of the bestselling book, "Landing Page Optimization."
Connect with Tim on Google+.