Prove the Value of Your Analytics Tools, Team, and Web Site

Most business groups in an organization are required to defend the expense of their team and tools. Analytics teams have a particular advantage in this arena. Teams that put analytics data to use can often realize a return on investment (ROI) that quickly justifies their expenses many times over.

Tool providers are quick to point out how rapidly their tools pay for themselves. That’s not just a sales pitch: those providers are often correct, and in some cases conservative, in their calculations, provided the analytics data are actually used to improve site performance. Even small adjustments to copy, design, or site architecture can alter key visitor behaviors and yield significant financial gains. Unfortunately, those promises often go unrealized, through no fault of the analytics tool.

Analytics Data in Action

With analytics data, we recently helped one client, a provider of a monthly consumer service, identify the source of lagging online sales. We discovered that most visitors who viewed one of the main product pages didn’t respond to the call to action (CTA) to begin the ordering process.

Since the trouble involved one of the site’s top product offerings, the client decided to invest in A/B testing to identify the most effective solution.

A/B Testing: What, Why, and How

  1. Define the page’s goal. Identify exactly what you want people to do on that page.

  2. Identify the test’s success metrics. Plan how you’ll measure the success of each layout, CTA, or user flow tested. Success metrics may include objectives such as:
    • Increase overall order conversion
    • Improve the percentage of visitors who start the ordering process
    • Reduce the page’s and site’s exit rates

  3. Form a hypothesis. Ask yourself what prevents people from accomplishing the page goal. There may be several conflicting theories, such as providing too little or too much information.

  4. Determine the number of ideas you want to test. You may want to test two or more page designs with distinct visual, conceptual, and textual approaches against the existing page.

  5. Estimate the test time to ensure statistical significance. Depending on the actual traffic and the difference between variants, you may need to adjust the period once the test begins; initial results may differ significantly from final results.

  6. Evaluate each design’s or concept’s performance.

  7. Fine-tune the best-performing designs. Focus on multivariate testing to maximize the page’s performance.

In this instance, we decided to go with five very different designs for the initial A/B test, since the page had not previously been optimized. This wasn’t a case where the page just needed to be fine-tuned; we needed to test completely different concepts and hypotheses.

Based on the initial test results, we moved into phase two, focusing on multivariate testing. This included variables such as specific CTA wording, headline phrasing, and design-element placement. Each was tested on its own to arrive at a final, winning design.

Cash in on the Results

After the first round of testing, our client increased the conversion rate (to completed order) by 0.5 percentage points. Though that may seem like a small change, the effect on revenue was substantial. Over a 12-month period, this conversion improvement is projected to generate more than $1.5 million in additional online sales for just one product.
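For readers who want to sanity-check a projection like this, the arithmetic is simple. The client’s traffic volume and order value aren’t disclosed in this article, so the inputs below are hypothetical placeholders, chosen only to show how a 0.5-percentage-point lift can plausibly scale to seven figures.

```python
def projected_annual_lift(monthly_visitors, lift_pp, avg_order_value):
    """Extra annual revenue from an absolute conversion-rate lift.
    lift_pp is in percentage points (e.g. 0.5 means +0.5pp)."""
    extra_orders_per_month = monthly_visitors * (lift_pp / 100)
    return extra_orders_per_month * avg_order_value * 12

# Hypothetical inputs: 500k product-page visitors/month, $50 average order.
# With these assumptions, a +0.5pp lift works out to about $1.5M per year.
print(projected_annual_lift(500_000, 0.5, 50))
```

Plugging in your own traffic and order values is a quick way to decide whether a testing program of this kind is worth the investment before you start.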

Results can be realized fairly rapidly. The entire test, from problem identification to fine-tuning of the winning concept, was completed in less than two months, and most of that time was spent running the actual test.

Use the Data, Improve Your Site

If you aren’t acting on the data and insights analytics provide to improve site performance, you’re only looking in the rearview mirror: what happened on your site yesterday, last week, or last month. That’s good to know, but it alone won’t make your case for stellar ROI.
