Until you act on Web analytics data to improve site performance, the return on investment (ROI) on your Web analytics spend is zero.
Nothing, nada, zilch.
When it’s time to act on that data with A/B or multivariate testing, be prepared for roadblocks. Know, too, that you can overcome them.
Common roadblocks include a lack of focus on key performance indicators (KPIs), incorrect interpretation of visitor behavior, resource limitations, and lack of skills to identify potential problems and data accuracy issues. Even if you do overcome those issues, there’s one that still often arises: working with your IT department to implement the tests.
Each time we recommend a client do A/B or multivariate testing, we’re told it’s just not possible “in our environment.” This doesn’t happen once in a while. It happens every time we recommend this type of testing to an organization that hasn’t done it before. The power of testing is amazing, and multiple tests over time can deliver far greater returns than simply swapping out content, calls to action, and so forth.
The first time is the hardest, but once you get through those first few tests and everyone sees their value, it becomes much easier. A handful of issues usually lead to an immediate “no” from IT:
- IT doesn’t understand the process and how it will affect the Web site.
- IT doesn’t know testing’s prioritization in its queue.
- No one has a clear plan on how to use the analytics to improve site performance.
- There’s no executive support to help push this paradigm shift in the organization.
IT Doesn’t Understand the Process
It’s natural for the IT team to say no immediately. They frequently don’t understand the process, often are already buried, and think it will create a ton of work. There are a number of ways to mitigate this resistance.
You can work with a company such as Offermatica or Optimost for multivariate tests. You can also create a script that points people to the different versions of an A/B test, using a cookie to ensure each visitor sees only one version of the test. The latter can easily be created by a competent developer in about a day. Some analytics tools even help structure and measure test performance. Omniture currently leads this space with SiteCatalyst’s version 12 release. You still must roll out the test on your own, however.
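To show how small that one-day script really is, here’s a minimal server-side sketch of the cookie-based bucketing described above. All names (the variant map, cookie name, and `assign_variant` function) are our own illustrations, not part of any vendor’s product:

```python
import hashlib

# Hypothetical variant map: variant key -> page URL for that version.
VARIANTS = {"a": "/landing-a", "b": "/landing-b"}
COOKIE_NAME = "ab_variant"

def assign_variant(visitor_id, cookies):
    """Return (variant, url) for this visitor.

    If the visitor already carries the test cookie, reuse it so they
    only ever see one version of the test. Otherwise, assign a
    deterministic 50/50 split by hashing the visitor ID.
    """
    variant = cookies.get(COOKIE_NAME)
    if variant not in VARIANTS:
        digest = hashlib.md5(visitor_id.encode("utf-8")).hexdigest()
        variant = "a" if int(digest, 16) % 2 == 0 else "b"
    # The caller would also set COOKIE_NAME=variant in the response.
    return variant, VARIANTS[variant]
```

In a real deployment this logic would sit in whatever serves the page (or a redirect endpoint), and the chosen variant would be written back as a cookie so repeat visits stay consistent.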
The IT group, as well as many other groups necessary to roll out a test, probably aren’t waiting around to help. When you invite them to help with the test, they’re usually pulled away from other work. We’ve seen this most prominently within IT, as usually there are months of initiatives lined up for them to work on. The newest thing is usually added to the end of the queue. That could be months out.
To help increase the priority of testing and site optimization at our company, we monetize each recommendation: we assign a dollar value to the range of return we expect from the test and calculate its ROI.
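The monetization itself is simple arithmetic. This sketch uses made-up illustrative numbers (traffic, conversion rate, lift range, order value, test cost); plug in your own figures:

```python
def monetize_test(visitors, baseline_cr, lift_low, lift_high,
                  order_value, test_cost):
    """Turn an expected conversion-lift range into a dollar range and ROI.

    Returns (gain_low, gain_high, roi_low, roi_high) where the gains are
    the expected monthly revenue increase and ROI is gain / test cost.
    """
    baseline_revenue = visitors * baseline_cr * order_value
    gain_low = baseline_revenue * lift_low
    gain_high = baseline_revenue * lift_high
    return gain_low, gain_high, gain_low / test_cost, gain_high / test_cost

# Example: 100,000 monthly visitors, 2% conversion, $50 average order,
# an expected 5-15% lift, and a $5,000 test cost.
low, high, roi_low, roi_high = monetize_test(
    100_000, 0.02, 0.05, 0.15, 50, 5_000)
# Gain range: $5,000-$15,000/month; ROI range: 1.0x-3.0x.
```

A range, rather than a single number, keeps the estimate honest while still giving executives and IT a concrete figure to prioritize against.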
No Clear Plan on How to Use Analytics
After spending money on the analytics tool and implementing and configuring it, the work has just begun. But most people don’t really know where to go from there. Unless a methodology is put in place and shared throughout the organization (top to bottom, or at least across the Web group), no one will know where to start or how the data is to be used. When someone approaches IT requesting a test in two weeks, with fine-tuning two weeks after that and more tests on a regular basis, it catches IT off-guard. Yet the testing workload can be fairly light, depending on the approach you choose.
Lack of Executive Support
Without executive support for using testing to constantly improve site performance, getting people to move as fast as you’d like may be tough. Share the plan, monetization, and potential outcome with a senior person. Get her buy-in. Once you do, you’ll be amazed how fast barriers come down.
A few months ago, we worked with a very large organization that had invested a significant amount in its analytics tool. But it really hadn’t been able to identify opportunities and act on findings to improve site performance. After working with them to define overall site goals, proper metrics, and KPIs, we helped get the tool configured to report on those elements. We also rolled out our methodology of how to use the data.
We identified a series of priorities and started talking to the IT group about different testing opportunities. Of course, we were told it wasn’t possible on their platform and they didn’t have any resources and weren’t cleared to get any more.
We worked with IT to understand the effect and requirements for the different testing opportunities. The only real issues were resources and other priorities. We’d already gained conceptual support on the process from the executive group. We went back based on the specific recommendations and showed their monetized impact on the bottom line. We tied this right back to the overall business and site goals. We estimated the costs of conducting the first two tests working with an outside partner, as well as internal costs for the company.
Based on the changes’ estimated outcome, the decision to free up budget and resources and to realign priorities was easy. The initial test rolled out, and its lift in the monetized metric fell within the estimated monthly range (and remains on target for the annual goal).
This type of testing can be incredibly powerful. Don’t take no for an answer. Make sure you’re clear about the potential financial outcome, that everyone understands the process, and that you have executive support for the overall effort.