Three Reasons Analytics Fail Companies

As I begin new analytics engagements, I often feel like Bill Murray in “Groundhog Day,” reliving the same morning over and over. I repeatedly run into smart people who install powerful Web analytics tools, then struggle to derive actionable data. Like Murray’s character, I haven’t just stood there; I’ve spotted recurring phenomena. In his case, they helped him get the girl. Mine may help you finally squeeze real return on investment (ROI) from your analytics investment.

From very large, complex multinationals to smaller Web-only groups, I see the same three recurring problems.

Issue No. 1: Lack of Defined Goals/Consensus on Goals

Defining site goals is the first step when preparing an analytics plan. Without solid site goals, it’s nearly impossible to improve a site using Web analytics data.

Different people in your organization may have very different priorities. Include all key stakeholders during the goals definition process. Ideally, this process goes all the way to the top of the organization. One way to start the conversation is to ask people to define desired behavior for a site, section, even specific page. What makes a visit successful from their perspective?

With primary goals defined for each section or page, define supporting metrics for each goal. Then, it’s on to the configuration (or reconfiguration) of your reporting tool so it serves meaningful data. Remember — as your business changes, so will your goals. Revisit this process regularly.
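One lightweight way to make that configuration step concrete is to write the agreed goals and their supporting metrics down as data before touching the reporting tool. The sketch below is hypothetical (the section names, goals, and metric names are invented for illustration), but it shows the shape of an explicit goals-to-metrics map that everyone can review:

```python
# Hypothetical sketch: record each site section's agreed goal and the
# supporting metrics that track it, so the reporting tool is configured
# against an explicit list rather than ad-hoc requests. All names here
# are illustrative, not from any particular analytics product.
site_goals = {
    "product_pages": {
        "goal": "Move visitors into the checkout funnel",
        "metrics": ["add_to_cart_rate", "exit_rate"],
    },
    "support_section": {
        "goal": "Resolve questions without a phone call",
        "metrics": ["search_refinement_rate", "contact_form_submissions"],
    },
}

def metrics_for(section: str) -> list[str]:
    """Return the agreed supporting metrics for a given site section."""
    return site_goals[section]["metrics"]
```

Because the map lives in one place, revisiting it as business goals change is a matter of editing a few lines, then reconfiguring reports to match.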

Issue No. 2: Ineffectively Shared Analytics Data

At the start of an engagement, I frequently find only one person reviews Web analytics data: the person maintaining the analytics tool. People in other departments may pose one-off queries, but those who could use a top-down summary to really drive change aren't asking for it. So they don't get it.

That changes once you've reached a broad, healthy consensus on Web site goals. But beware of how the data is delivered: too often, the full data summary lands with an intimidating thump, like the Manhattan phone book. When it's overwhelming, people lose interest.

I create one-page graphical scorecards that report the progress toward specific site goals. This gets everyone focused on the right metrics. Recipients become more likely to use the analytics tool to expose specific audience behaviors.

Issue No. 3: Inaction on Data

Web analytics data are too often viewed as last month’s report card. The real question isn’t, “How did we do?” but, “What do we do next?” Once goals are established and stakeholders are looking at the data, it’s much easier to use those numbers to improve site performance.

Clickstream analysis may point to a problem on one particular page. That, in turn, may point to a usability expert to help pinpoint the issue or to designers or copywriters who can address it. If multiple new ideas are tested simultaneously (A/B testing), you must manage site visitor distribution among the options.
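One common way to manage that visitor distribution is deterministic bucketing: hash a stable visitor ID so each visitor always sees the same option, and traffic splits roughly evenly across the variants. This is a minimal sketch of that general technique, not any specific vendor's tool; the variant names and IDs are invented for illustration:

```python
import hashlib

def assign_variant(visitor_id: str, variants: list[str]) -> str:
    """Deterministically assign a visitor to one test variant.

    Hashing a stable ID means the same visitor always lands in the
    same bucket, and buckets fill roughly evenly across variants.
    """
    digest = hashlib.md5(visitor_id.encode("utf-8")).hexdigest()
    bucket = int(digest, 16) % len(variants)
    return variants[bucket]

# Illustrative usage with hypothetical variant names:
variants = ["control", "new_headline", "new_layout"]
print(assign_variant("visitor-12345", variants))
```

Consistency matters here: if a returning visitor bounced between designs on each visit, the behavioral data for each option would be hopelessly muddied.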

The fun part comes when companies finally get a grip on Issue No. 3. When they see analytics can actually drive site performance, not just monitor it, they quickly commit to regular, cyclical, data-driven site optimization. Like Murray in “Groundhog Day,” once they stop watching things happen around them and start leveraging the information they’ve got to make meaningful changes, their fortunes change, too.

If every morning at the analytics desk seems identical and you don’t know how to get out of that reactive loop, check for one or more of these issues in your own enterprise. I can’t promise true love (that was Murray’s reward), but addressing them will make your Web site more vital — and your organization more aware of the site’s value.
