Common Mistakes in Selecting and Implementing Analytics Systems
The top nine mistakes in selecting and implementing Web analytics systems.
Online publishers have long loved analytics. It was true at the beginning, when hobbyists put page counters on personal pages to track visitor counts. It’s true today, as massive commercial publishers run complex analytics systems that spit out real-time, pulsating, 3-D page-traffic reports with scroll-over metadata and multivariable cross-tabbing capabilities.
What’s changed is that the people who operate Web sites are now running real businesses. Analytics systems have assumed a new importance. They’re no longer vanity dashboards; they’re essential tools for measuring and managing businesses.
Almost no Web analytics system has more than a few years of performance history, so deciding which system to procure isn’t easy. To make the process a bit less perilous, I’ll outline the top nine mistakes people make selecting and implementing analytics systems. Although this list falls far short of a step-by-step buying guide, it maps the deepest potholes in the process.
1. Not Asking the Hard Question: Do We Need a System? Why?
Analytics projects driven by nothing more than management pronouncements that “we need metrics, get us some” inevitably end up with bad results, and too many projects are the product of exactly that. Analysis for analysis’s sake is pointless. Ask the right whys to learn which data and metrics are important and can make a difference, and know which don’t matter and can be disregarded.
2. The Rearview Mirror Syndrome: Focus on Yesterday’s Metrics
Analytics projects are often driven by people enamored with answering yesterday’s questions, not tomorrow’s. It’s an easy trap to fall into. People are comfortable with what they know. When selecting traffic metrics systems, many look for what provides data and metrics the way they’re used to receiving them, not how they should receive them.
A backwards-optimized view provides comfort… at a cost. It comes at the expense of tracking metrics that can help drive business forward: numbers the organization doesn’t yet know, or what’s unpredictable and uncomfortable but exactly where focus is needed. Instead of a clear windshield to understand traffic and the road ahead, you end up with a clear view from the rearview mirror.
3. Misunderstanding Metrics and Their Methodology: Believing Apples Are Oranges and Metrics Are Consistent
Online media and marketing are immature. So are the systems that measure them. Too many people assume systems and their outputs are fully baked and results can be taken at face value.
Reality check: They aren’t. They can’t be. A page view in one system isn’t a page view in another. The definitions and methodologies that determine metrics such as page views, visits, unique visitors, and ad deliveries differ in every system. In a world that lacks a standard definition for an ad impression (the currency of our industry), you can be sure there aren’t standards for all the peripheral metrics. Counts can even differ between products from the same vendor, often depending on whether you use its installed or ASP products, or how they’re configured. These differences can be significant: counts can vary by 5 to 30 percent.
There’s no easy answer to this problem. The only way to deal with it is to understand how various systems count, resolve that with your own needs, then accept the compromises you have no choice but to endure.
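To make the methodology point concrete, here is a small, hypothetical sketch of how the same hit log can yield different “visit” and “unique visitor” counts depending purely on counting rules. Everything here is illustrative (the log, the 30-versus-60-minute session timeout, and the policy for cookieless hits are all assumptions, not any vendor’s actual method):

```python
from datetime import datetime, timedelta

# Hypothetical hit log: (timestamp, cookie_id, ip_address).
hits = [
    (datetime(2024, 1, 1, 9, 0),  "c1", "10.0.0.1"),
    (datetime(2024, 1, 1, 9, 40), "c1", "10.0.0.1"),  # 40 minutes after the first hit
    (datetime(2024, 1, 1, 10, 0), "c2", "10.0.0.1"),  # different cookie, same IP (shared NAT)
    (datetime(2024, 1, 1, 10, 5), None, "10.0.0.2"),  # visitor rejects cookies
]

def visits_by_timeout(hits, timeout_minutes):
    """Count visits: a new visit starts whenever a visitor's gap between
    hits exceeds the timeout. Visitors are keyed by cookie, falling back to IP."""
    last_seen = {}
    visits = 0
    for ts, cookie, ip in sorted(hits):
        key = cookie or ip
        if key not in last_seen or ts - last_seen[key] > timedelta(minutes=timeout_minutes):
            visits += 1
        last_seen[key] = ts
    return visits

def uniques_by_cookie(hits):
    # One common policy: each cookieless hit counts as its own "unique."
    return len({c for _, c, _ in hits if c}) + sum(1 for _, c, _ in hits if c is None)

def uniques_by_ip(hits):
    return len({ip for _, _, ip in hits})

print(visits_by_timeout(hits, 30))  # 4 visits with a 30-minute timeout
print(visits_by_timeout(hits, 60))  # 3 visits with a 60-minute timeout
print(uniques_by_cookie(hits))      # 3 uniques counted by cookie
print(uniques_by_ip(hits))          # 2 uniques counted by IP
```

Four reasonable-sounding rules, four different numbers from identical traffic. This is why reconciling a vendor’s definitions with your own needs matters more than the raw figures themselves.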
4. Bottlenecking the Value: Leave It to the Power-User Czar
Many well-intentioned Web analytics project owners kill the project value because they bottleneck their own success. These well-meaning folks, usually the systems’ power users, create processes and procedures that place themselves at the center of everything regarding the systems: report creation, running ad hoc queries, report distribution, and troubleshooting.
Though such rules and procedures are backed by good intentions (they don’t want the systems used incorrectly or the data misinterpreted), they often prevent others from taking real advantage of the systems’ intelligence and the liberation that intelligence can bring. The only solution is to train, train, train others, then step back and let them at it. Mistakes and misinterpretations happen, but the benefits of wide use across the enterprise always outweigh them.
5. The Pretty-Picture Problem: Overvaluing Data Visualization
When it comes to Web analytics, there’s nothing people love more than lots of pretty pictures. Not that there’s anything wrong with pleasing visualization tools in data presentation. Fact is, too many people worry about pictures first and data and analysis second. Pictures cloud their minds and vision, which is exactly why vendors put them there. Graphics are great at grabbing attention, but not always great at putting data into action. What looks good in reports should be a means to an end, not the end in itself.
6. Succumbing to the Conference Table Compromise
One reason analytics projects lose focus is they begin compromised. When it’s time to decide what metrics a company should track, too many organizations follow the conference table consensus approach. They worry more about consensus than about value and accuracy.
Basically, every department gets a seat at the table. Everyone contributes suggestions. The final product is a compendium of all requests. Although this method tends to create lots of good feelings among departments, it rarely results in the best set of metrics with which to run the business. Too often, the organization finds itself tracking silly pet metrics someone at the table thought were important but that are irrelevant.
To keep projects focused, someone must act more like a benevolent dictator and less like a socialist. Decide which metrics are important and stay, and which are distracting and go.
7. Inviting the Trojan Horse Inside the Walls: Compromising Data Ownership
When budgets are tight and everyone is clamoring for better site analytics, it’s understandable that not everyone reads or fully comprehends the fine print associated with some vendors’ “partnerships.” In these models, the vendor may reserve ownership rights of the data, data aggregates, and/or metadata derived from providing analytics services.
Although these nuances of data ownership may seem innocuous, the consequences can be severe. It’s not uncommon for companies to use subsidized analytics services to create aggregated research products they sell back to the marketplace (without compensation to the publishers whose sites were harvested for the data). Or, more significantly, to use analytics services to build a database of anonymous consumer profiles and their behavior for use in ad targeting when those consumers visit other sites (again, without compensation to the publishers whose sites were harvested). It may still be a good deal, but beware.
9. Confusing Insight With Action
What good are analysis and insight if you can’t act on them? Almost all analytics systems bill themselves as actionable. Many claim they’re real time. Learn what they mean.
Few systems can enable an enterprise to take immediate, tactical steps to leverage data for value. For most, “actionable” means the system can generate reports, such as user navigation patterns publishers can mull over in meetings, then plan changes (perhaps in the next release cycle or next quarter) to improve the user experience. While that may meet the definition of actionable, it doesn’t necessarily jibe with real-time action.
Bottom line: Understand, don’t assume.
This list is far from exhaustive, but does highlight some of the most common challenges of selecting and implementing a Web analytics system. Got more examples of pitfalls and potholes? Let me know what they are.