3 Reasons Why Analytics Disappoints and How Not to Be Disappointed

July 16, 2012

Your disappointing analytics could be due to business misalignment, poor implementation, or company politics.

Many analytics experts will probably agree that they very often come across organizations where analytics is not only "unadvanced" but disappointing. Where analytics isn't being adopted by the executives who need to make decisions based on performance data. Where, for one reason or another, analytics isn't delivering much value at all.

Rare is the case where analytics is entirely tossed out (that would be an admission of failure!). Much more common is the "back burner" syndrome. Analytics chugs along quietly, ineffectively, inaccurately, unnoticed.

Here are three major reasons why analytics may seem a disappointment (and remedies for each):

No. 1: Business Misalignment

Ask the question "why do you have a website?" and sometimes the response is a deafening silence. No one wants to say "because everybody has one," because they know that isn't a good enough reason; no one wants to say "to sell stuff," because they know that's too broad. After that, you're liable to get five different answers from five different constituents.

Dumb as the question sounds, it's the one that, when answered incompletely or carelessly, generates the most frustration with analytics.

If you don't know why you have a site, it means you don't know what matters to you about user behavior. If you don't know what matters to you about user behavior, you don't know what kind of behaviors to measure. If you don't know what to measure, you just measure everything or nothing, and the result is much the same either way: meaninglessness.

How to Get Better Aligned

Convene a meeting of your business leaders and your web content stakeholders. Facilitate a discussion of what business outcome each part of the site is attempting to create. Broadly, there will be only a few possibilities, and within these there may be drilldowns. The broad business objectives for sites are generally as follows:

    • E-commerce: sell stuff online.
    • Content/branding: get visitors to spend time on the site viewing messaging from your organization or your advertisers.
    • Lead generation: lead nurturing, list-building, informative downloads, sales-call triggers.
    • Self-service: think of intranets, insurance portals, problem-resolution databases.

Nearly every site will fit entirely inside one of these large buckets, or have a presence in a couple of them at once. Whatever the site seeks in terms of business outcomes will then determine the specific type of reporting you'll want out of analytics.

By customizing your analytics tool to answer the business questions generated by the abovementioned goals, you will avoid misalignment - one of the most common and destructive failures in analytics.
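To make that alignment concrete, here is a minimal sketch in TypeScript. The objective names and metrics below are hypothetical examples chosen for illustration, not a prescribed report set from any particular tool:

```typescript
// Illustrative sketch only: a hypothetical mapping from broad site
// objectives to the behaviors worth measuring for each.
type SiteObjective = "e-commerce" | "content-branding" | "lead-generation" | "self-service";

const measurementFocus: Record<SiteObjective, string[]> = {
  "e-commerce": ["revenue", "conversion rate", "average order value", "cart abandonment"],
  "content-branding": ["time on site", "pages per visit", "return visits", "ad impressions"],
  "lead-generation": ["form completions", "download counts", "cost per lead", "lead quality"],
  "self-service": ["task completion rate", "search success", "support-call deflection"],
};

// Reporting requests follow from the objective, not the other way around.
console.log(measurementFocus["lead-generation"]);
```

The point is simply that once the objective is named, the questions the tool must answer (and therefore its configuration) follow from it.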

No. 2: Poor Implementation

Many frustrated analysts have fallen victim to a sub-par or simply incomplete implementation. Much of the work in getting analytics to answer relevant business questions turns out to be non-obvious and not easy for the occasional user to implement correctly. This persistent complexity, when combined with architecture issues, multiple developers, and simple lack of tool-familiarity, results in what some call a "broken" implementation.

In a broken implementation, analysts will note that data seems incorrect or impossible to reconcile; calls to the tool vendor refer one back to whoever did the implementation (for example, the tagging and report building); developers often lack the understanding to even know what they did wrong; and the analyst is left without any way to do a good job for the company.

Just to give you an example of how typical this can be, imagine a fairly sizable organization deploying a Flash or Ajax module inside their digital offering. With two or three teams of developers involved in creating pages, putting tags on pages, and making pages go live, the communication is already prone to misunderstanding and error. Compounding this is a lack of understanding of how "tagging" actually works. The result is repeated failure to get the module to send tracking information to the analytics engine, even long after launch.

Result: the analyst - and the organization - is left blind as to the success of the campaign, in this case a rather expensive one.
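To illustrate the gap, here is a minimal sketch, in TypeScript, of the kind of call such a module has to make on its own. The function names (loadCampaignModule, trackVirtualPageview, trackEvent) are hypothetical stand-ins for whatever your analytics vendor's JavaScript API actually provides:

```typescript
// Hypothetical vendor tag functions assumed to be loaded on the page;
// the real calls depend on your analytics tool's JavaScript API.
declare function trackVirtualPageview(path: string): void;
declare function trackEvent(category: string, action: string, label?: string): void;

async function loadCampaignModule(containerId: string): Promise<void> {
  const container = document.getElementById(containerId);
  if (!container) return;

  // Fetch the module's content via Ajax; no page load occurs here,
  // so the base page tag alone records nothing about this interaction.
  const response = await fetch("/campaign/module.html");
  container.innerHTML = await response.text();

  // The step that is often forgotten: explicitly report the view.
  trackVirtualPageview("/campaign/module");

  // Report meaningful interactions inside the module as events.
  container.querySelector("#cta-button")?.addEventListener("click", () => {
    trackEvent("campaign-module", "cta-click", "spring-launch");
  });
}
```

Because the content arrives via Ajax rather than a normal page load, the base page tag never fires for it; unless the module explicitly reports the view and its interactions, the analytics engine records nothing.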

How to Improve Implementation

Remember that page-tagging, tool implementation, and custom code development require specific expertise. Sometimes it's difficult to see that, with all the web talent in the organization, this remains a significant gap - and yet in many organizations it is enormous. Some companies look to contractors, and this can be a great help, assuming you get a contractor who is knowledgeable and reliable. Others hire specialized agencies to take on the task and enjoy success, though at a cost in dollars. Training for existing in-house talent is often only a stopgap; most training is geared toward helping people use analytics once it has already been properly set up.

Finally, some organizations, thinking further ahead than most, will hire an individual employee or an internal team focused solely on analytics deployment. This can also be a great solution, though it carries its own significant costs. And there's always the risk that the knowledge requirements shift away from the specific skill set of the hired specialist(s).

Whichever way you choose to fix a poor implementation, make sure it includes dedicated expertise in both the business and the technology; deep expertise will result in a much more robust and effective analytics platform. Try not to rely on users with only a shallow understanding of how analytics tools, tags, and interactive architecture must work together to deliver meaningful insight.

No. 3: Company Politics

OK, lots of things fail because of company politics - not just digital analytics.

But there are particular ways that politics gets in the way of digital analytics, chiefly related to misunderstandings or misinterpretations of roles and responsibilities as they relate to technology and content.

Put simply, the "measuring" should never be managed by the "measured." This is because no one wants to be forced to be objective about the success or failure of their own efforts. And when put in that position, they may sometimes behave in what might be called an "obstructionist" manner, even if they are otherwise very helpful and aboveboard. Of course this is not universal. But it is a noticeable tendency.

In practice, this means that the agency or marketing team responsible for putting up content (especially if they are a third party) should be told that measurement of the content will be handled by someone else - and that cooperation is a condition of the engagement.

Too often, analytics goes down a rabbit hole and never reappears once a third-party creative shop gets involved in performing it. And often enough it's because of a lack of throughput on the agency side: either they don't know how to tag and implement with enough expertise, or they put it on their "later" pile because they have no upside in doing otherwise.

How to Get Past Company Politics

The best way to handle this problem is to treat analytics as a discrete project assigned to a group that specializes in it and has a clear upside in making sure it's done properly. Almost universally, this will result in a far better state of analytics than leaving it in the hands of people who have no vested interest in its success.

Aligning expertise with a properly identified business need - in this case, analytics expertise with a need for accuracy and objectivity - will drive your analytics effort away from the whirlpool of competing interests.

Adoption Is Key to Web Optimization

You'll probably find that more targeted, more accurate, more objective, more consistently reliable analytics data results in higher adoption rates. This means that people who need to look at the data will look at the data. And then they can make decisions based on what they see. But if the landscape is littered with meaningless reports, inaccuracy, and tardiness, expect low adoption and low impact. And in the end, low impact for analytics can leave your organization at a distinct disadvantage - because the competition may have figured out last week how to stop being disappointed in their analytics.

ABOUT THE AUTHOR

Andrew Edwards

Andrew is a digital marketing executive with 20 years' experience serving the enterprise customer. Currently he is Managing Partner at Efectyv Digital, a digital marketing consulting company, and Managing Partner at Technology Leaders, a web analytics consulting firm he founded in 2002. He combines extensive technical knowledge with a broad strategic understanding of digital marketing and especially digital measurement, plus hands-on creative work in the form of writing, user experience, and traditional design.

His practice is dedicated to building customers' digital marketing success and helping them save money during the process.

He is a writer, a public speaker and a visual artist as well.

His book "Digital is Destroying Everything—and What Comes Next" will be published by Pearson in the Spring of 2014. He writes a regular column about Analytics for ClickZ, the 2013 Online Publisher of the Year. He wrote the groundbreaking "Dawn of Convergence Analytics" report which was featured at the SES show in New York, and the second report in the series will be featured at the same show in San Francisco.

In addition to speaking at SES, he has presented at eMetrics; and his session was voted one of the top ten presentations at the DMA show in Las Vegas. He is speaking again at the DMA in Chicago in the fall of 2013.

In 2004 Andrew co-founded the Digital Analytics Association and is currently a Director Emeritus. He has designed analytics training curricula for business teams and has led seminars on digital marketing subjects.

He was also an Adjunct Professor at The Pratt Institute where he taught Advanced Computer Graphics for 3 years. Andrew is also an award-winning, nationally exhibited painter.
