Analytics Bridge to Nowhere

What happens when you've planned for day one but not day 150.

It was easier than we thought. It took little more time than we’d planned. The developers put in a universal tag. We enabled a spreadsheet. We served cookies (at least that’s what I heard). The test data came in. We clicked some buttons in the analytics tool and got dashboards. When the site launched, the early word was that we had increased our visitors.

We were happy! We bought ice cream for everyone. Our most senior marketer told me: “You have built a bridge to the future of marketing through analytics.”

Two months later, the senior marketer was gone. He had taken a position behind an espresso machine. The developers were still clinking glasses over the successful launch when they got the sack. I hid in a broom closet until the shouting stopped.

What happened?

Bridge to Nowhere

We had built a bridge to nowhere. We had no way of checking for accuracy, no way of analyzing the data that came back, no way of answering business questions, and no buy-in from anyone that they would need to change anything depending on what the analytics said.

All we had done was “implement the tool.” The tool was much like the other tools, and we followed the guidelines so that all the reports would “come out” looking the way they were supposed to. And that was where the bridge to nowhere stopped – and where we were left lingering at the edge, peering down 500 feet into a swirling maelstrom of uncertainty about what was really going on with our sites.

There were signs that things were wrong: mismatched data with our affiliates, for instance, and more mismatches from our ad networks. Then some of the content changed, and the reports just stopped making any sense. I was reminded of that commercial where the fatuous guy and his partner in the candy business are “building a website – by ourselves!”

It didn’t take long for the whole metrics structure to become creaky, unreliable, annoying, frightening, and finally useless. The project was scrapped and tears were shed.

It might have been avoided.

I met someone at a conference who told me that most successful companies don’t just “implement a tool.” They start with a repeatable success process: establish business goals, build a site designed to drive those goals, and then use a well-targeted, expertise-driven web analytics plan (using any of a number of tools) to see how close they came to meeting them.

“Well, we did that – sort of,” I said.

“Let me guess,” said my new friend. “You did a pretty good job of planning and setting up the tool.”

“Yes.”

“You planned for day one after launch. But you didn’t plan for day 150.”

“No,” I said. And it was true.

No Blueprint

We didn’t plan for new questions coming to us from business stakeholders. We didn’t expect to have to make “the tool” jump through flaming hoops to extract the small nuggets of data that would inform an important business decision. We didn’t expect that the site itself would change and require us to reestablish goals and principles as editorial made its changes – changes we didn’t bother measuring at all, because we had no template for how to do that. Nor did we have the in-house resources to spend on fully understanding how to bend and shape this tool into an insight engine. Worst of all, the interns we had originally put in place to “manage the tool” after launch had been redeployed to “tweet” for us (their misguided efforts landed us in court – but that is another story).

What we lacked most was a template for change. At a certain point I got the hang of the tool and was able to answer questions as they came up. But we still had the old-fashioned mud-fights about whose content had to change. Was it the content that fared worst? No, it wasn’t – its owners claimed it had been improperly measured (it hadn’t been, but without third-party expertise we had no way to claim objectivity). What ended up getting changed was the number of links on the home page, which grew by a factor of at least four, because every content curator – absent measurement to tell them whether they were a light-hitting journeyman infielder or an Albert Pujols of web content – insisted on getting their square of real estate.

And so the chaos continued. At times I thought the website itself would simply come crashing down around us in a simulacrum of sheer frustration. And we continued to pay licensing fees to the tool vendor.

Forget the Numbers

Things are better now.

The new CIO canceled everything to do with web analytics because, after studying what we had done with it, he concluded “none of it was actionable.” He won kudos for saving the company a ton of money. No license fees, no analytics experts, no time wasted on tagging, no reconciliation of numbers, none of this “trying to get it right.”

I took a walk out on the bridge the other day. It was swaying in the wind, and still only halfway done. On the other shore I thought I saw smoke signals, but through my binoculars I could see it was steam from hot tubs, where marketers relaxed knowing they had built their analytics around a plan, had imported some senior expertise, and had made the organization respond to the news about what was pulling people to conversion and what was driving them away.

The edge of our bridge was cold and windy. The water below swirled. Looking back, I could see folks were even dismantling what bridge we’d built – figuring on getting something for it in the scrap market.

We sure had saved money by stopping before it got too late!

But those hot tubs looked awfully comfortable across the bay. And the grass around them was greener than on the side I had to go back toward. For a minute, I thought I would try to swim over to the hot tubs. But the waters looked cold and I had to get back.

What sounded better than staying with the bridge to nowhere? Knowing I could get in my car and take the back road over to the hot tubs.

This column was originally published on Nov. 7, 2011 on ClickZ.
