Over the past five years or so I’ve worked on the deployment of many dozens of analytics implementations. Many have been low-stress and on time; others not so much. Recently I’ve been thinking a lot about what made the good ones go smoothly, and I’ve boiled it down to five tips that I think are universal. If every organization just kept these in mind, I think there would be a lot less confusion and aggravation both during and after analytics deployment.
1: Identify SPOCs
In addition to the helpfully rational perspective Vulcan team members can provide, choosing SPOCs (Single Points of Contact) goes a really long way toward making sure your deployment will get finished on time. For an analytics deployment, you’ll need at least two: one for the marketing stakeholders, and one for the development stakeholders.
These people don’t necessarily need to be subject matter experts on analytics (that’s what the pros like me are for!) — they just need to be the ones responsible for communicating, following up, and committing to timelines.
In my experience, it’s usually better to choose SPOCs from within your existing marketing and development teams rather than bringing on an external project manager: an embedded team member has much better visibility into the other projects in flight, their dependencies, and the organizational and technical limitations that may affect your deployment timeline.
2: Line Up Development Resources
I’m not sure how this happens, but sometimes I’ll be brought in to a deployment to which the marketing team is very committed — they have budget, executive buy-in, and a clear vision — but nobody seems to have told the developers that they’re going to be needed. Even if the tool is simple, even if you’re using a tag manager, 100 times out of 100 you are still going to need to invest some development resources to deploy an analytics tool successfully.
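To make that concrete: even behind a tag manager, a developer has to expose the data the container can see. Here is a minimal sketch of that kind of work; the `dataLayer` name follows the common tag-manager convention, but the event shape and the `pushPageContext` helper are hypothetical, not from any particular vendor's API.

```typescript
// Even with a tag manager in place, developers must push page data
// that the marketing team cannot reach from the container UI alone.
// The event shape below is illustrative, not a vendor specification.

type DataLayerEvent = Record<string, unknown>;

// In a browser this would be `window.dataLayer`; modeled here as a
// plain array so the sketch is self-contained and runnable.
const dataLayer: DataLayerEvent[] = [];

// Only application code knows values like the page template or the
// logged-in user's plan, so a developer has to push them explicitly.
function pushPageContext(context: { pageType: string; userPlan: string }): void {
  dataLayer.push({ event: "page_context", ...context });
}

pushPageContext({ pageType: "checkout", userPlan: "premium" });
console.log(dataLayer.length); // 1 event queued for the tag manager
```

The tag manager handles firing rules and vendor tags, but none of it works until a developer writes pushes like this, which is exactly the resourcing gap described above.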
A related (and fortunately easily solved) issue is getting the development resources at the right time, for the right amount of time. Developers usually aren’t needed until a few weeks after the project launches, so sometimes they’ll attend the kickoff and then drift away because they have no immediate action items. Clear communication at the start of the project is the obvious solution here.
Similarly, if you’re using a modern, complex tool, it’s important to set the expectation that the code may need to be tweaked after its initial release. Sometimes developers I’ve worked with have seemed surprised that our quality assurance process uncovered bugs they need to resolve…which kind of makes me wonder what they thought we were doing QA for in the first place!
3: Use a Phased Approach
The teams behind modern websites may want to track well over 100 custom dimensions and metrics. Tackling the deployment all at once can be a Herculean effort that overwhelms developers and creates challenges for quality assurance stakeholders as well. All too often, the official deployment ends up with big chunks missing, leading to a loss of confidence in the data, which can be fatal for stakeholder buy-in and makes analysis a lot harder down the road.
The best deployments I’ve done have broken the requirements up into smaller chunks — for instance, phase one might just be the global tagging and “every page” customizations, while phase two includes forms, internal search, and videos, or something like that. Another key advantage to this is that it lets you gracefully handle scope creep: if marketers keep coming up with new things they want to track, you can schedule them into later phases and stick to your current deadlines.
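As a rough sketch of what such a phased plan can look like in practice (the phase contents mirror the examples above, but the structure, names, and dates are hypothetical):

```typescript
// Illustrative phased tracking plan. The requirement labels echo the
// examples in the post; the dates and field names are made up.

interface Phase {
  name: string;
  requirements: string[];
  deadline: string; // ISO date, illustrative only
}

const plan: Phase[] = [
  {
    name: "Phase 1",
    requirements: ["global page tags", "every-page custom dimensions"],
    deadline: "2024-03-01",
  },
  {
    name: "Phase 2",
    requirements: ["form tracking", "internal search", "video events"],
    deadline: "2024-04-15",
  },
];

// Scope creep goes into the last phase (or a new one appended later),
// so the deadlines for phases already in flight stay fixed.
function addLateRequest(plan: Phase[], requirement: string): void {
  plan[plan.length - 1].requirements.push(requirement);
}

addLateRequest(plan, "scroll-depth tracking");
```

Even a lightweight shared document with this shape gives everyone the same answer to "when is my tag shipping?", which is most of the battle.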
4: Don’t Let the Best Be the Enemy of the Good
This is kind of a reiteration of my last point, but it bears repeating. Some deployments I’ve worked on keep getting pushed back or even taken back to the drawing board because they’re not absolutely perfect. I’ve seen organizations decide to wait to deploy analytics until their site relaunch, which then gets pushed back month after month; I’ve seen marketers refuse to sign off on a deployment because 0.00006 percent of the data looks wrong to them (yes, that actually happened).
Web analytics, as we like to say, is about trends, not exact numbers. The limitations of our technology and the unpredictability of human behavior mean we need to learn to be comfortable with some degree of ambiguity in our data. And if you want to establish trends, the sooner you get started the better.
5: Socialize It!
Hey, you just invested a lot of money and energy into getting all of this data about your customers! Let the rest of your organization know about it! Give people access, hold training sessions, start an internal wiki. Does that mean people are going to come up with a bunch of weird questions? Of course. Will you be overwhelmed with help desk requests? Maybe. (I might write another post soon about ways to deal with that — stay tuned.) But on balance, I think that developing a data-driven culture of empowered individuals leads to better business decisions than keeping your analysts squirreled away like some high priests of statistics.
So those are some of the things I’ve learned working in this field, and I hope reading this can help your team get through deployment and into the fun data analysis stuff (yeah, I do consider that fun, thanks). If you’ve got any other tips, feel free to share them in the comments.