Data’s Dandy. But Do a Reality Check, Too

I’ve written a lot in recent weeks about online metrics and what to watch for when you want to find meaningful measurements. Judging by the mail I’ve been getting, I’d say this is really a big issue for many of you, and I’ll keep coming back to it over the weeks to come.

But individual metrics cannot tell the whole story, and if we stop with Web metrics, we’ll miss a lot of the richness of what customer data has to tell us.

As marketers, we are especially fond of quantitative data because it is clear and definitive, making it easier to draw conclusions. So conversion rates, customer return rates, customer profitability, break-even rates, and acquisition costs are all appealing measures of what our marketing efforts are producing.

But numbers, as much as we love them, are never enough. Part of the task of drawing meaning from customer data requires that we go beyond the charts, graphs, segmentations, and customer profiles and actually communicate with real people.

Yes, even in this age of technology-driven mass marketing, talking to actual customers — and to those who chose not to buy — can be the best source of useful information. Not projectable information (that’s where large data files reign supreme), but information that allows our cold data to take on meaning and become useful.

Numbers are dangerous if we don’t do a reality check, testing the findings against real human behavior.

I’m reminded of the anecdote about a car manufacturer wishing to build the perfect vehicle. It wanted a medium-priced car that would be comfortable for as many people as possible. So the company studied the population of an entire country to determine average height, weight, leg length, hip width, arm reach, and waist size. It identified average finger size, chin length, head rotation — every measure it could think of that might have an impact on a driver’s comfort behind the wheel.

Having amassed mountains of data on every body measurement one could possibly want, the manufacturer built the perfect driver’s seat — and found that not one individual could be located whose body fit that particular combination of averages. The perfectly average human simply does not exist, and the seat that should have been comfortable for the masses was comfortable for no one.

The lesson to marketers is simple: The numbers are critical to understanding patterns and setting expectations, but the work isn’t done until those assumptions are tested against real individuals. Call or email a random sampling of repeat customers and a sampling of those who did not buy. Ask them about the experience. It is a humbling but powerful process when we find that buyers and prospects do not always react to our ministrations the way we expect them to. The expected outcome is rarely the actual outcome.

The online world offers many ways to test those assumptions. Run a permission survey through pop-ups, emails to recent correspondents, or old-fashioned paper mail in the shipping container after purchase. Or pick up the phone and talk to people. We may not love answering surveys, but most people still like to know that the businesses they are interacting with care about keeping them happy. If the phone conversation is customer-focused, you’ll be surprised how many folks will tell you what you need to do to improve their experience of your business.

No matter what else you are doing to gather and analyze customer data, everyone in the business — yes, everyone — ought to spend some time each month answering phones on the customer support lines and responding to customer complaint emails. Of course, every marketer will want to do so, but don’t stop there. Add the Web designers, technologists, merchandisers, folks who set billing and collection procedures, and absolutely every executive who has a say in policy.

The marketers among us should use real customer feedback to do a reality check on the conclusions we draw from the data. A year or two ago, the online press was shouting about a study, now forgotten, that used the number of abandoned shopping carts to “prove” that online shoppers had security fears. Many of us in the industry had a good laugh, knowing that just as many shopping trips were abandoned before purchase because we were forced to register each time, or the shopping cart was poorly designed, or the store did not have exactly what we’d been looking for. A few customer conversations would have cleared up that faulty assumption right away, and perhaps a few more e-tailers would still be open for business today.
