“If I Tell You the Answer, What Will You Do With It?”

An exchange with an old acquaintance of mine in the industry the other day reminded me of this common conundrum. In a comment on a post he had noted that “[there is]…still a surprising amount of fog around (1) what’s the question we are trying to answer and (2) when we have the answer, what will we do about it?” I couldn’t agree with him more, and an experience of my own a few years back taught me a slightly bitter lesson about making sure you understand what someone is going to do with the results.

My analytics consulting business had been retained to develop a new segmentation approach for the online business of a multichannel retailer. They weren’t happy with the one they were using and wanted a “better” one to improve their email marketing efforts. The existing segmentation was based on various rules around order frequency, average order value (AOV), and lifetime value. The view was that the existing segmentation was too blunt and they wanted something with more precision so they could target their offers more effectively.

Up to Our Armpits in Data

We set to work and decided to develop a segmentation approach based on product purchasing, focusing on what people were actually buying rather than just how often and how much they were buying. Also, rather than use a simplistic rules-based approach, we used data mining algorithms to identify common patterns of browsing and purchasing behavior to create the different segments. This style of segmentation is a very iterative process: running the algorithms, interpreting the results, and deciding whether they make any sense or not. We were up to our armpits in data, and after a bit of huffing and puffing we arrived at a segmentation structure that we felt was more distinctive and more precise than the one they had been using previously. “Job done,” we thought as we went off to deliver the results.
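The post doesn’t name the specific algorithms we used, but for readers curious about what this style of purchase-based segmentation looks like in practice, here is a minimal, purely illustrative sketch. It assumes a k-means-style clustering over each customer’s share of spend across product categories; the category names, customer profiles, and data are all invented for illustration.

```python
import random
from collections import defaultdict

random.seed(42)

# Hypothetical product categories; each customer is represented as a
# vector of spend shares across these categories.
CATEGORIES = ["apparel", "home", "electronics", "toys"]


def make_customers(n):
    """Generate synthetic customers drawn from three rough buying profiles."""
    profiles = [
        [0.7, 0.1, 0.1, 0.1],  # mostly apparel buyers
        [0.1, 0.6, 0.2, 0.1],  # home-goods focused
        [0.1, 0.1, 0.6, 0.2],  # electronics focused
    ]
    customers = []
    for _ in range(n):
        base = random.choice(profiles)
        vec = [max(0.0, x + random.gauss(0, 0.05)) for x in base]
        total = sum(vec)
        customers.append([x / total for x in vec])
    return customers


def kmeans(points, k, iters=20):
    """Plain k-means: assign points to the nearest centroid, then recompute."""
    centroids = random.sample(points, k)
    dim = len(points[0])
    for _ in range(iters):
        clusters = defaultdict(list)
        for p in points:
            nearest = min(
                range(k),
                key=lambda c: sum((a - b) ** 2 for a, b in zip(p, centroids[c])),
            )
            clusters[nearest].append(p)
        for i in range(k):
            if clusters[i]:
                centroids[i] = [
                    sum(p[d] for p in clusters[i]) / len(clusters[i])
                    for d in range(dim)
                ]
    return centroids, clusters


customers = make_customers(300)
centroids, clusters = kmeans(customers, k=3)
for i, members in sorted(clusters.items()):
    top = CATEGORIES[max(range(len(CATEGORIES)), key=lambda d: centroids[i][d])]
    print(f"segment {i}: {len(members)} customers, dominant category: {top}")
```

The iterative loop mirrors the process described above: run the algorithm, look at what each segment’s dominant behavior is, judge whether the segments make business sense, and adjust the inputs or the number of segments accordingly.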

We dutifully presented the results of the project, explaining the approach we had taken and describing the different segments that we had uncovered to a mixed audience from the retailer including researchers, analysts, and marketers. As we proceeded, it became clear that there was some sense of division and uneasiness in the room. My antennae were detecting that all was not well with some members of the audience and in particular the email marketers.

To cut a long(ish) story short, the problem was this: our segmentation approach had nearly twice as many segments as the previous one. That was one of the results of the increased precision inherent in the approach we had used; our proposed segmentation structure was more fine-grained in order to enable more targeted email marketing. The challenge was that the existing email marketing resources and technology were insufficient to leverage the increased number of segments: the team had neither the time, the resources, nor the systems capability to produce the required increase in the number of email campaigns. We’d hit a brick wall, and the new segmentation was never deployed operationally.

A Lesson Learned

This project taught me a valuable lesson. Technically, we had delivered on the brief that we had been given by the researchers, but the episode left a nasty taste in my mouth, as I’ve always believed that the value of analytics is best judged by the business changes it can effect. In fact, our analytics consulting business was called Applied Insights for that very reason! In this case, we had overestimated the retailer’s ability to execute on the insights that had been developed. There had been a disconnect between the analytical capability and the operational capability.

On reflection, the problem stemmed from the brief that we had been given. The researchers had asked us to develop a “better segmentation,” and on many levels we did just that. Had the brief been to “help create more effective email marketing campaigns,” we would probably have done something different: we would have had to take the retailer’s ability to execute into account and design the segmentation approach accordingly. In retrospect, we should have made sure that we understood the objectives and business challenges more clearly, rather than approaching the work mainly as an analytical challenge. It’s something that shaped the way we engaged in the future.

As I said, a bitter lesson learned, and a case study I’ve used many times in workshops on analytics as an example of how analysts have to get beyond the numbers. They need to understand what it is people are actually trying to do, not just what they say they want to know, and then frame the analysis accordingly. There should always be a purpose to the question. If you’re an analyst, don’t be afraid to ask what it is. And if you’re the one asking the question, please make sure you know what you will do with the answer!
