When we marketing people first started looking into web data back in the mid-1990s, we were inventing an industry. We made up words (click-through, pageview, bounce rate). We made up tools (Sawmill, Webtrends, NetGenesis). We made up processes.
One of the things the data-mining community was working on at the time was the Cross-Industry Standard Process for Data Mining (CRISP-DM). While the project never caught on like wildfire, its timing and basic premise make for a useful methodology for performing analysis and offer some solace for those of us who are laden with data and overwhelmed with business questions.
That makes it worth a quick review.
CRISP-DM divvies up data mining into six phases:
- Business understanding. What problem are you solving for?
- Data understanding. What do you have to work with?
- Data preparation. Choose and validate which data you’ll use.
- Modeling. Create a conceptual model and draw conclusions.
- Evaluation. Test how well the model holds up against data.
- Deployment. Implement the best models for making business decisions.
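The six phases above can be sketched as an iterative loop, since CRISP-DM explicitly feeds evaluation results back into business understanding. This is only an illustrative sketch; the phase functions and their return values are hypothetical placeholders, not part of the CRISP-DM specification.

```python
# A minimal, hypothetical sketch of the six CRISP-DM phases as a loop.
# Every function body here is a stand-in for real organizational work.

def business_understanding():
    """What problem are you solving for?"""
    return {"objective": "reduce churn", "success_criteria": "churn < 5%"}

def data_understanding(objectives):
    """What do you have to work with?"""
    return ["web_logs", "crm_records", "survey_results"]

def data_preparation(sources):
    """Choose and validate which data you'll use."""
    return [s for s in sources if s != "survey_results"]  # e.g., drop unusable data

def modeling(data):
    """Create a conceptual model and draw conclusions."""
    return {"model": "logistic_regression", "trained_on": data}

def evaluation(model, objectives):
    """Test how well the model holds up against the success criteria."""
    return True  # in practice: compare model metrics to objectives["success_criteria"]

def deployment(model):
    """Implement the best model for making business decisions."""
    return f"deployed {model['model']}"

def crisp_dm():
    objectives = business_understanding()
    while True:
        sources = data_understanding(objectives)
        data = data_preparation(sources)
        model = modeling(data)
        if evaluation(model, objectives):
            return deployment(model)
        # Model failed evaluation: revisit the business problem and iterate.
        objectives = business_understanding()

print(crisp_dm())
```

The point of the loop is the feedback edge: a model that fails evaluation sends you back to the business question, not just back to the data.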
It seems brutally simple as a list, but think about your last project. Did you miss a step? Maybe skimp on some aspect?
Perhaps the trickiest piece of all is at the very start. Do you really know what the business problem is?
The business understanding step requires a cultural ability to collaboratively determine business objectives. It requires the proper background; clear, agreed-upon business objectives; and even clearer, more broadly agreed-to business success criteria. This is usually a tricky political process and one that is often neglected.
You must properly enumerate the available resources, agree to specific requirements, identify areas of deficiency, and communally concur on risks and contingencies. Just getting the terminology straight can be a task that requires weeks of meetings and innumerable emails.
Once the costs and benefits are ironed out, specific data-mining goals have to be acknowledged and specific success criteria must be signed off.
Oh, and one more thing: failure criteria. When will you know the project is a failure, and who has the right to pull the plug?
Only after all of this is in place can you create a project plan with specific sponsors recognized and specific outputs delineated. Then you can break it all down into explicit sub-tasks by specific team members.
If this sounds like a lot of work, it is. If this sounds like too much work, then your corporate culture may not allow for a rigorous project, and it may be time to reassess the likelihood of any analytics project getting traction.
Remember, you still have to work out how the data will be collected, validated, catalogued, cleansed, attributed, integrated, formatted, modelled, tested, evaluated, deployed, and applied to the business.
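A few of the data-handling steps listed above (validation, cleansing, formatting) can be made concrete with a toy sketch. The record layout and field names here are entirely hypothetical, chosen only to illustrate the kind of work each step involves.

```python
# Hypothetical raw web-data records: one duplicate, one invalid.
raw = [
    {"visitor": "a1", "pageviews": "3"},
    {"visitor": "a1", "pageviews": "3"},   # duplicate to be cleansed
    {"visitor": "b2", "pageviews": None},  # invalid record to be rejected
]

def validate(record):
    """Validation: reject records missing required fields."""
    return record["pageviews"] is not None

def cleanse(records):
    """Cleansing: drop exact duplicates while preserving order."""
    seen, out = set(), []
    for r in records:
        key = (r["visitor"], r["pageviews"])
        if key not in seen:
            seen.add(key)
            out.append(r)
    return out

def format_record(record):
    """Formatting: coerce string counts into integers."""
    return {"visitor": record["visitor"], "pageviews": int(record["pageviews"])}

prepared = [format_record(r) for r in cleanse(raw) if validate(r)]
print(prepared)
```

Each one-line function here stands in for what is, in practice, weeks of cataloguing, attribution, and integration work.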
What? You thought this was going to be a piece of cake?
To help you along the way, there is a visual guide to CRISP-DM and a Decision Management Solutions’ Eclipse Process Framework version (download) of the CRISP-DM methodology, which includes business rules and the integration of analytics and rules. The Eclipse Process Framework is an open-source tool for managing methodologies, designed both to let methodology developers share them and to let companies customize them.
CRISP-DM may be a tough row to hoe, but at least you won’t have to make it up as you go along.
This column was originally published on August 2, 2012.