Having attended 15 eMetrics Marketing Optimization Summits on both sides of the Atlantic over the last six years, I’ve had the chance to see how the event has grown and developed. While some fundamentals remain the same (such as how to get analytics embedded into an organization), the debate has moved on. During three days at the San Jose, CA, event last week, I attended sessions covering mobile analytics, social media measurement, voice-of-the-customer programs, marketing mix modeling, and data integration.
One of the most interesting sessions featured a presentation by Joe Megibow, VP of global analytics and optimization at Expedia. He outlined how the company has invested huge amounts of time and effort in integrating its various sources of customer experience feedback, specifically from OpinionLab and Tealeaf. These tools give Expedia the ability to link comments left by users on the site with their actual sessions, so the company can see a comment and then replay the user’s session to understand more fully the experience that led to it. The integration is interesting, but what’s impressive is the detailed analysis that went into understanding and solving very specific customer experience issues.
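The details of Expedia's OpinionLab/Tealeaf integration are proprietary, but the underlying idea — joining feedback comments to the session records they came from via a shared session identifier — can be sketched in a few lines. All names, fields, and data here are illustrative assumptions, not the vendors' actual schemas:

```python
# Hypothetical sketch: linking on-site feedback comments to session
# records by a shared session ID, so an analyst can replay the page
# path that led to each comment. Field names are made up for
# illustration; no vendor API is being modeled here.

feedback = [
    {"session_id": "s-1001", "comment": "Coupon code was rejected at checkout"},
    {"session_id": "s-1003", "comment": "Page froze on the payment step"},
]

sessions = {
    "s-1001": ["home", "search", "checkout", "error:coupon"],
    "s-1002": ["home", "search"],
    "s-1003": ["home", "flight-details", "payment", "error:timeout"],
}

def link_feedback_to_sessions(feedback, sessions):
    """Attach the recorded page sequence to each comment."""
    linked = []
    for item in feedback:
        pages = sessions.get(item["session_id"], [])
        linked.append({**item, "pages": pages})
    return linked

for row in link_feedback_to_sessions(feedback, sessions):
    print(row["session_id"], "|", row["comment"], "| path:", row["pages"])
```

The value of the join is that the comment stops being an isolated complaint and becomes a pointer into observed behavior, which is what makes the kind of root-cause analysis Megibow described possible.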
Megibow washed some of his dirty laundry in public by explaining how he and his team had uncovered problems with the site through the voice of the customer, the impact those problems were having on the customer experience, and the steps they took to solve them. These were very small, specific problems affecting a relatively small number of people. But when you added them up, they were having a significant impact on the user experience. In fact, Megibow said that since they had embarked on this systematic program, conversion rates had been steadily improving. But the major success had been “winning the cultural shift of listening to customers, institutionalizing analytics in the business, and executing against the outcomes.”
Putting customers at the center of the company’s strategy was a theme also taken up by Greg Dowling, head of analysis at Nokia. He took us through the challenges of developing and implementing a global measurement strategy for a business that’s looking to develop consumer data as a strategic asset. Challenges that had to be addressed included the lack of a common language around metrics, the fragmentation and quality of the data, the lack of competencies in certain areas, and the fact that data and insights weren’t part of business processes. Anybody who works in a large global organization will probably recognize one or more of these challenges. Over a two-year period, Dowling and his team worked to address them, and now, in his words, “behavioral data is at the heart of our relationship with customers.”
A tough part of implementing the Nokia measurement strategy had been getting the mobile analytics strategy sorted. This was taken up in more detail in another session by Dowling and Gary Angel, president of Web analytics consultancy Semphonic. While I touched on the issues around mobile analytics in my last column, this session really showed how tough it is at the moment to get decent data on user behavior on mobile devices, let alone integrate that data with the same users’ behavior on the fixed Web. All the issues that impacted our ability to measure the fixed Web 10 or so years ago now plague the mobile Web, such as data collection methodologies and browser standards. There are challenges around visitor identification and measuring mobile application use. However, by 2012 it is expected that more mobile phones than PCs will be accessing the Internet worldwide. Clearly, it’s an area that organizations must address. And given the evidence I saw in San Jose, the sooner they start to think about their mobile measurement strategy, the better.
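To make the visitor-identification challenge concrete: one common workaround when mobile browsers don't reliably persist cookies is to fall back on a fingerprint built from whatever request attributes are available. The sketch below is a rough heuristic of my own, not any vendor's actual method, and it also shows why the workaround is imprecise:

```python
# Hypothetical sketch of cookieless visitor identification: prefer a
# persistent cookie ID; otherwise hash IP address and User-Agent into
# a fingerprint. Distinct users behind the same IP with the same
# browser collapse into one "visitor" -- the core weakness of this
# approach on the mobile Web, where gateway proxies are common.

import hashlib

def visitor_key(request):
    """Return a stable-ish key for counting unique visitors."""
    if request.get("cookie_id"):
        return "cookie:" + request["cookie_id"]
    raw = request.get("ip", "") + "|" + request.get("user_agent", "")
    return "fp:" + hashlib.sha1(raw.encode()).hexdigest()[:12]

requests = [
    {"cookie_id": "abc123", "ip": "10.0.0.1", "user_agent": "MobileBrowser/1.0"},
    {"cookie_id": None, "ip": "10.0.0.2", "user_agent": "WAPBrowser/2.0"},
    {"cookie_id": None, "ip": "10.0.0.2", "user_agent": "WAPBrowser/2.0"},
]

keys = [visitor_key(r) for r in requests]
# The two cookieless requests produce the same fingerprint, so two
# potentially different people would be counted as one visitor.
```

This kind of undercounting (and, conversely, overcounting when a carrier gateway rotates IPs mid-session) is exactly why mobile unique-visitor numbers were so much shakier than their fixed-Web equivalents.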
Having said that, many organizations still have a way to go in sorting out their fixed Web measurement strategy. WebTrends’ contribution to the debate was the release and publication of its Digital Marketing Maturity Model (DM3). This model provides a framework against which organizations can assess the maturity of their measurement capabilities along a number of different dimensions. I haven’t yet had the chance to look at it in detail, so it’s something for next time. Till then…