I’ve been writing a lot lately about how data is driving the new marketing Four Ps of people, process, purpose, and platform. We’re gathering mounds of data these days. Yet, it’s true that our ability to collect the data far exceeds our ability to analyze and socialize it. And it will get more difficult until we fundamentally have the data-driven people who can tackle the basic issues of understanding it and making it relevant. The newest technology that I present below will make it even more critical that we address these issues now.
Since my last column I’ve experienced two big data moments: the first with augmented reality (AR), and the second with how something as pervasive as your mobile calendar becomes an artificial intelligence (AI)-based data application.
For one of my clients, a major telecommunications company, I attended the Augmented World Expo in Santa Clara two weeks ago. I’m working with them to apply metrics to AR. For those who don’t know, AR technology uses computer vision and object recognition to interpret the surrounding real world, letting the user interact with, and digitally manipulate, the view through the mobile phone’s camera lens. The technology’s been around for a few years, but this year it seems to have crossed the inflection point of ecosystem and user adoption.
Google Glass is a good example of an AR application (note my picture).
My aha moment: AR applications are able to collect a tremendous amount of data from the real world. They can “see” – view objects of interest, things like a building, a cereal box (its size and shape), a logo, toys, games, furniture and rooms, and autos. They can also “record” user-generated and behavior-based engagement metrics like duration and depth.
For an analytics guy like me, this is the Holy Grail, as I can now collect and analyze zettabytes of data about user engagement metrics within our very personal mobile devices. They can also collect what I call qualitative metrics. I can see what the user sees, I can feel her intent, and I can record camera motion – direction and angles, object engagement, colors, facial features – and marry that information with basic conversion information like share buttons, couponing, or with loyalty programs. I can test and optimize.
Now, let’s marry this to profile data about one’s preferences. Interesting, huh? Some say scary. We need to apply the new Four Ps.
In essence, the real world, like a page, becomes a trackable series of events where we can now deploy our digital analytics tools. Truly, analytics for the real world. In the last two weeks I’ve spoken with many Fortune 100 companies all working on AR products across sectors – automotive, hotel, entertainment, and financial services. Early research suggests unbelievably good conversion results. This topic will be a continuing theme of mine in this column.
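To make the idea concrete, here is a minimal sketch of what "real-world analytics" events might look like. The event names, fields, and metrics below are hypothetical illustrations, not any vendor's actual schema: each AR "sighting" of a recognized object is logged like a page view, then aggregated into the engagement metrics described above (views, dwell time, conversions).

```python
from dataclasses import dataclass
from collections import defaultdict

@dataclass
class AREvent:
    """One hypothetical AR 'sighting': an object the camera recognized."""
    user_id: str
    object_label: str    # e.g. "cereal_box", "logo", "storefront"
    duration_sec: float  # how long the object stayed in view (dwell)
    converted: bool = False  # e.g. user tapped a coupon or share button

def summarize(events):
    """Aggregate per-object engagement: views, total dwell time, conversions."""
    stats = defaultdict(lambda: {"views": 0, "dwell_sec": 0.0, "conversions": 0})
    for e in events:
        s = stats[e.object_label]
        s["views"] += 1
        s["dwell_sec"] += e.duration_sec
        s["conversions"] += int(e.converted)
    return dict(stats)

# Example: three sightings across two users
events = [
    AREvent("u1", "cereal_box", 4.2, converted=True),
    AREvent("u2", "cereal_box", 1.8),
    AREvent("u1", "logo", 0.9),
]
report = summarize(events)
```

Structurally this is no different from web analytics: swap the page URL for a recognized physical object, and the same collect-aggregate-optimize loop applies.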
My second big data confirmation moment came at the announcement of Tempo (smart calendar application) closing a large Series A funding round last week.
Tempo was incubated at SRI, the lab that developed Siri, and is built on AI. To learn, AI needs big data. In just the few months since its launch, Tempo has analyzed terabytes of calendar data, which it was built to understand and learn from. It is now one of the top iPhone productivity applications (after only four months), and I’m proud to have worked as a consultant to the company.
According to Tempo CEO Raj Singh, “We enhance massive amounts of data, and the calendar serves as the perfect framework because it has real context. It knows you, what you’re doing, and where you’re going. This is the future of ‘big data’ engineering – to push the AI down into the individual bits of your calendar.”
Tempo is an app that will understand who you are. Tempo is mining individual bits of data and connecting dots that otherwise would have been trapped. Over time, it will utilize this contextual data to do more than just build a smarter calendar. The data will actually anticipate your needs, making plane or restaurant reservations all based on personal contextual data.
And based on this personal data, Tempo’s goal is to evolve beyond a calendar into a full-bore virtual assistant. It wants to create a pervasive assistant that anticipates your actions by processing reams of public and private user data. Tempo wants to be able to suggest when you should leave for that meeting, based on real-time traffic conditions, along with your route and where you should meet and eat. It might even suggest what you should talk about and the actions you need to take.
What both of these applications have in common is using data and personalization to drive new contextual and anticipatory experiences. Data-driven AR-based apps will share and enhance what you see. Data-driven AI-based apps will help define what you do, when, how, and why you do it.
This is surely a brave new world. Let’s embrace it rather than hide from it. To do this we need to think out-of-the box. Let’s understand our new set of Four Ps where we empower these new capabilities with data-driven people, who can define data-driven process, for our new platforms, and who can most importantly define data-driven purpose. It’s only going to get more challenging and the time is now!