Understanding Digital Customer Engagement on a Physical Level

When trying to understand customers’ interactions with the digital channel, many marketers and sales teams throw around the term “engagement.” They are typically measuring tactical data such as time on site, products viewed per session, and, at the most basic level, clicks per session. This poses an interesting challenge: visitors focused on finding a specific piece of information often rapidly click through multiple pages of a website without ever reading anything. They may have viewed many products, or spent a good amount of time on the site, clicking on many links in the process. These users appear engaged without actually being engaged. To overcome this obstacle, some sites have started to pay more attention to data points such as time on page and scroll tracking to better understand engagement. But even these data points don’t paint a clear enough picture of whether your content or functionality is interesting enough to keep a customer’s attention.

Tracking Physical Interaction With a Site Provides Better Insight Into Customer Engagement

For laptop and desktop users, tracking mouse movement and scrolling with heat maps adds a critical perspective to understanding physical interaction with a page. There is healthy debate over how closely mouse movement correlates with other signals of engagement, such as eye movement. But it is safe to say that someone who is actively moving her mouse around your page, hovering over critical elements, and scrolling down the page is actually looking at the page and interpreting its elements. She is actually engaged.
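To make the heat-map idea concrete, here is a minimal sketch of how mouse positions could be bucketed into a grid of cells, with each cell counting how often the cursor passed through it. The class and method names (`HeatMap`, `record`, `hottest`) and the cell size are illustrative assumptions, not part of any particular analytics product.

```typescript
// Sketch of heat-map bucketing for mouse positions (hypothetical names).
type Cell = { x: number; y: number };

class HeatMap {
  private counts = new Map<string, number>();

  // cellSize: side length of each square grid cell, in pixels.
  constructor(private cellSize: number) {}

  // Bucket a mouse coordinate into a grid cell and increment its count.
  record(px: number, py: number): void {
    const key = `${Math.floor(px / this.cellSize)},${Math.floor(py / this.cellSize)}`;
    this.counts.set(key, (this.counts.get(key) ?? 0) + 1);
  }

  // Return the grid cell with the most recorded movement, or null if empty.
  hottest(): Cell | null {
    let best: string | null = null;
    let bestCount = 0;
    for (const [key, count] of this.counts) {
      if (count > bestCount) {
        best = key;
        bestCount = count;
      }
    }
    if (best === null) return null;
    const [x, y] = best.split(",").map(Number);
    return { x, y };
  }
}
```

In a browser, a `mousemove` listener would feed the map, e.g. `document.addEventListener("mousemove", (e) => map.record(e.pageX, e.pageY))`; the accumulated counts can then be rendered as a heat-map overlay or sent to an analytics backend.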

Tracking physical engagement gets more exciting with smartphones and tablets, as the touchscreens offer a variety of ways to track visitors’ fingers on a web page. Tracking movements such as changes in device orientation, multi-finger pinches (zooming), and swiping fingers across the screen gives site owners a more accurate understanding of how engaged visitors are as they view and interact with page information. Mobile sites that track location multiple times per session can report on how much the visitor is in physical motion (e.g., walking, riding, etc.) while surfing a site. While web browsers don’t currently offer any ability to gather tracking data from the accelerometer or gyroscope, native mobile applications do allow the tracking of tilt, angle, direction, rotation, or vibration of the device. This physical interaction can then be sent back to an analytics tracking application such as Google Analytics and can show even greater context of how engaged a visitor is.
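As a sketch of what the swipe tracking above might involve, the function below classifies a gesture from the start and end coordinates of a touch. The function name and the distance threshold are assumptions for illustration; they are not taken from Google Analytics or any other tracking library.

```typescript
// Hypothetical swipe classifier: which direction did the finger move?
type Swipe = "left" | "right" | "up" | "down" | "none";

function classifySwipe(
  startX: number,
  startY: number,
  endX: number,
  endY: number,
  minDistance = 30 // pixels; shorter movements are treated as taps ("none")
): Swipe {
  const dx = endX - startX;
  const dy = endY - startY;
  if (Math.max(Math.abs(dx), Math.abs(dy)) < minDistance) return "none";
  // The dominant axis decides: horizontal vs. vertical swipe.
  if (Math.abs(dx) >= Math.abs(dy)) return dx > 0 ? "right" : "left";
  return dy > 0 ? "down" : "up";
}
```

In a browser, `touchstart` and `touchend` listeners would supply the coordinates, and the classified gesture could then be reported as a custom event to a tracker such as Google Analytics.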

The remainder of 2013 promises to be an amazing year for analysts wanting to track a customer’s physical interaction with the digital channel. Below are several revolutionary platforms that will redefine how customer engagement is tracked.

  • The much anticipated Google Glass devices started shipping in April. The glasses are already touting reality augmentation tools that can detect what the user is looking at (e.g., the Eiffel Tower) as well as more contextual information such as directions to the closest French restaurant. Developers building applications for the platform have reported that the devices can not only trigger actions (such as take a picture) when the user wearing the glasses winks, they can also tell the magnitude of the wink and the number of winks. We can imagine how restaurant owners would be interested in knowing how many people took pictures of the meal and then uploaded the photo to Facebook with a positive (or negative) comment.
  • May was an exciting month for gamers and marketers, as both Microsoft and Sony detailed many of the new features in their next-generation gaming consoles to be sold later this year. These consoles (as well as others announced by Google and Intel) will significantly improve web browsing on televisions in many important ways. Possibly the most revolutionary aspect of these platforms is the visual recognition systems. The new Xbox and Intel’s web TV product will integrate with the Kinect 2.0, and Sony’s PS4 will integrate with the new Sony PS Eye. These visual recognition systems will give users the ability to surf the web and interact with applications without a keyboard or mouse; instead, the systems will interpret commands from physical actions such as a tilt of the head or the opening and closing of a fist in the air. Intel has boisterously announced that its web TV console will allow advertisers to know the gender of active viewers as well as read the facial expressions of viewers.
  • And the pièce de résistance came last month when Samsung and Swisscom started a brilliant campaign for the Galaxy S4 by challenging prospective customers to a staring contest. That’s correct; the phone can tell when you are looking directly at it (and when you look away). Samsung also advertised the device’s ability to pause videos when viewers looked away (or fell asleep). One can only surmise how much television networks would love to know how many viewers fall asleep while watching a show (sorry, C-SPAN). Beyond the ocular recognition, Samsung also showed off Air Gesture, which enables users to interact with websites and applications by moving their hands in different directions near the screen of the phone without touching it, as well as Bump, which knows when the device is bumped into another similar device for file sharing.

With every significant advancement in tracking customers, there are privacy issues to review. Privacy advocates have already started to raise concerns about Google Glass’ ability to record videos of strangers without their knowledge, even before the devices have become available to the public market. People who were legitimately concerned about an advertiser’s ability to recommend products based on user behaviors will be equally horrified when they see an advertisement suggesting they try Zoloft because they have been looking melancholy for the last couple of weeks.

Why Are These Technologies Significant to Organizations Wanting to Know More About Their Customers?

In 2013, we measure customer engagement by the amount of time customers spend with content and whether they click on it. Essentially, we are tracking fingers clicking on buttons and the time in between clicks. In 2014, we can start to track a much higher level of engagement, including knowing whether visitors are reading, scanning, or ignoring content, and whether the content is making viewers fall asleep, cry, laugh, or even pout. We may have access to data points we never imagined, such as whether an individual is viewing the content alone or with a group of people. These new technologies bring the promise of the biggest improvement in tracking customer data since browsers first started supporting tracking beacons. While today the best technology for tracking complex digital behaviors is a digital cookie, next year we might be able to know if the customer is reading content while actually eating a cookie.
