It happens all the time. You visit a Web site, hoping to tackle an important task. The home page is filled with links, buttons, pretty pictures, and calls to action. Try as you might, you can’t find the information you’re looking for, or a clear path to get there.
Why is that, especially when what you need to do seems so obvious? It’s probably because whoever designed the site didn’t stop to think about how you — the user — would experience it.
Savvy agencies have entire departments devoted to the discipline of User Experience (UX). UX is the full set of physical, psychological, and emotional interactions that a person engages in while using a product or system.
With the Web, the physical experience is limited to sights, sounds, and the moving and clicking of a mouse. The mental and emotional interactions generally occur in the users’ perception and processing of the content or information.
More broadly, UX can encompass the entire relationship a business has with its customers, from advertisements to stores to packaging to telephone customer service and beyond.
A subset of User Experience, UX design, is the practice of intentionally creating user experiences for a purpose such as selling a product, solving a problem, or completing a task. It occurs in four stages: research, strategy, information architecture, and interaction design.
Done well, UX design adds up to an intuitive and natural-flowing Web site. It may even feel one step ahead of you for the simple reason that its design relied heavily on research and user data, a.k.a. analytics, to know what you want and why you’re visiting.
I asked Aaron Louie, our associate director of user experience, a few questions about the role of analytics in user experience — how they feed off of and inform each other on the way to building a Web site that works.
Shane Atchison: Typically, how do analytics and UX work together?
Aaron Louie: Both are subservient to higher-level goals. In performance marketing, what drives both analytics and user experience are the business goals and user goals. We ask the fundamental questions: “Why does the site exist?” “What do you want users to do?” and so on. The answers to these questions determine what we design and how we measure the performance of that design.
SA: Describe the typical interplay between your team and our agency’s analytics team.
AL: During discovery, we review the baseline analytics to look for potential problem areas. We then collaborate with the analytics team to conduct the goals analysis, connecting high-level user and business goals to measurable user behaviors. During design, we collaborate with the optimization team to identify and generate design variants for A/B and multivariate testing. And then post-launch, we supplement analytics data with user surveys and usability testing, providing the “why” for the “what.” Then we repeat steps one through four.
Recently, for a site redesign, I was asked to improve the user experience of a lead-generation flow starting with the home page. I asked for an analytics report on the performance of the home page. The report showed that most users either bounced immediately or logged into the “my account” section. Based on the analytics, I knew where to look for the problem. This integration of quantitative data and qualitative research, the logical and the intuitive, is what yields real insight into the user experience.
SA: Are there any instances where UX would disregard analytics?
AL: Hell no! We user experience designers are obsessed with doing things right. Given infinite funds and time, we’d spend all our time researching, testing, and measuring everything, just to make sure we understand the true situation. If the numbers tell us that something’s wrong with our design strategy, we will stop at nothing to discover the real reason for what we’re seeing in the analytics. And fix it.
SA: How does user experience change over the life of a project, as additional analytics data is taken into account?
AL: Not as much as we’d like. Many projects are set up either to accomplish a major redesign or to make small improvements to single pages. In both cases, there’s generally “one true design” that the client approves at the end of the project. Ideally, we’d like to move to an evolutionary process of design, where we optimize in a series of iterations, all supplemented with and informed by analytics. We’re calling this evolutionary process “performance-driven design.” Keep an eye out for some blog posts and, hopefully, a book on the topic.
We’ve actually begun this process of integration by creating a hybrid role — the “online test designer/user experience architect” — that blends optimization and user experience. Someday we’ll figure out a job title that isn’t such a mouthful.
SA: Can you give us a real-life metaphor for how analytics and UX would work together?
AL: Let’s go with the metaphor of a traveling carnival. The experience extends far beyond the gates. Every ride, cotton candy stand, and souvenir shop exists to support the overall guest experience. Even the billboards on the highway, the view of the Ferris wheel from afar, and the sad clowns who wave goodbye from the circus train as they leave town are all part of that experience.
Who is coming to the carnival and from where? What expectations will they bring with them? How much money do they have to spend? For that matter, how does the carnival even make money? We need to know the possible revenue of every point of sale in the park and how to maximize that revenue. The caramel corn vendors need to be placed where the delicious scent of their wares will waft toward the crowds waiting in line. The games need to be tuned and placed to meet revenue targets — challenging enough to promote repeat play without being too frustrating. High-priority, high-revenue rides need to be located in high-visibility areas.
We take these analytics as a starting point, but we then turn our attention to more qualitative measurements like user satisfaction, ease of use, total experience flow, orientation, and more. Once visitors work up an appetite, do they know how to find the food? Can they find the bathrooms from the exit of a nauseating ride? Are the scary rides for adults sufficiently separated from the gentle rides for toddlers?
We can measure success against all these factors once we know what we want to accomplish. And, of course, the real test of our carnival will be in the tiny details. Is the clown makeup too creepy? Does the mural on the side of the fun house entice people of the right age? Are the corn dogs just the right balance of crispy and chewy? Analytics would tell us whether they are being purchased, but we’d need to make purely aesthetic decisions to choose the right variations to test.
This is where our metaphor breaks down (cue the sad clown). In digital, interactive spaces, multiple versions of the carnival can occur simultaneously. The entire carnival can evolve within a single visit. And it can be restructured in real time to meet the needs of different visitors. The secret is in adopting a highly collaborative, iterative, performance-driven design approach.