What did we learn from the 2016 US election? Not what you think
As the United States makes way for a new resident in the White House, I’ve been thinking about the election that led up to it. Others have pontificated about the impact email had on the presidential campaigns, but I’m not buying any of it.
What can we email marketers take away from a painful year-plus of electioneering? Personally, I learned a lot more than I expected about using rules and filters in my inbox and about not giving away my real email address.
But, from a wider industry vantage point, I’ve learned nothing. No experts can speak with authority because no one from the campaigns has told us anything. We have no insight into what worked or didn’t. So, it’s all guesswork.
Maybe that’s not the most profound statement you’ve heard from me, but it’s critical to remember when we study our own programs to see where we can improve.
You might see a campaign another brand sent – maybe from a competitor, maybe from a brand you buy from yourself – that looks either impressive or stupid. You’ll either rush to imitate it or sneer at the sheer idiocy of the person who approved it. But, in either case, you could be setting yourself up for a humbling experience.
After years as an email strategist, I’ve learned one thing: You don’t know how well an email program or a campaign is working unless you can go behind the scenes to see how well it performed.
2012: What Obama did wrong. Or not.
In 2012, I was convinced that Barack Obama’s team was making a huge mistake with his email campaigns. They looked like emails from 20 years ago – lots of hyperlinked body copy, terrible design, overwhelming cadence. In other words, everything that we tell clients not to do.
In my arrogance, I assumed the Obama campaign was doing everything wrong, and I knew how to fix it. But I was the one who got it wrong.
I was looking on from the outside, listening to other bystanders complain about how much they hated the messages, speculate that recipients were opting out at epic rates, and conclude that the campaigns weren’t raising money. Clearly, Obama needed one of us email experts to take the reins away from the amateurs.
Then I went to a MediaPost Email Insiders conference shortly after the election. There, up on stage giving the keynote, was Toby Fallsgraff, the campaign’s email director. He spoke in detail about email’s impact on the campaign, how extensively his team tested everything and how much money the emails delivered.
That speech put me in my place as a strategist and marketer. It was a humbling experience, and it reminded me of something I had forgotten over the years: Just because you don’t think it works, that doesn’t mean it doesn’t work.
Since Nov. 8, many email pundits have talked about lessons – if any – for email from the Clinton and Trump campaigns. But I will repeat what I said previously: Until someone from either campaign comes out with the details, none of us really know anything.
You can glean some facts from delivery, read rates, spam complaints and the like, but those metrics measure only external actions on the emails. They won’t give you the details you need, like knowing that the Obama campaign tested more than 50 variants of a single message, or how the emails performed on campaign KPIs.
So, when you’re looking at your competition and trying to figure out what they’re doing, remember you don’t know anything unless you can go behind the scenes.
This is why we push testing so much. What worked for the Obama campaign in 2008 or 2012 (like the legendary one-word subject line, “hey”) won’t necessarily work for Dillard’s.
As you get down to the task of mapping out your email program for 2017, start by saying “I’m going to learn as much as I can without presuming to know what works and what doesn’t in other people’s programs.”