Google's definition of Quality Score, the importance of CTR, and writing winning ads. Part one in a two-part series.
OMG. It's been over two years since I wrote a full column here about Quality Score.
It's important to refocus on this issue from time to time - as I will in this two-part series - and I'm not shy about saying why: the blizzard of "helpful advice" online is at worst flat-out wrong, more often misleading, and at best merely poorly tailored to whatever problems you may be experiencing in your account.
Perceived low Quality Scores are near the top of the list of reasons for hard-working campaign managers to "throw in the towel" and seek professional help. (Help with PPC, I mean…although the search engines should probably establish a generous mental health fund, too, lest their Frankensteinian Quality Score creations wreak any more havoc on our tormented psyches.)
That's good: it means that the general idea behind Quality Score has percolated out into the marketplace.
Take a hypothetical advertiser - Sheila (her real name, for all you know) - who wants to achieve volume targets on core keywords (like "baked beans") relevant to her business. With low Quality Scores, she'll be hard-pressed to do so.
Hypothetically, the difference between Quality Scores of seven and four might translate into 100 percent higher CPAs and 70 percent lower conversion volume on these keywords. It's frustrating - like trying to start a campfire with no kindling and wet wood. You get a lot of smoke.
That's the thrust of what Quality Score does to your business performance.
And the purpose behind it? Primarily - in keeping with the ethos of a search engine - to ensure that ads are relevant to the users typing in searches that may trigger ads. It's like a machine-learning-driven, highly automated version of a thinking human editorial filter that reaches out and slaps ads down or off the page when they simply aren't "quite right" or "right at all" for a given user's query.
In challenging, competitive industries, or in fields where searchers just don't like to see ads of any type, it's not that easy to go from a poor Quality Score to a great one. Indeed, it's still difficult to understand what the scores mean, and what to do about them. That varies from account to account.
It also varies from month to month. Google's actual formula, Google's means of reporting to you elements of your performance, and Google's definitions that explain how Quality Score is calculated, are in constant flux.
Lately, Google's information refers to a list of possible elements of the formula. The notion of the "historical CTR of the keyword and matched ad" has been dropped in favor of "your keyword's past CTR" and "your display URL's past CTR." But all of the bullet points are prefaced with: "we look at a number of different things related to your account, like the following…"
All of the points are open to interpretation. And while it's not deliberate, some are probably huge red herrings.
One black hole is in the "keyword/ad relevance" factor, currently defined by Google as "how relevant your keyword is to your ads." It could simply mean that you should make sure the keywords are in the ads, and one great way of doing that would be to force matching through dynamic keyword insertion (DKI). (But wouldn't that generally be captured by the fact that such matching leads to increases in CTR? If Google won't actually say what it means by keywords being relevant to ads, then how can we be sure of the best way to respond?)
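For readers unfamiliar with the mechanics, DKI uses brace syntax in the ad text. A minimal sketch (the "baked beans" keyword is carried over from the earlier example; the product and URL are hypothetical):

```
Headline: Buy {KeyWord:Baked Beans} Online
```

If the triggering keyword is "canned baked beans," the headline renders as "Buy Canned Baked Beans Online." If the inserted keyword would blow past the character limit, the default text after the colon ("Baked Beans") shows instead. The capitalization of "KeyWord" in the tag controls how the inserted keyword is capitalized - here, title case on each word.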
Too many advertisers fixating on Quality Score are seeing these bullet points and connecting dots that don't exist. Marketing takes a back seat to Quality Score hocus-pocus. The tail wags a dog, who in turn can't see the forest for the trees.
The Importance of CTR: Simple, Right?
Despite the cagey technical writing, it's plain that Quality Score is still dominated by a single statistic, reported at the keyword level and also vital when broken out by ad creative: CTR. Most of that weight rests on the keyword's historical CTR on the Google search network.
As part of a recent release, Google now explicitly mentions expected CTR in its reporting breakdowns at the keyword level. No contradiction there. At first, on a new keyword with no data, keyword Quality Score is predictive; it's based on Google's vast understanding of other advertiser behavior and presumably on how your account has performed. As time passes, "expected CTR" is weighted more in favor of your real historical CTR performance.
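Google doesn't publish how that weighting works, but the general idea - a system-wide prior that gives way to your observed CTR as impressions accumulate - can be sketched as a simple blend. This is purely illustrative; the prior and the weight are made-up numbers, not Google's formula:

```python
def expected_ctr(clicks, impressions, prior_ctr, prior_weight=1000):
    """Illustrative only: blend a system-wide prior with observed CTR.

    prior_weight acts like pseudo-impressions: with little real data
    the prior dominates, and as impressions accumulate your actual
    history takes over. (Google's real formula is not public.)
    """
    return (prior_ctr * prior_weight + clicks) / (prior_weight + impressions)

# Brand-new keyword, no history: the estimate is just the prior.
print(expected_ctr(0, 0, prior_ctr=0.02))        # ~0.02

# 50,000 impressions at a 5% CTR: history now dominates the prior.
print(round(expected_ctr(2500, 50000, 0.02), 4))  # 0.0494
```

The point of the sketch is the shape, not the numbers: early on, "expected CTR" is mostly prediction; later, it's mostly your track record.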
Typically, people hear this lecture, assume they've got CTR solved, and then go looking for something else to help them fix perceived flaws in their account. That's generally a mistake. If we stay with it, we can achieve a lot just by deeply appreciating what this CTR conversation has to offer.
Let's say you have a high CTR on certain keywords (in your mind), but you still aren't getting the Quality Score you feel you deserve. Well, consider:
This still doesn't exhaust the possibilities. There could be annoying reasons for low scores.
And there are yet further avenues for investigation and improvement. Likely, though, you should pursue them for their own sake rather than expecting any gold stars from a Quality Score standpoint. On the conference circuit, Google spokespeople have said things like "they normalize Quality Score for match type," and that "negative keywords won't improve Quality Score." While such cryptic statements may be momentarily demotivating, you should still focus vigorously on match type and negative matching strategies…for their own sake. These tactics aren't completely unrelated to Quality Score, so when Googlers make subtle points about such matters, it may be best not to read too much into them.
So what's the "really advanced" secret? Tighter ad groups? Better-themed ad groups? Local targeting? Other campaign-level settings? DKI? Landing page testing? The cleverest of negative matching tactics? None are Quality Score unlockers on their own. Using tactics to address Quality Score just because they're on a checklist will lead to unsatisfactory results. But certainly you'll be better off if you understand the PPC platforms' capabilities in depth. It wouldn't hurt to understand the intent behind Google's Landing Page and Site Quality guidelines, but don't expect Quality Score to move a hair even if you cut page load times by 40 percent.
Not so Secret: Write Winning Ads
One area that still gets ignored is the importance of creativity and structured testing: achieving high CTR on ads while holding conversion rates and CPAs steady or improving them.
A high CTR on ads will directly impact two of the top factors in Quality Score: the historical CTR on the keywords that trigger the ads and the recorded historical CTR on the "display URL."
Case Study in Progress: Ad Testing Works
Recently, I was asked to try to rescue Quality Score for a popular website that could not even get a decent Quality Score on its own brand. After listening to the list of kitchen sink tactics the company had tried to improve Quality Scores, I had to mention that they hadn't done very much ad testing.
Armed with a "fix-it" budget, we're going to find a true winning ad for a key ad group. We're attempting to generate high volumes for testing purposes to accelerate our exit from the Quality Score doghouse.
My initial attempts couldn't have illustrated my key point any better. Within 24 hours, it was plain that ad copy wasn't trivial and copping out via the "DKI will equal relevance" route was nowhere near good enough.
I thought all eight of the ads I wrote were pretty good. But only a few performed acceptably. I wrote four additional ad variations while clinging tightly to the tiny slivers of winning themes, to bring the total number of ads being tested to 12.
What an amazing lesson! At the 48-hour mark, the CTR on ads in this pivotal group ranged from 1.60 percent to 4.79 percent! Instant takeaway: users respond very differently depending on how you speak to them…even on the broadest of commodity keywords going to a home page.
The previous build of that ad group? While ad positions and irrelevant keywords complicate comparisons, the CTRs on the old ads had ranged from an anemic 1.03 percent all the way down to a comatose 0.15 percent.
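When comparing ad variations like these, raw CTR alone can mislead at low volumes. A rough two-proportion z-test shows whether a spread like 1.60 percent vs. 4.79 percent is likely real or just noise. The click and impression counts below are hypothetical, chosen to resemble a 48-hour test like the one described:

```python
import math

def ctr_z_score(clicks_a, imps_a, clicks_b, imps_b):
    """Two-proportion z-test comparing the CTRs of two ad variations.

    |z| > 1.96 suggests the CTR gap is unlikely to be chance
    (roughly the 95 percent confidence level).
    """
    p_a = clicks_a / imps_a
    p_b = clicks_b / imps_b
    pooled = (clicks_a + clicks_b) / (imps_a + imps_b)  # pooled CTR
    se = math.sqrt(pooled * (1 - pooled) * (1 / imps_a + 1 / imps_b))
    return (p_a - p_b) / se

# Hypothetical volumes: 4.8% CTR vs. 1.6% CTR on 2,000 impressions each.
z = ctr_z_score(96, 2000, 32, 2000)
print(abs(z) > 1.96)   # True: treat the high-CTR ad as a real winner
```

With gaps as wide as the ones in this case study, significance arrives quickly; with narrower gaps, you need more impressions before pausing the "losers."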
High Quality Scores, and high ad positions on terms you want or like, won't just be handed to you. If your testing goes well, you should outdo the expectations of whoever tried and failed before (and those who didn't try). The reason you don't shoot into top ad ranks immediately, alas, is because the competition has done years of testing and has enough Quality Score history to make it tough for you to break through.
Write much better ads than you had before. It's potentially as simple as this.
I'll be back in Part 2 to reveal the big secret to Quality Score!
Goodman is founder and President of Toronto-based Page Zero Media, a full-service marketing agency founded in 2000. Page Zero focuses on paid search campaigns as well as a variety of custom digital marketing programs. Clients include Direct Energy, Canon, MIT, BLR, and a host of others. He is also co-founder of Traffick.com, an award-winning industry commentary site; author of Winning Results with Google AdWords (McGraw-Hill, 2nd ed., 2008); and frequently quoted in the business press. In recent years he has acted as program chair for the SES Toronto conference and all told, has spoken or moderated at countless SES events since 2002. His spare time eccentricities include rollerblading without kneepads and naming his Japanese maples. Also in his spare time, he co-founded HomeStars, a consumer review site with aspirations to become "the TripAdvisor for home improvement." He lives in Toronto with his wife Carolyn.