Andrew Goodman

Filtering 'Bad' Traffic: For Best Results, Get Beyond Good and Evil

December 30, 2011

In some parts of the world, lengthy conversations are still being held on the subject of persuading clients to devote enough budget to digital. In light of past battles nearly won, it's particularly maddening that some paid search campaign managers seem so bent on handcuffing their own accounts that they limit their upside through excessive filtering.

To be clear, it's important to have a means of excluding unwanted traffic - such as keyword exclusions (negative keywords). But it's also important that overall campaign strategy be driven by a game plan rather than by fear or "best practices" hearsay. You're in advertising, not corporate security. If you feel like your whole job is to keep "bad" clicks away from the website, chances are you're over-filtering.

Some clients - indeed, more than half - will be timid and will try new things in their accounts slowly. And that's fine.

A select few clients will be gunslingers, aggressive marketers who actually love to try new things.

But the agency or expert should never, ever over-filter on the client's behalf without being absolutely certain that the client really is as conservative as assumed.

In platforms like AdWords, we've been handed wonderful tools to get very granular in excluding certain keyword phrases, display network sources, and other segments that are almost certainly bad bets to convert for the target market. From this simple principle inevitably grew overkill. Instead of focusing on the business reasons for filtering, some marketers focused on to-do lists (to look busy), exotic strategies (to look "advanced"), and scare tactics (to win business or to sell a new tool). And instead of seeing Google's machine-learning capabilities in keyword match types and display network placement (expanded broad match in search, automatic matching on the display network) as broadly positive developments with some negative elements that require hand-tweaking, some marketers have chosen to reject them outright and see only the negative aspects.

And so the negative keyword lists and publisher exclusions lists grew. And grew and grew and grew. And sometimes they were misapplied to the whole campaign when applying them at the ad group level would have sufficed.
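To make that scope point concrete, here's a minimal sketch in plain Python. It doesn't touch any real AdWords API; the ad groups, queries, and the simplistic word-match rule are all invented for illustration. It shows how a single campaign-level negative can block a query that one ad group actually wants, when an ad-group-level negative would have sufficed.

```python
# Toy model of negative-keyword scope: campaign-level vs. ad-group-level.
# All names, queries, and the matching rule are hypothetical.

campaign_negatives = {"cheap"}  # one negative applied across the whole campaign

ad_groups = {
    "luxury_watches": {"negatives": {"cheap"}, "queries": ["luxury watch brands"]},
    "budget_watches": {"negatives": set(), "queries": ["cheap watches under 50"]},
}

def blocked(query, negatives):
    """Crude broad-ish match: block if any negative appears as a word."""
    return any(neg in query.lower().split() for neg in negatives)

for name, group in ad_groups.items():
    for query in group["queries"]:
        at_campaign = blocked(query, campaign_negatives | group["negatives"])
        at_ad_group = blocked(query, group["negatives"])
        print(f"{name:15s} {query!r:28s} "
              f"campaign-level: {'blocked' if at_campaign else 'serves'}  "
              f"ad-group-level: {'blocked' if at_ad_group else 'serves'}")
```

Run as written, the budget ad group's own query is blocked only under the campaign-wide negative - exactly the sort of wanted traffic that a carelessly scoped exclusion list throws away.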

Sure! Powerful machine learning by the world's largest technology company, using the world's largest dataset, is 100 percent worthless! You should filter as much as you can by hand, and when that fails, get other computers involved to counteract Google's computers, willy-nilly. You should make your account into one big filter.

Hmm.

As I see it, there are three main drawbacks to this over-filtering bias:

  1. You limit volume potential and total profit overall.
  2. Because you artificially create a narrower universe - but forget just how narrow you made it, and why - when it comes time to find creative ways to expand that finite volume (say, when the client asks for more, more, more), the "out of the box" volume boosters you come up with turn out to be worse than good potential traffic that was right under your nose. Specifically, the "so-so" phrases you so hastily negatived out, or the "so-so" publishers you excluded, might have served the business better than grasping at straws for unproven keywords or new, exotic channels.
  3. What I like to call the "short leash problem." When you try to anticipate and react to every possible poor-performing segment (and sub-sub-sub-segment), your analysis gets too granular, and your assumptions, too causal. Mathematically, if you slice and dice everything enough, something will come in last place - often for no good reason (see the simulation sketch after this list). The upside of a broader approach is that you keep your options open for random good luck, which may lead to more learning and, in the end, more volume and total profit.
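The "something will come in last" claim in point 3 takes about a dozen lines to demonstrate. In the sketch below (plain Python; the click counts and the 3 percent conversion rate are invented for illustration), every segment has the identical true conversion rate, yet the worst observed slice looks progressively more dismal the more finely you cut:

```python
import random

# Every segment converts at the same true rate, so any observed
# "worst performer" is pure statistical noise.
TRUE_RATE = 0.03
TOTAL_CLICKS = 10_000
random.seed(42)  # reproducible run

for n_segments in (2, 10, 50, 200):
    clicks = TOTAL_CLICKS // n_segments
    rates = []
    for _ in range(n_segments):
        conversions = sum(random.random() < TRUE_RATE for _ in range(clicks))
        rates.append(conversions / clicks)
    print(f"{n_segments:3d} segments x {clicks:5d} clicks: "
          f"worst observed rate = {min(rates):.4f}  (true rate = {TRUE_RATE})")
```

At 200 segments of 50 clicks each, the worst segment routinely shows a 0 percent conversion rate despite converting at 3 percent in truth. Pause it on a "short leash" and you've thrown away perfectly average traffic while fixing nothing.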




ABOUT THE AUTHOR

Goodman is founder and President of Toronto-based Page Zero Media, a full-service marketing agency founded in 2000. Page Zero focuses on paid search campaigns as well as a variety of custom digital marketing programs. Clients include Direct Energy, Canon, MIT, BLR, and a host of others. He is also co-founder of Traffick.com, an award-winning industry commentary site; author of Winning Results with Google AdWords (McGraw-Hill, 2nd ed., 2008); and frequently quoted in the business press. In recent years he has acted as program chair for the SES Toronto conference and all told, has spoken or moderated at countless SES events since 2002. His spare time eccentricities include rollerblading without kneepads and naming his Japanese maples. Also in his spare time, he co-founded HomeStars, a consumer review site with aspirations to become "the TripAdvisor for home improvement." He lives in Toronto with his wife Carolyn.

