Ending the Cloaking Debate, Part 3

In part one, I explained two different (and disreputable) search engine marketing (SEM) tactics: doorway pages and cloaking. In part two, I talked about how cloaking is not the same as spam. Today, I’ll wrap up this series by discussing cloaking in terms of paid inclusion and how to avoid trouble with the search engines.

When it comes to cloaking, paid inclusion (which all major crawlers but Google offer) is a different story. All the paid-inclusion crawlers have ways for content providers to “feed” them information via XML.

To understand the process, picture a spreadsheet with all the URLs you want listed, row by row. Information about each URL is listed in the columns: URL title in the first, description in the next, body copy in the third, and so on. The search engine doesn't actually read your Web pages; instead, tabular information about each URL is pumped directly into its index.
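To make that concrete, a paid-inclusion feed for two product pages might look something like the sketch below. The element names here are purely illustrative; no search engine's actual feed schema is being quoted:

```xml
<!-- Hypothetical feed structure: each <page> element is one row of
     the imagined spreadsheet, with the columns as child elements.
     These tag names are illustrative, not any engine's real schema. -->
<feed>
  <page>
    <url>http://www.example.com/products/widget.html</url>
    <title>Acme Blue Widget</title>
    <description>A blue widget for home and office use.</description>
    <body>Full product details, pricing, and availability...</body>
  </page>
  <page>
    <url>http://www.example.com/products/gadget.html</url>
    <title>Acme Red Gadget</title>
    <description>A red gadget with a two-year warranty.</description>
    <body>Full product details, pricing, and availability...</body>
  </page>
</feed>
```

The point of the structure is simply that the engine indexes whatever appears in the feed for each URL, regardless of what the live page at that URL serves to visitors.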

XML feeds are a form of approved cloaking. That’s not why most people use them, nor should it be the main reason to consider them. XML feeds were not intended to be a new way for marketers to cloak low-content doorway pages, but rather a simple way to feed in dynamic content, such as a product database. If you’re an online merchant, XML feeds make a lot of sense.

Despite all these disclaimers, there's no doubt some people indeed use XML feeds to cloak doorway pages. Moreover, the feeds are approved: the search engines review them for quality. In addition, there's evidence being in these programs may help such content compete better for rankings than if it were picked up for "free."

At both AltaVista and Inktomi, there’s evidence XML feeds have been an effective way for some companies to feed and cloak content that might not otherwise have met content guidelines.

Inktomi admits its XML feeds technically violate its posted cloaking guidelines and says it's looking to amend them. That's not the key concern. The real issue is that XML feeds, and perhaps paid inclusion in general, allow some people to provide content in a radically different way than is generally accepted when content is gathered for free.

Promotions that in the past relied on traditional doorway pages and cloaking, both of which have been banned, can now be done under the guise of content feeding at the search engines that offer it. It's almost naive to argue about whether cloaking is an acceptable delivery mechanism these days, except in Google's case.

For others, the important issue revolves around content standards. If low-content doorway pages aren't acceptable editorial content when found naturally by a search engine spider, are they suddenly OK when read via paid-inclusion programs? If there's a debate to have, this is it.

Avoid Cloaking Trouble

I believe XML feeds are a form of approved cloaking. I suspect some search engines may also allow some ordinary HTML pages to be cloaked via their non-XML paid-inclusion options (something I hope to clarify in the future).

Some may argue my broad definition of cloaking means Google may knowingly allow it in some cases. It's possible a site serving text-only pages could be banned by Google for "accidentally" cloaking, then have that penalty lifted upon review.

Aha! Proof Google allows cloaking! So what? Google reserves the right to do whatever it wants when it comes to cloaking. It warns those who cloak “may” be permanently banned, not “will” be banned.

Maybe Google has let a site “technically” cloak, perhaps even overtly cloak. Banking on that defense should you cloak against Google is foolish. Most people who choose to show Google cloaked content do so knowing they may be caught and tossed out.

I'll conclude with my definition of cloaking and with some additional guidelines I think will steer you away from trouble:

Cloaking is getting a search engine to record content for a URL that is different from what a searcher will ultimately see, often intentionally. It can be done in many technical ways. Several search engines have explicit bans against unapproved cloaking, of which Google is the most notable. Some people cloak without approval and never have problems. Some may even cloak accidentally.

However, if you cloak intentionally without approval — and deliver content to a search engine that is substantially different from what searchers will see — then you stand a much larger chance of being penalized by the search engines that ban unapproved cloaking. If in doubt, ask the search engine whether it has a problem with what you intend to do, assuming you can't get a clear answer from its written guidelines. If you are working with a third-party search engine marketer, ask them for proof that what they intend to do is approved. Otherwise, be prepared for any adverse consequences.

I'd like to say all search engines will respond promptly if asked, but they probably won't (except to those in paid-inclusion programs). If you asked and still got in trouble, you can at least prove you tried to get clarification. If you aren't an "industrial strength" cloaker, that may help.

As for third-party firms, understand what they do. Ask if there are potential risks. Get this spelled out in advance. If you aren’t comfortable, walk away.

A professional who engages in unapproved cloaking will tell you the risks, not try to make you think cloaking content is something “everyone does.” They’ll explain why they do it, why they think it works, and what the possible downsides are. They’ll do this because they work with clients prepared to take risks. They don’t try to disguise what they do.

2003: The Year of Paid Inclusion

Let’s go back to the real issue. Content standards seem to have changed. Most crawlers have become dependent on paid inclusion for revenues.

Standards are for search engines to change, of course. Different, perhaps more liberal, standards for paid content won't necessarily mean users or relevancy suffer. It does create confusion and raise concerns.

For paid inclusion to succeed, we need providers to be clearer about exactly what benefits and advantages are provided over unpaid content. That will help search engine marketers make purchase decisions and help users evaluate results.

I also expect that paid-inclusion content will ultimately need to be segregated from unpaid content, especially as content guidelines diverge. As I wrote in my previous article about issues with paid inclusion at AltaVista, such segregation may have positive benefits for both search engine marketers and users.

If the search engines fail to do this voluntarily, I think it’s likely we’ll see a third party such as the US Federal Trade Commission suggest it happen. In 2002, the FTC told those carrying paid-placement listings to clean up their acts. In 2003, the agency’s aim may shift to issuing new, stricter guidelines about paid-inclusion listings.
