Desperately Seeking Search Engine Marketing Standards

It seems that every so often, someone makes a new push to suggest that the search engine marketing industry needs to establish standards of conduct. The idea usually dies from a lack of support. Recently, though, several different parties have begun promoting the idea, and they might have more luck in their effort. They’ll certainly need that luck, because the barriers to establishing standards remain substantial.

The chief challenge is that there is no definitive guidebook that officially defines search engine spam. The general definition is an attempt to artificially influence a search engine's ability to determine relevancy, but there's disagreement over what counts as artificial. Each search engine is an independent entity that decides for itself what it considers spam. For example, Google outright hates cloaking (the serving of specially designed pages exclusively to search engine robots) and will penalize for it. In contrast, both AltaVista and Inktomi allow it in certain circumstances.

As I’ve written many times before, I don’t expect that we’ll ever see a standard definition of spam. When I’ve talked to search engines about this topic, they’ve generally come back to the same point: the more specific they are about what not to do, the more people push right up to the line or pick up clues to other spam techniques that remain “undefined.”

Even if the search engines won’t define rules, perhaps the search engine marketing industry can. Indeed, the pressure to have some type of standards seems to be rising, probably as a means for search engine marketers to set themselves apart from the crowd by saying they adhere to these standards.

One example is the recently posted “Search Engine Optimization Code of Ethics,” from long-time search engine marketer Bruce Clay. Most of the code consists of fairly common-sense items that few would disagree with, such as not violating laws or published spam guidelines from search engines. The main cause for controversy is that the code essentially says that those following it will not cloak. The term “cloaking” itself is not used, but the description of falsely representing a Web site certainly covers it.

Search engine optimization (SEO) company WebSeed offers “The Search Engine Promotion Code of Ethics.” As with Clay’s guidelines, most of it is either common-sense advice or a list of commonly recognized spam tactics to avoid.

One of the more controversial points might be the allowance of limited mirror sites. Similarly, the idea that “popularity-boosting and hyperlink-tag strategies are acceptable as long as they are used to promote a content-rich, highly-relevant Web page” sounds like the code is saying that creating some artificial link structures is fine, if you think a page is really good. And as with Clay’s guidelines, cloaking is seen as a no-no.

From e-Brand Management, which produces the Search Mechanics optimization tool, two recent white papers try to help Webmasters understand if they are engaging in search engine spam by examining their mindset rather than specific actions. Nevertheless, the papers still end up going overboard by labeling specific techniques as “bad.”

One of the white papers, “The Classification of Search Engine Spam,” succinctly states that search engine spam is “any attempt to artificially influence a search engine’s ability to calculate relevancy.”

No doubt many readers will immediately find this statement absurdly broad. For instance, even the search engines themselves will advise site owners to take care when crafting page titles and body copy and encourage site owners to build links. Technically, these are all “artificial” methods meant to influence search engines.

The author of the paper, e-Brand Management’s chief technology officer Alan Perkins, anticipates this concern and immediately qualifies the statement: such actions are not spam if they fall under “anything that would still be done if search engines did not exist, or anything that a search engine has given written permission to do.”

The key point Perkins is trying to make in the paper is that site owners shouldn’t be resorting to extreme methods of optimizing pages for search engines.

“Suppose search engines did not exist. Would the technique still be used in the same way?” Perkins asks. Many would readily agree with some of his examples, though not all of them.

Once again, cloaking gets called out as spam, though exactly what counts as cloaking gets muddied by the many terms presented: agent-based delivery, agent-based spam, IP delivery, and IP cloaking.

What I took away is this: If you are using a system of any type to deliver content to humans that is different than what a search engine spider sees, that’s cloaking and considered by Perkins to be spam.
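To make the two delivery mechanisms concrete, here is a minimal sketch of the decision a cloaking server makes. Everything in it is hypothetical for illustration: the function name, the user-agent tokens, and the crawler address (drawn from the reserved TEST-NET range) are my own examples, not real crawler data or anyone's actual implementation.

```python
# Illustrative sketch of agent-based and IP-based page delivery,
# the two mechanisms the white papers describe. Hypothetical values only.

KNOWN_SPIDER_AGENTS = {"googlebot", "examplebot"}  # hypothetical user-agent tokens
KNOWN_SPIDER_IPS = {"192.0.2.10"}                  # hypothetical crawler address

def select_page(user_agent: str, remote_ip: str) -> str:
    """Return which version of a page a server would send.

    Agent-based delivery keys on the User-Agent header; IP delivery
    keys on the requester's address. Either way, spiders receive a
    page that human visitors never see -- which is what Perkins
    classifies as cloaking, and therefore spam.
    """
    is_spider = (
        any(token in user_agent.lower() for token in KNOWN_SPIDER_AGENTS)  # agent-based
        or remote_ip in KNOWN_SPIDER_IPS                                   # IP-based
    )
    return "optimized-for-spider.html" if is_spider else "normal-page.html"
```

The distinction between the four terms above collapses into which branch of that condition a site relies on; under Perkins's definition, either branch counts.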

Another effort on the standards front comes from a new organization backed by long-time search engine marketer Terry “Webmaster T” Van Horne. The group counts about 100 search engine marketing individuals and companies as registered members. Its aim is to promote best practices that search engine marketers should follow, and it plans to release an initial list of guidelines later this month.

The organization also seeks to compile a public database of search engine spammers, so that search engines and consumers can easily spot companies that are violating the group’s guidelines. Alongside this, it offers a list of search engine marketing companies that will presumably follow the guidelines, once established.

A different organization, the World Association of Internet Marketers, also aims to help establish some standards relating to search engines. The group met in September in the United Kingdom and search engine standards-related discussions are ongoing in its members forum.

The push for standards and ethics that don’t try to manipulate search engines also came up in a thread at Webmaster World, where site owner Brett Tabke dismissed the suggestion that some search engine marketers don’t try to influence search engines:

Search Engine Optimization is… the adjustment of html page entities and content for the express purpose of ranking higher on search engines. [For example,] Search Engine Optimization is the manipulation of search engine rankings systems… I bring this up, because I’ve been reading a great deal lately from seo “experts” who are very confused about what we do for a living.

However, several follow-up posts by others in the thread still tried to push the idea that there is “good” optimization that helps search engines versus “bad” optimization that manipulates or misleads them.

As you can see, there are a variety of opinions about what constitutes spam. In particular, there’s a schism between those who practice what I’ve always termed “natural” optimization, which is generally working with existing pages at a Web site to make them “search engine friendly” (another term I coined ages ago), and those who prefer to create “doorway” pages, pages usually designed to please search engines rather than humans.

To further complicate matters, there’s not a perfect division between the naturalists and the “doorwayists” (or would that be “doorwayers”?). There are many points along the spectrum, and even two naturalists might disagree on what’s acceptable, just as two doorwayists might.

Though I doubt we’ll see agreement in some areas, the desire for some type of standards is laudable. How can the search engines help? Certainly by providing as much information as they can, without feeling they are giving too much away.

Another possible way they can provide assistance is to offer tools that let people check whether a page has suffered any spam penalty. That can help those who are concerned about having accidentally spammed. Additionally, they might make available information so that Webmasters can see if a particular search engine marketing firm is causing trouble for that search engine — so that those seeking such services can be aware.
