Imagine for a moment that you run a small business that depends on the phone book for most of your sales leads…maybe you’re a plumber. You know that you don’t have the money for a huge brand-building advertising campaign, so you rely on the fact that people who need your services usually end up looking in the listings when the toilet backs up.
Your listing is your lifeline. If no one can find you, the phone isn’t going to ring.
So you decide you want a spot in the white pages in the front of the phone book. You fill out the request form with your listing information, send it in, and wait for the next book to come out. When it does, you immediately open it up, jump to the “Plumbers” section and start scanning the listings. You’re not on the first page. You’re not on the second page. Heck – your listing isn’t in the book at all! In a panic, you call the phone company.
“How come my listing isn’t showing up in the book?” you ask with more than a tinge of annoyance in your voice. “I submitted my listing and I don’t see it!”
“We’re sorry,” comes the terse reply.
You notice that your biggest competitor has the top listing. “How can I get my listing to show up in the book?” you ask.
“We’re sorry, we can’t tell you that,” the voice on the line informs you.
“Well, how the heck do you decide who goes in there?” you ask, pleadingly now.
“We’re sorry. That’s proprietary information. Can I connect you with our sales office so that you can purchase a display ad?”
“No!” you exclaim, ready to throttle the drone on the other end of the line. “I just want a listing!”
“Sorry. You’ll have to submit your listing again. Have a nice day.” And with a click, the phone line goes dead.
The Bane Of All Existence
Annoying scenario, huh? Seems pretty unfair, doesn’t it?
Well, if you’re in the web business, you probably run into this Kafka-esque situation every day with the bane-of-all-our-existence search engines.
Like it or not, most people are going to find your sites in one of three ways. First, they may see the URL in an off-line media campaign. Second, they might click on an ad you’ve purchased. But third, and most likely, they’re going to reach your site through a search engine.
If you’ve ever had the joy of trying to get your site’s listing to pop to the top of the search list, you know how excruciatingly painful web search positioning can be. You tweak META tags, you fiddle with title keywords, and you try every combination of home-page copy you can imagine.
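Those titles and META tags are exactly the on-page signals the engines’ spiders read. As a rough illustration only (not any particular engine’s actual method, and using an invented sample page), here’s a Python sketch of a crawler pulling out the title and META tags from a page:

```python
from html.parser import HTMLParser

class SignalExtractor(HTMLParser):
    """Toy parser that collects the on-page signals a spider might read:
    the <title> text and any META keywords/description tags."""
    def __init__(self):
        super().__init__()
        self.in_title = False
        self.title = ""
        self.meta = {}

    def handle_starttag(self, tag, attrs):
        if tag == "title":
            self.in_title = True
        elif tag == "meta":
            attrs = dict(attrs)
            name = attrs.get("name", "").lower()
            if name in ("keywords", "description"):
                self.meta[name] = attrs.get("content", "")

    def handle_endtag(self, tag):
        if tag == "title":
            self.in_title = False

    def handle_data(self, data):
        if self.in_title:
            self.title += data

# A made-up page for our hypothetical plumber
page = """<html><head>
<title>Joe's Plumbing - 24-Hour Emergency Plumber</title>
<meta name="keywords" content="plumber, drain cleaning, emergency plumbing">
<meta name="description" content="Fast, friendly plumbing service.">
</head><body>When the toilet backs up, call Joe.</body></html>"""

parser = SignalExtractor()
parser.feed(page)
print(parser.title)
print(parser.meta["keywords"])
```

Every tweak you make to those tags changes what this kind of extraction sees – and, depending on the week, how much the engine cares.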
You submit your sites, you follow up, and you iterate non-stop trying to move up that list. Worst of all, you’re not dealing with just one listing, but probably five, six, or seven listings on various engines. It’s enough to drive any sane person into the padded room.
In fact, because of the complexities of web positioning, a whole slew of cottage industries have arisen to help you get your site into the top 10. There are search submission bots, automated position checkers, and META tag generators. There are companies devoted entirely to getting your page listed, and innumerable “reports” you can buy promising to let you in on all the “search engine secrets.”
That’s all well and good, and some sites like the wonderful searchenginewatch.com have done an invaluable public service in trying to keep tabs on how the engines are currently indexing pages. But the fact remains that dealing with the search engines is probably the last great arcane annoyance of web marketing.
As the search engines jockey for position (pun entirely intended), they constantly change what they’re looking for. This week, META tags are in. Next week, titles are hot.
One day, having multiple keywords on your homepage seems to work. But the next, it’s jump pages that are the key. And don’t even think that the search engines will make it easy for you to figure out what the method-of-the-month is – that secret is guarded like the crown jewels. I don’t know about you, but I’m feeling my blood pressure rise to head-exploding proportions just thinking about it.
Tyranny Of The Engines
The dirty little secret of web marketing is that we’ve all become slaves to the tyranny of the search engines. In the Attention Economy, exposure is everything. If no one can find you, you might as well not have a site.
This isn’t as big a problem if you’ve got a massive advertising budget. But for those of us who don’t have a couple of megabucks to drop on a Snap.com-sized ad campaign, it’s a major issue.
The problem is that the search engines are a product of the distributed, open nature of the Net. Because there’s no central authority (a good thing, by the way), the engines have to spend their time spidering the web, indexing pages, and then applying proprietary algorithms to determine search relevance.
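No one outside those companies knows the real formulas, but the basic shape is simple enough to caricature. Here’s a deliberately toy relevance scorer in Python – the weights and fields are invented for illustration, since no engine publishes its actual formula (which is the whole complaint):

```python
def relevance(query, page):
    """Toy relevance score: weight a query term by where it appears.
    The weights are invented -- real engines keep theirs secret.
    `page` is a dict with 'title', 'meta_keywords', and 'body' text."""
    term = query.lower()
    score = 0
    if term in page["title"].lower():
        score += 10          # title match: assumed heavily weighted
    if term in page["meta_keywords"].lower():
        score += 5           # META keywords: this week they count...
    score += page["body"].lower().split().count(term)  # plus raw frequency
    return score

# Two made-up pages competing for the query "plumber"
pages = [
    {"title": "Joe's Plumbing", "meta_keywords": "plumber, drains",
     "body": "Call Joe the plumber for any plumber job."},
    {"title": "Acme Hardware", "meta_keywords": "tools",
     "body": "We sell wrenches. Ask our plumber aisle staff."},
]
ranked = sorted(pages, key=lambda p: relevance("plumber", p), reverse=True)
print(ranked[0]["title"])  # the page with the stronger signals wins
```

The maddening part isn’t that a formula like this exists – it’s that the engines change the weights constantly and tell no one.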
Rather than wait for an engine to find you, you can register with it, but you’re still going to be at the mercy of a hunk of silicon determining where you stand. The major exception to all this is Yahoo, which has legions of lackeys indexing things by hand, resulting in an index that humans can actually use.
For us uncounted (and sometimes unwashed) legions of web marketers, these proprietary relevance algorithms, secret indexing schemes, and multiple spidering mechanisms have made our lives hell. Granted, some of us share the blame for driving the search engines to keep their methods secret, thanks to shady “stacking and packing” and “META-jacking” techniques.
But the fact still remains that our most vital lifelines to our customers are subject to the whims of six or seven big companies and their programmers. It’s got to stop!
Getting Their Act Together
First of all, I’d love to see the search engines make public their methods of indexing sites. Sure, the algorithms that separate the wheat from the chaff in our searches should remain proprietary, competitive information. But the criteria that make one site appear above another shouldn’t be secret. If we knew how to get our particular sites listed in appropriate categories, the result would be more useful searches for our customers and savings for our clients, who wouldn’t have to pay us to waste our time divining the methods.
Second, it will be necessary for us to police ourselves and for the search engines to police their listings. Most are doing this now, sending slime balls who use endlessly repeated keywords and white-on-white text packing to the bottom of the list.
Third, search engines need to improve their submission mechanisms so that third-generation sites that aren’t constructed of straight HTML text get indexed as well as older sites. Now, a site that has all its text in a single multi-screen page has a much greater chance of hitting high in the relevance scores than a site that uses Java or Shockwave for branding splash-screens. This is just plain stupid in an era when we’re moving to higher bandwidth and improved multimedia technologies.
Finally, I’d love to see some extensions to Internet protocols that would allow standardization amongst the search engines and how they find sites. Something like a SEARCH protocol (Search Engine Automatic Registration and Changes to Hypermedia) built into the Net infrastructure that automatically registered new sites and reported changes to search engines – without time-wasting human intervention.
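To be clear, no such protocol exists – SEARCH here is the author’s wish, not a standard. But just to make the idea concrete, here’s a Python sketch of what an automatic registration-and-change notice from a site to the engines might look like; every field name below is invented:

```python
# Hypothetical sketch of a "SEARCH" registration ping. No such protocol
# exists -- the version line, actions, and fields are all invented here
# purely to illustrate machine-to-machine site registration.
def build_search_ping(url, action, changed_pages):
    """Assemble a registration/change notice a site could push to engines."""
    lines = [
        "SEARCH/1.0",            # invented protocol version line
        f"Action: {action}",     # e.g. REGISTER or UPDATE (invented verbs)
        f"Site: {url}",
    ]
    for page in changed_pages:
        lines.append(f"Changed: {page}")
    return "\r\n".join(lines)

ping = build_search_ping("http://www.joesplumbing.com", "UPDATE",
                         ["/index.html", "/services.html"])
print(ping)
```

The point isn’t the syntax – it’s that a site could announce itself and its changes once, automatically, instead of a human filling out six or seven submission forms.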
I’m not looking for some sort of legislation or central authority to regulate the system. In fact, we’re doing a pretty good job ourselves. But in the future, as millions more sites come online, it’ll become imperative for the Net economy to help people find what they’re looking for.
The result is more traffic, higher revenues, and more satisfied customers. Not a bad deal, huh?