I’m sitting at my desk in a totally restrained fashion. Every fiber of my being virtually demands that I type the words: SEO is dead. I want to dispose of H1 tags in the same SEO trashcan their meta-tag cousins were consigned to years ago. I want alt-text attributes to be handed back to their rightful owners: visually impaired surfers.
I never want to read another hefty tome on SEO suggesting I should waste any more time than is absolutely necessary to satisfy the needs of a mindless search engine crawler.
In 1999, I wrote my first e-guide to search engine positioning (as it was called then and should still be today). In it I addressed search engine crawlers’ greatest need: the signals to be found in a 20-year-old technology, HTML. And in 2001, I wrote about my adventures in the world of network theory and hyperlink-based ranking algorithms. Links, links, links. That was my mantra.
So why in 2008 do I still have to talk to people about SEO 1999? Dear God in heaven, forgive me for whatever it is I have done. Please, I beg you: take me out of SEO Groundhog Day.
You know, some of that age-old SEO advice could actually impede your search marketing efforts. I spend a lot of time now researching new signals to search engines. Search engines have amassed so much new information about how people browse, and as the Web has changed and grown up, those browsing habits have changed dramatically.
Content online is changing dramatically, too. Back in the day, most Web content was created by publishers and copywriters for company Web sites. But the vast amount of content online today is user-generated. More people create content rather than just consume it.
Social media content has become indispensable to millions of users. In particular, community question-answering networks are a popular destination for people looking for help with a particular situation, for entertainment, and for community interaction.
In fact, in some countries this type of search is more popular than browsing the results of search engines. Each day, 16 million people on average visit South Korean search portal Naver, keying in 110 million queries. Naver users also post an average of 44,000 questions a day on Knowledge iN, the interactive question-and-answer database. These receive about 110,000 answers, ranging from one-sentence replies to academic essays complete with footnotes.
With all of this consumer activity occurring, search engines are bound to look at alternative methods of discovering quality content in the future. Face it, universal search changes the old rules completely. Video is a hugely popular format for delivering quality content. And how much textbook SEO can you apply to that?
The old signals that search engines used to work on were so easy to manipulate. That’s why SEO became such a popular online sport and eventually a full-fledged industry. Keywords in prominent places, tags on a page, and good surrounding linkage data were the order of the day. Anchor text was the search workhorse. But now there are so many different signals. Social media bookmarking, tagging, and ratings send out half-decent signals. Number of plays and distribution of a video send out sharp signals. But the biggest signals come from the toolbar and end-user data.
Once you’ve done your crawler-enhancing work and achieved some search engine visibility, do you think that’s it? I don’t believe so. Once you’ve achieved that visibility, this is where consumer power kicks in.
Implicit feedback from millions of Web users, especially clicks on results and the methods for interpreting those clicks, has been shown to be a valuable source of result-quality and ranking information. Search engine users send a lot of signals about preferred pages. Signals such as how often a specific URL is returned following a query, how many times it’s clicked on, and how long people stay on the page are pretty simple to monitor.
But more powerful is the data search engines can build following user trails. Imagine being able to track end users from the click on the search result to their final destination page, then being able to aggregate that data. This is the kind of stuff a toolbar can help build, as well as discovering so many URLs outside the crawl.
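To make the idea concrete, here’s a minimal sketch, in Python, of how click and dwell-time signals might be aggregated per query/URL pair. The log data, field names, and URLs are all made up for illustration; real toolbar data would obviously be far richer.

```python
from collections import defaultdict

# Hypothetical toy log of (query, clicked_url, dwell_seconds) events --
# a stand-in for the kind of end-user data a toolbar could collect.
click_log = [
    ("seo tips", "example.com/guide", 180),
    ("seo tips", "example.com/guide", 240),
    ("seo tips", "spammy.example/page", 5),
    ("h1 tags", "example.com/guide", 90),
]

def aggregate_signals(log):
    """Tally clicks and average dwell time per (query, url) pair."""
    stats = defaultdict(lambda: {"clicks": 0, "total_dwell": 0})
    for query, url, dwell in log:
        entry = stats[(query, url)]
        entry["clicks"] += 1
        entry["total_dwell"] += dwell
    return {
        key: {"clicks": v["clicks"], "avg_dwell": v["total_dwell"] / v["clicks"]}
        for key, v in stats.items()
    }

signals = aggregate_signals(click_log)
# A page clicked often and read for minutes looks stronger than one
# clicked once and abandoned after seconds.
print(signals[("seo tips", "example.com/guide")])
```

The point isn’t the bookkeeping, which is trivial; it’s that a search engine with millions of such trails can separate pages people actually value from pages that merely rank.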
Discovering resource locations following user trails is important, too. Out of the billions of URLs, a small portion are hugely more popular than others. This is why the old idea of simply throwing up zillions of pages to see if you can get the odd click here and there is the wrong way to go now. Creating quality content that’s ultimately visited by more people more frequently is a good signal. Having a Web site with millions of redundant pages that rarely get any visits may send a signal that a particular domain hosts lower-quality content.
I’ve written before about the limitations of crawler-aggregated content. With user-generated content beating the old publisher/copywriter content by a factor of five, even more content is beyond the reach of the crawl. Which raises the question: is crawling HTML pages the most efficient way for search engines to connect end users with the content they’re looking for?
It also raises this question: is HTML, after nearly 20 years, still the right platform for sharing information on the Internet?
The Internet is simply the underlying infrastructure, so there’s no reason one or more new protocols couldn’t be applied to it. With all the research I’m carrying out, I find it hard to believe that in five years we’ll be sitting in front of something called a browser waiting for it to render some text and some graphics. The truly rich end-user experience on the Internet is most likely to be beyond a browser’s limitations.
I started with the desire to shout SEO is dead. But it looks as though I’ve just killed off the World Wide Web, too.
All that in the space of a day. But now I have to go. I have a client who wants to talk about H tags.
Where did I put that baseball bat?