I’m back in the U.K. this week to see if my wife still remembers me and to speak at the London eMetrics Marketing Optimization Summit. My presentation takes a closer look at balancing organic and paid search — and the impact they can have on each other. I’m also putting my thoughts together on the next five years in search for a colleague who’s preparing a document. And I’m in the process of researching and writing my new book.
As an information junkie, I’m usually working on a number of projects at once. Interestingly, these three projects center on the topic of search’s future. As both Google and Search Engine Strategies (part of the ClickZ Network) celebrate their 10th anniversaries this year, it seems timely to look at where search has been, where it’s at, and where it’s headed.
Earlier this year, I made a case about the diminishing value of textbook SEO and how search is changing conventional public relations. Now before you throw your hands in the air, this isn’t another “SEO Is Dead” column. Instead, I want to get as many readers as I can on board for a realistic “future of search” project. Generally speaking, I don’t bother much with search-related forums. But for a place for us to gather and chat, I’m starting a thread today at Search Engine Watch (also part of the ClickZ Network).
Long before Google, there were search engines. Even before the World Wide Web came on the scene, there were Internet search engines. In 1994, Brian Pinkerton launched WebCrawler, arguably the Web’s first full-text retrieval search engine. I spoke with Pinkerton a few times many years ago when researching the second edition of my book. He explained to me how he had applied Cornell University computer science professor Gerard Salton’s vector space model for the extraction and analysis of text for ranking documents. At that time, SEO was all about on-page factors to manipulate rankings.
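For readers who’ve never seen it in action, here’s a minimal sketch of Salton’s vector space model: documents and a query are represented as term-frequency vectors, and relevance is scored by cosine similarity. The tokenization and sample documents are illustrative; real retrieval systems add TF-IDF weighting, stemming, and much more.

```python
import math
from collections import Counter

def vectorize(text):
    # Term-frequency vector of a lowercased, whitespace-tokenized document.
    return Counter(text.lower().split())

def cosine_similarity(a, b):
    # Cosine of the angle between two sparse term vectors.
    dot = sum(a[t] * b[t] for t in a if t in b)
    norm_a = math.sqrt(sum(v * v for v in a.values()))
    norm_b = math.sqrt(sum(v * v for v in b.values()))
    return dot / (norm_a * norm_b) if norm_a and norm_b else 0.0

def rank(query, docs):
    # Order documents by similarity to the query, best match first.
    q = vectorize(query)
    return sorted(docs, key=lambda d: cosine_similarity(q, vectorize(d)),
                  reverse=True)

docs = [
    "full text search engines crawl the web",
    "recipes for baking bread at home",
    "web search ranking with text analysis",
]
print(rank("web search text", docs)[0])
```

Because every signal here comes from the page itself, you can see why early SEO was purely a matter of on-page manipulation.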
A little cottage industry of search engine optimizers started to grow. And more people became aware of tactics and techniques to mess around with Web pages to outsmart competitors. Oh, what fun it was!
Then Google arrived, taking all the fun away by introducing a hyperlink-based algorithm called PageRank. Now Google wasn’t only taking into account certain on-page factors but also placing a lot more weight on information related to the linkage data surrounding Web pages. Even before Google, foremost computer scientist Jon Kleinberg had written about incorporating network theory and citation analysis into a ranking algorithm.
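The core of that hyperlink-based idea can be sketched in a few lines: a page’s authority comes from the pages linking to it, weighted by those pages’ own authority. This is only an illustration of the published PageRank concept; the toy graph, damping factor, and iteration count are my assumptions, not Google’s production values.

```python
def pagerank(links, damping=0.85, iterations=50):
    # links: dict mapping each page to the list of pages it links to.
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}
    for _ in range(iterations):
        new = {p: (1 - damping) / n for p in pages}
        for page, outlinks in links.items():
            if outlinks:
                # A page shares its rank equally among its outbound links.
                share = damping * rank[page] / len(outlinks)
                for target in outlinks:
                    new[target] += share
            else:
                # Dangling page: distribute its rank evenly to all pages.
                for p in pages:
                    new[p] += damping * rank[page] / n
        rank = new
    return rank

web = {"a": ["b", "c"], "b": ["c"], "c": ["a"]}
ranks = pagerank(web)
print(max(ranks, key=ranks.get))  # "c" is linked to by both "a" and "b"
```

Note that nothing on page "c" itself earns it the top score — its authority comes entirely from the link graph around it, which is exactly what made the old on-page tricks so much less effective.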
Ten years ago, the emphasis in SEO switched from basic page tweaking to the quest for inbound links. The industry had to change with it. Unfortunately, to paraphrase Thomas Edison, PageRank came dressed in overalls and looked like work.
Knowledge discovery from hypertext data, through text and hyperlink analysis, remains a large part of the rapidly developing science of information retrieval. These signals have been very strong in helping search engines determine relevancy and rank according to authority. Yet even though industry leaders acknowledge that SEO is much more a marketing process than a technical effort, there’s still a lot of fixation on crawler activity and indexing.
Over time, crawlers have gotten smarter and CMS developers have become more aware of sites’ need to be crawler friendly. Yet even after 10 years of research and development, inherent problems in search engine information retrieval remain. And end-user behavior is rapidly changing.
Google’s rollout of its universal results last year has already changed the playing field completely. All the major search engines have followed Google’s lead and are discovering newer methods of uncovering patterns in different types of Web content, structure, and end-user data.
End users who previously couldn’t vote for content via links from Web pages can now vote with their clicks, bookmarks, tags, and ratings. These are very strong signals to search engines, and they don’t rely on the elitism of one Web site owner linking to another or on the often mediocre crawl of a dumb bot.
SEO will give way to a new form of digital asset management and optimization. This new SEO will place a much larger emphasis on optimizing a range of file types, from PDFs to images to audio/visual.
More effort will be placed on feeds to search engines. Not just XML feeds into paid inclusion and shopping comparison, but also feeds with other types of information, such as local, financial, news, and other verticals. Mobile will become much more popular, and search will gradually become a more personalized experience.
Personalization and digital asset optimization will end 1999-style ranking reports, as search engine results will be based on blended results from end-user specifics, such as geographic location, time of day, previous searching history, and peer group preference.
Online, monitoring the customer voice will become more important than pushing a brand message. Reputation management will become more important as marketing continues its reversal from a broadcast medium to a listening medium.
Marketing into networks will see huge growth, and social search will grow with it.
Well, that’s just me throwing some things up in the air and thinking out loud. Join me at the Search Engine Watch Forum, and let’s see if we can get some lengthy, meaningful dialogue going about search in the coming five years.