I spent last week in Chicago at the Search Engine Strategies (SES) conference. I always look forward to the Chicago show. It’s very seasonal, of course. And for me personally, it represents the end of the series, so I can take a deep breath and relax, conference free, for a couple of months.
Right now, I’m burning the midnight oil scribbling away at my new book. A lot of my thoughts about search marketing’s future are coming together. In advance of the book, I published last week what I’d call not a “white paper” but more of a “thought paper.” In it, I examine new signals to search engines, a topic I’ve addressed in this column. And at last week’s SES conference, there was talk of new signals in just about every session I moderated or took part in.
In particular, the session on universal and blended search was all about integrating signals to connect end users with the content they’re looking for. The panel featured presentations from me; Larry Cornett, VP of Yahoo Search; Chris Blakely, a director at comScore; Todd Schwartz, group product manager for Microsoft’s Live Search; and Jack Menzel, senior product manager at Google.
All of the search engine guys are basically on the same page when it comes to new signals. And with over one-third of all queries online returning blended results (video, images, news, blogs, local), there’s no doubt that the end user really is looking for (and now enjoying) a much richer experience.
Something that got me thinking was Google’s announcement a little while ago that it had access to over one trillion URLs. On top of that, it reckons the number of Web pages is growing by several billion per day.
Why should that be of any concern? The fact is, neither Google nor any other search engine is ever likely to be able to crawl that many URLs in a timely fashion. So the engines have to start thinking about new ways to connect people online with the content that best satisfies their information needs — information that may be well outside the crawl.
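To see why the crawl can’t keep up, a quick back-of-envelope calculation helps. The fetch rate below is my own illustrative assumption, not a figure published by any engine:

```python
# Back-of-envelope: how long would one pass over a trillion URLs take?
# The aggregate fetch rate is a hypothetical assumption for illustration.
URLS = 1_000_000_000_000      # one trillion URLs (per Google's announcement)
FETCHES_PER_SEC = 50_000      # assumed aggregate crawl rate

seconds = URLS / FETCHES_PER_SEC
days = seconds / 86_400       # seconds per day
print(f"{days:,.0f} days")    # roughly 231 days for a single pass
```

And that’s a single pass, before the several billion new pages per day are even counted — which is the whole point: the index can never be both complete and fresh.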
And if that’s the case, we as search marketers have to be ahead of the curve and understand where search engines are going next. It’s ridiculous to imagine that the old-fashioned list of 10 blue links, pulled together using a 20-year-old protocol, could remain appealing to a much more sophisticated, always-connected Internet user.
Certainly the outcome of the universal search session was that we can expect more innovation and that means more change for us.
I also enjoyed the panel on the battle of the browsers, which focused on the controversy surrounding personalization and privacy issues. Moderated by Kevin Ryan, the panel featured Chris Sherman, executive editor of Search Engine Land; Gary Stein, Ammo Marketing’s director of strategy; and me.
Again, this is something I’ve been thinking about a lot. And it was good to hear Gary and Chris pretty much on the same page with me. In order for search engines to provide better and more relevant results, they need you to provide them with more information.
We all tend to look at search engines as though they are a black box — they have mysterious technology that nobody seems to know a great deal about. And yet, the search engine looks at the end user in much the same way. The end user is a black box, which the engines know very little about.
So there must be a trade-off. You let the search engine guys know more about you and your surfing/searching habits so the engines can build a profile. And in return, the engines will provide a much richer experience.
With the launch of Google’s new Chrome browser, there has been a lot of noise about the kind of data that Google collects. Yet, in terms of new signals, the loudest of late has been the toolbar. Each major search engine (and many other online organizations) has a toolbar add-on for the browser. And that information has been providing search engines with more data about user trails. It’s these user trails that are a strong signal for re-ranking results to better suit the end user.
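To make the idea concrete, here’s a minimal sketch of trail-based re-ranking. Everything in it — the URLs, visit counts, and blending weight — is made up for illustration; no engine has published how it actually weighs this signal:

```python
# Illustrative sketch (all data and weights are hypothetical): nudge a
# ranked result list using aggregate "user trail" visit counts, of the
# kind a toolbar might report.
def rerank(results, trail_visits, weight=0.3):
    """Blend each result's original relevance score with its trail popularity."""
    max_visits = max(trail_visits.get(url, 0) for url, _ in results) or 1
    rescored = []
    for url, score in results:
        popularity = trail_visits.get(url, 0) / max_visits  # normalize to 0..1
        rescored.append((url, (1 - weight) * score + weight * popularity))
    return sorted(rescored, key=lambda pair: pair[1], reverse=True)

# Hypothetical example: engine's scores vs. toolbar visit counts.
results = [("a.example", 0.90), ("b.example", 0.85), ("c.example", 0.80)]
trails = {"c.example": 1_000, "b.example": 100}
for url, score in rerank(results, trails):
    print(url, round(score, 3))
```

Run on the sample data, the heavily visited third result moves to the top — which is exactly the behavior the toolbar signal makes possible.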
So imagine what kind of data a search engine such as Google can accrue when it owns the entire browser. But is it really a browser? That’s another question. I commented that Andrew Tomkins, vice president of Yahoo Research, had made the observation that the HTTP protocol and HTML may not be the way for us to connect with the information we really want (as opposed to just being provided with what’s readily available).
Gary Stein commented that Chrome is actually processing all kinds of stuff below the surface, such as AJAX. And this places Chrome closer to being part of an operating system, moving away from the 20-year-old idea of Tim Berners-Lee slapping hypertext on top of the Internet and creating the World Wide Web.
And that adds even more thought to what kind of changes are likely to take place during 2009 and beyond.
You can download my thought paper, “New Signals To Search Engines: Future Proofing Your Search Marketing Strategy,” here. It’s free. Please take a little time to read it, and I would dearly love to hear any and all feedback.