In a few weeks, I’ll publish the results of my 2008 Webmaster wish list and add a few items I’d like to see in 2009. The early results aren’t great; some things I asked for just didn’t materialize. But maybe it’s my fault. Why should I have expected all that stuff when I never showed what I was thankful for?
Consequently, I’m prefacing my 2009 wish list with a list of things I’m thankful for, in honor of the Thanksgiving holiday this week in the U.S. The following tools, concepts, and collaborations have made my job easier this year, and I continue to use them religiously.
Common Robots Exclusion Protocols
Traditional robots.txt exclusion protocols were pretty easy to understand, but they never really kept up with the times. I’m thankful that in early 2008, Microsoft, Yahoo, and Google joined up to expand their robots exclusion protocol to honor a series of new characters and directives that the traditional robots.txt protocol doesn’t address. This move represented a vast expansion of the power Webmasters have to control bot access to their content, governing not only what shows up in SERPs but also how it appears.
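To make that concrete, here’s a minimal robots.txt sketch using the directives the three engines agreed to support jointly: the `*` wildcard, the `$` end-of-URL anchor, `Allow` exceptions, and `Sitemap` autodiscovery. The domain and paths are placeholders, not from any real site.

```
User-agent: *
# "*" matches any run of characters; "$" anchors the end of the URL,
# so this blocks every PDF anywhere on the site.
Disallow: /*.pdf$
# Block a whole directory...
Disallow: /private/
# ...but Allow carves out an exception to a broader Disallow.
Allow: /private/annual-report.html
# Sitemap autodiscovery, also part of the joint support.
Sitemap: http://www.example.com/sitemap.xml
```

Under the older, lowest-common-denominator protocol, patterns like `/*.pdf$` simply weren’t portable across the major crawlers; after the 2008 agreement, the same file works for all three.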
The perfect complement to this expanded robots protocol is Google’s robots.txt analysis tool in Webmaster Tools. This tool is a sandbox that enables you to plug in different URL exclusion patterns, pit them against specific URLs from your site, and see exactly how Google’s bot (and in theory, any major search engine crawler) will react to the code.
LinkedIn Corporate Profiles
I’m thankful that LinkedIn corporate profiles exist, of course, but more specifically, I’m excited that LinkedIn has opened them up to be crawled. A few months ago, I wrote about LinkedIn and begged that it open them up to bots, because back then you could view them only when signed in and engines would see the log-in screen when they requested the page.
Though I called for many of these changes, don’t confuse that with my taking credit for them. Assuming credit for LinkedIn cleaning up and enabling crawling of these pages is like standing outside at 5:30 a.m. and calling for the sun to come up. It was going to happen. Reputation managers, if you’re not on this one yet, get there.
Keyword Tools From Engines
For a long time, I’ve been only semisatisfied with the various independent keyword tools on the market. Their sample sizes are too small, forcing them to perform some unholy extrapolation to predict actual demand. They’re also frequently 30 to 60 days behind the current date, and they’re subject to the biases (geography, etc.) inherent in their samples.
I’m thankful that engines have begun to give us a glimpse into their coveted user behavior data. The predictive search feature of Yahoo Search Assist is a terrific look into searchers’ minds. I like it because it returns results in which your initial keyword is interspersed throughout the search phrase, unlike Google Suggest, whose results all begin with your keyword, even if a more popular phrase might have your keyword at the end.
But don’t count Google out, because its Insights for Search tool is another top-notch addition to official engine-based keyword intelligence. Its industry sorting, date-based search, and geographic-based sorting are terrific, but it’s the “top searches” and “rising searches” that I really dig.
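The difference between the two suggestion styles is easy to see in code. The sketch below parses a response in the shape the unofficial Google Suggest endpoint returns (`[query, [suggestions...]]`) and separates phrases that merely start with your seed keyword from those that weave it in elsewhere. The sample payload and both function names are hypothetical, for illustration only.

```python
import json

# Hypothetical sample payload in the JSON shape returned by the
# unofficial Suggest endpoint: [seed query, [suggested phrases]].
SAMPLE = '["turkey", ["turkey recipes", "turkey brine", "roast turkey", "deep fried turkey"]]'

def parse_suggestions(payload):
    """Extract the list of suggested phrases from a Suggest-style response."""
    seed, phrases = json.loads(payload)
    return phrases

def split_by_seed_position(seed, phrases):
    """Separate prefix-style suggestions (Google Suggest behavior) from
    phrases where the seed appears later (Search Assist behavior)."""
    prefix = [p for p in phrases if p.startswith(seed)]
    interspersed = [p for p in phrases if seed in p and not p.startswith(seed)]
    return prefix, interspersed

prefix, interspersed = split_by_seed_position("turkey", parse_suggestions(SAMPLE))
# prefix       -> ["turkey recipes", "turkey brine"]
# interspersed -> ["roast turkey", "deep fried turkey"]
```

A keyword tool that surfaces only the first list misses the second, which is exactly the gap the interspersed-style suggestions fill.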
Compared to 10, or even 5, years ago, the wealth of information and resources available to search marketers today is astounding. Unfortunately, your competitors have access to the same information you do. Of course, you’re a ClickZ reader, which automatically puts you a step ahead again. And the fact that you got all the way to the end of this column is something for which I’m truly thankful, not to mention a little surprised.
Join us for Search Engine Strategies Chicago December 8-12 at the Chicago Hilton. The only major search marketing conference and expo in the Midwest will be packed with 60-plus sessions, multiple keynotes and Orion Strategy sessions, exhibitors, networking events, and more.