SEO Toolbox Tips

What information is important to your natural search knowledge base? Here are a few places to check out.

If you’ve been working in the search engine optimization (SEO) field for any length of time, then you know you have to be prepared to fix any number of search-related breakdowns. To resolve most structural, contextual, or interlinking issues, you must have your toolbox at the ready to help diagnose the problem, formulate a plan, and make the repairs.

Most SEO toolboxes have some type of crawl simulator in them, a Lynx viewer, assorted keyword density checkers, header checkers, link auditors, and text analysis instruments — just to name a few. Yet at the end of the day, one of your most valuable tools usually turns out to be your fundamental SEO knowledge.

Producing high-quality natural search results is not getting any easier. Web site constructs are more complex than ever. Consistently producing great content is more challenging. The social sphere is ever changing. And link-building rules and regulations are in a constant state of flux. It’s very easy to get lost in the clatter searching for real signals through the noise.

Consequently, updating and refreshing your SEO toolbox requires a continual flow of information to keep up with the job’s demands. Keeping your SEO toolbox well organized and current has never been more challenging.

How do you know whom to read, and how do you determine what information is important to your natural search knowledge base? Better yet, how can you make the time to practice, let alone play in a sandbox?

Start by hunting down pertinent search engine blogs. It’s important to regularly read and review search engine blog posts from the official Google, Yahoo, and Microsoft Live Search sources, among others, particularly if your specialty is social media.

It’s relatively easy to organize blog-based informational resources with an RSS aggregator. My aggregator of choice is Netvibes, because it really does help me organize my digital life. But this is your SEO toolbox, so use whatever tool suits your needs, and you’ll capture nuggets of knowledge that help you perform your SEO work better.

Not all blog posts are particularly helpful, but a recent post on the official Google Webmaster Central Blog certainly caught my eye. It helped connect the dots around some key observations that folks in the industry have been touching on for quite some time: namely, the idea of preserving crawl equity and Google’s stance on finding search results within its search results.

Google Webmaster Tools has been notifying Webmasters of crawl issues for quite some time. Now, Google’s notifying Webmasters when it finds “infinite space” in a Web site. To Google, infinite space consists of “very large numbers of links that usually provide little or no new content for Googlebot to index.”

Most folks who have worked in the search industry for a while understand the idea of infinite space. The classic example Google provides in the blog post is the virtual calendar that adds URL after URL ad infinitum. That certainly will burn some serious crawl time. And if Google only crawls 10 percent of your Web site on each visit, it’s expensive crawl time that distracts the spiders from more interesting content within your site.

But Google isn’t just talking about virtual calendars anymore. Infinite space now includes pagination constructs and site filtering features of particular interest to e-commerce Webmasters. This is where things can get pretty interesting, especially when you strive to balance usability with natural search optimization. Specifically, Google may send Webmasters a warning when filtering and sorting features run amok:

    “Another common scenario is websites which provide for filtering a set of search results in many ways. A shopping site might allow for finding clothing items by filtering on category, price, color, brand, style, etc. The number of possible combinations of filters can grow exponentially. This can produce thousands of URLs, all finding some subset of the items sold. This may be convenient for your users, but is not so helpful for the Googlebot, which just wants to find everything — once!”

How can you tell whether URLs with filtering parameters are burning your crawl equity if your site isn’t set up to receive issue and error reports from the major engines? It’s pretty easy to set up Webmaster accounts, and you should start there, because much more information is available to you beyond crawling and indexing errors. But don’t sit back and wait for Google to send you a message.

You should isolate the URL pattern your filtering functions create and run some advanced searches to determine how big an issue it might be. For example, you might find that the URL construct of your Web site’s filtering functions always includes the word “search.” Simply extend the site: query in Google with the inurl: operator; your query will look something like site:www.yourdomain.com inurl:search. If commonality can be found in your title tags instead, use an intitle: query in the same fashion and drill through the details.
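In practice, the diagnostic queries look something like this, where “search” and “sort” are just stand-ins for whatever pattern your own URLs and title tags share:

    site:www.yourdomain.com inurl:search
    site:www.yourdomain.com inurl:sort
    site:www.yourdomain.com intitle:"search results"

The result count Google reports for each query gives you a rough measure of how many filtered URLs the engine has already picked up.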

If you know your Web site consists of 500 products and has five different filtering functions in place to sort by price, size, color, brand and style, you can estimate just how much of your crawl equity is being burned up by your Web site’s sorting and filtering features.
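Some quick math shows why Google calls the growth exponential. Here’s a minimal Python sketch of the estimate; the per-filter value counts are hypothetical, so plug in your own numbers:

    # Back-of-the-envelope estimate of how many filtered listing URLs a
    # faceted catalog can expose. Value counts per filter are hypothetical.
    filter_values = {"price": 5, "size": 6, "color": 10, "brand": 20, "style": 8}

    # Each filter is either unset or set to one of its values, so the number
    # of distinct filtered URLs is the product of (values + 1) across all
    # filters, minus 1 for the completely unfiltered view.
    urls = 1
    for count in filter_values.values():
        urls *= count + 1
    urls -= 1

    print(f"{urls:,} possible filtered URLs")  # 87,317 -- for just 500 products

Even if only a fraction of those combinations are actually linked from your pages, it’s easy to see how a 500-product catalog can present the spiders with tens of thousands of thin URLs.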

So do you need to toast the sorting features that actually help your visitors winnow down their product selection?

Nope.

You would not want to eliminate a site feature that helps convert visitors into buyers. But you could make a visit to your sandbox and start playing with some options, such as disallowing your site search directory in your robots.txt file, reworking the navigational constructs of the filtering functions, or adding robots meta tag instructions. Or you could just put all your filtering functionality in JavaScript or AJAX and make it relatively invisible to the spiders in the first place.
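For instance, if your filtered and search-result views all live under a /search/ directory (a hypothetical path; substitute whatever your own URL construct uses), a robots.txt entry as simple as this keeps the spiders out:

    User-agent: *
    Disallow: /search/

The robots meta tag alternative goes in the head of each filtered page and tells the engines to follow the links but keep the page itself out of the index: <meta name="robots" content="noindex, follow">. Either way, test in your sandbox first, because an overly broad rule can block pages you actually want crawled.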

Any solution you choose depends on the size of the problem. Break out your toolbox, run some diagnostics, and then test your repairs while monitoring your conversion rates. Of course, the real issue is being aware of the changes the search engines are making in the first place. There’s no better time than right now to get your information-gathering tools up and running.

Meet PJ Fusco at SES San Jose, August 18-22 at San Jose Convention Center.
