Site Architecture and SEO

December 20, 2006

You have to crawl before you can run. And before you can create keyword-targeted content, you must make that additional content manageable.

If we were to break SEO into its simplest form, we'd discuss site architecture and navigation, temporary and permanent content, and inbound links -- in that order. It makes sense to put first things first, so we start with the way the site is built.

Before we can help create keyword-targeted content that provides users and search engine spiders with relevant themes, we must usually make additional content manageable. This means the CMS must readily allow input of original permanent and temporary content on the page, as well as the entry and editing of unique title tags and meta description tags behind the page.
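For instance, an SEO-ready CMS should let us set a unique title and meta description for every page; the values below are placeholders:

    <head>
      <title>Blue Widgets: Pricing and Specs | Example.com</title>
      <meta name="description" content="Compare pricing and specifications for our full line of blue widgets." />
    </head>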

Before we can direct the search engine spiders toward prominent content, we must have the ability to create a natural hierarchy for page content. This means the site design, particularly the style sheets, must allow for optimal use of header tags, ideally a clean H1-through-H6 hierarchy.
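In practice, that hierarchy looks something like the sketch below; the headings themselves are illustrative, and the style sheets handle how they look:

    <h1>Blue Widgets</h1>
      <h2>Widget Pricing</h2>
      <h2>Widget Specifications</h2>
        <h3>Dimensions and Materials</h3>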

And before we can efficiently direct links into the site, we must efficiently funnel internal linkage throughout it. Since the internal linking structure must be readily crawlable, the site design must come under full review and assessment.

Inhibitors and Disruptors

Site structure frequently works to inhibit or disrupt the natural flow of internal linkage through a site. Different types of coding and programming require different remedies to overcome navigational flaws.

No amount of alt text will overcome the fact that spiders can't recognize images, for example. If site navigation is completely image-based, we must render the images as text-based links via CSS. This is a simple way to maintain the site's appearance while making the navigation search-optimal.
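A minimal sketch of that approach, using the common CSS image-replacement technique (class names and image paths here are illustrative):

    <ul id="nav">
      <li><a href="/products/" class="nav-products">Products</a></li>
      <li><a href="/support/" class="nav-support">Support</a></li>
    </ul>

    #nav a.nav-products {
      display: block;
      width: 120px;
      height: 30px;
      background: url(/images/nav-products.gif) no-repeat;
      text-indent: -9999px; /* pushes the text off-screen; spiders still read the anchor text */
    }

The navigation still renders as the designer's images, but every link resolves to plain, crawlable anchor text.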

When Flash navigation is used, further steps must be taken to swap out the link structure buried inside the movie, which is invisible to spiders, for a crawlable one. When a site is designed entirely in Flash, we can focus on building a low-resolution alternative for spiders to crawl and screen readers to read, or build some contextual strength into the site with the Macromedia Flash SDK. Either way, we still need to link different site elements together in a manner that can be readily crawled and indexed by search engine spiders.
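One common pattern, though by no means the only one, is to nest plain HTML links inside the object element so anything that can't render the movie gets a crawlable fallback (file names are illustrative):

    <object type="application/x-shockwave-flash" data="/flash/nav.swf" width="760" height="90">
      <param name="movie" value="/flash/nav.swf" />
      <!-- Fallback for spiders and screen readers that can't render Flash -->
      <ul>
        <li><a href="/products/">Products</a></li>
        <li><a href="/support/">Support</a></li>
      </ul>
    </object>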

When AJAX is used to design a site, URL creation is the first obstacle to overcome. Though the user experience is no doubt exceptional on an AJAX-intensive site, the crawling experience is surely disrupted. Creating static pages on the fly is a good place to start, but these pages must be linked to each other to form a crawlable site structure. Footers and site maps can complement site crawling, but they are by no means an optimal solution for building greater site relevancy.
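A sketch of that idea: give each AJAX view a real, static URL in the href, and let a script intercept the click for JavaScript-capable browsers (loadPanel is a hypothetical function):

    <!-- Spiders follow the static URL; browsers run the script instead -->
    <a href="/products/widgets.html"
       onclick="loadPanel('/products/widgets.html'); return false;">Widgets</a>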

Although the major search engines can handle some dynamic parameters in URLs without choking on them, session IDs in URLs remain a spider killer. To overcome disruptive use of session IDs, URL rewrites are usually in order. But URL rewrites must commonly be accompanied by a database of permanent redirects and efficient use of the robots.txt file to keep the spiders crawling a site on a regular basis.
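As a rough sketch, assuming Apache's mod_rewrite and a PHP-style PHPSESSID parameter, a rule like this permanently redirects session-ID URLs to their clean equivalents (for simplicity it drops the whole query string; a production rule would preserve any other parameters):

    RewriteEngine On
    # 301 any URL carrying a session ID to the clean, parameter-free URL
    RewriteCond %{QUERY_STRING} PHPSESSID=
    RewriteRule ^(.*)$ /$1? [R=301,L]

And because Google honors wildcard patterns in robots.txt, an entry like this can keep Googlebot out of any session-ID URLs that slip through:

    User-agent: Googlebot
    Disallow: /*PHPSESSID=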

Appended URLs with tracking parameters create additional disruption to the flow of a site's navigation, as well as duplicate content that further inhibits building relevant themes. Rewrites and redirects can help here, too, but we might want to rethink how we track users through a site if the system inhibits a search-optimal presentation of the site.
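One way to rethink the tracking, sketched here with a hypothetical src parameter: capture the value in a cookie, then 301 to the canonical URL so only one version of the page gets indexed:

    # Assumes Apache mod_rewrite; "src" and the domain are illustrative
    RewriteCond %{QUERY_STRING} ^src=([^&]+)$
    # Stash the tracking value in a cookie, then redirect to the clean URL
    RewriteRule ^(.*)$ /$1? [R=301,L,CO=src:%1:.example.com]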

Tools to Use

For crawling audits, run key pages through a spider simulator to see them the way the spiders do.

For indexing audits, use "site:" command strings in Google and MSN, such as site:www.clickz.com, or Site Explorer in Yahoo.

Remember, numbers provided by the search engines are an estimated number of pages indexed. It still takes a drill-down to determine if your site suffers from any of the crawling inhibitors or disruptors discussed here today.
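A couple of drill-down queries of that sort, using Google's inurl: operator (the parameter names are illustrative):

    site:www.example.com inurl:PHPSESSID    (session-ID URLs in the index)
    site:www.example.com inurl:src=         (tracking-parameter duplicates)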

In Summary

What appear to be additional layers of complexity within a site structure actually make the site simpler for the spiders to crawl. Since crawling is our first hurdle toward successful indexing, it only makes sense to be certain our sites can be efficiently crawled by search engine spiders.

By paying close attention to how well search engine spiders crawl our sites, we can take the first steps toward improving the indexing of our sites. Once we understand how well the site is indexed, we can move toward improving site relevancy on a theme-by-theme, page-by-page basis.

Want more search information? ClickZ SEM Archives contain all our search columns, organized by topic.


ABOUT THE AUTHOR

P.J. Fusco

P.J. Fusco has been working in the Internet industry since 1996 when she developed her first SEM service while acting as general manager for a regional ISP. She was the SEO manager for Jupitermedia and has performed as the SEM manager for an international health and beauty dot-com corporation generating more than $1 billion a year in e-commerce sales. Today, she is director for natural search for Netconcepts, a cutting-edge SEO firm with offices in Madison, WI, and Auckland, New Zealand.
