Every year or so, I look closely at SEM best practices for launching a redesigned site. Over the last four or five years, much of the advice has stayed the same. Today, however, I discuss some techniques that may have changed since the last time you launched a site update.
Rapid Crawling and Indexing
Launching a redesigned site can create a bumpy ride as old content moves to different locations. Assuming most of your pages are complete and ready for prime time, expediting the crawling and indexing of your new pages is the most important SEM-related task to execute efficiently.
One easy way to do this is to include your new URLs in your XML site map feed the minute the new site is live. That’s not new. But for over a year, Google has offered exceptionally flexible XML site map submission on which few sites seem to capitalize.
You can mix and match content in your site map files to make it much easier to create and manage the feeds. Suppose, for example, that your redesign takes several different groups of pages that used to reside on your www subdomain and distributes them among four new subdomains. In the old days, you would have to verify each of those subdomains through Webmaster Tools, then place a site map file on the root of each of those subdomains.
You still need to verify all the subdomains, but you now have the option of putting all five subdomains’ pages into a single XML feed and placing the file on any of the five subdomains.
Or, if it’s more convenient, you can still create five different XML feeds and submit all of them through a single site’s Webmaster Tools area.
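To make the cross-host option concrete, a single feed mixing hosts might look like the sketch below. The hostnames are illustrative, and remember that only Google currently accepts URLs from a host other than the one serving the file, and only once every host involved is verified:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- Pages from the main site and a separate subdomain in one feed.
       Both hosts must be verified in Google Webmaster Tools;
       example.com is illustrative. -->
  <url>
    <loc>http://www.example.com/products/widgets</loc>
  </url>
  <url>
    <loc>http://support.example.com/faq</loc>
  </url>
</urlset>
```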
The official sitemaps.org site, however, clearly states that “all URLs listed in the Sitemap must reside on the same host as the Sitemap. For instance, if the Sitemap is located at http://www.example.com/sitemap.xml, it can’t include URLs from http://subdomain.example.com.”
Consequently, it's safe to assume that, for now at least, only Google honors these cross-host tactics. Yahoo and Live Search do, however, both honor the standard robots.txt autodiscovery technique.
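Autodiscovery simply means adding a Sitemap line to your robots.txt file; all three major engines will find the feed there without any manual submission. A minimal example (example.com is illustrative):

```
User-agent: *
Disallow:

# Sitemap autodiscovery: engines that fetch robots.txt will
# find and crawl this feed automatically.
Sitemap: http://www.example.com/sitemap.xml
```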
Prioritizing URLs for Redirection
Let’s assume for argument’s sake that your URLs must change, due to platform constraints or some other technical issues. In a perfect redesign scenario, your URLs would already be short and user-friendly and wouldn’t need to change, and unicorns with cotton-candy manes would write your titles and meta descriptions.
If you have a site with thousands of pages, it's not always technically feasible to redirect each one to its new counterpart. If you have limited resources to devote to deciding which URLs merit redirection, I typically recommend two distinct tests.
First, measure the most popular entry pages. Many sites follow an 80/20 or 90/10 pattern, in which a large percentage of traffic enters the site through a relatively small percentage of its URLs. And by "entry pages," I mean entry points from all sources, not just organic search. When it comes to choosing important URLs, each entry point is equally important.
A supplementary method for choosing URLs worthy of redirection is to use a system such as Yahoo Site Explorer or Google Webmaster Tools to discover which of your site’s URLs have the most incoming links. These are likely the URLs by which the most traffic enters. Redirection is important for these URLs for the added reason that they likely have the most authority to pass to additional, deeper pages on your site.
Typically, these two tests produce very similar lists of URLs. In addition, don't forget your site's HTML site map. It will likely surface under neither test above, but it's one of the most important URLs on your site.
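If your analytics package can export entry pages with visit counts, the 80/20 selection can be automated. Here's a minimal sketch in Python, assuming a hypothetical list of (URL, visits) pairs exported from your analytics tool; the coverage threshold is up to you:

```python
from collections import Counter

def top_entry_urls(entries, coverage=0.9):
    """Given (url, visits) pairs, return the smallest set of URLs
    that accounts for at least `coverage` of all entry traffic."""
    counts = Counter(dict(entries))
    total = sum(counts.values())
    chosen, running = [], 0
    for url, visits in counts.most_common():
        chosen.append(url)
        running += visits
        if running / total >= coverage:
            break
    return chosen

# Hypothetical analytics export: (entry URL, visits).
entries = [
    ("/products/widgets", 5200),
    ("/", 3100),
    ("/blog/launch-post", 900),
    ("/about", 300),
    ("/contact", 120),
]

print(top_entry_urls(entries))
```

The URLs that survive the cut are your redirect priorities; everything else can fall through to a proper 404.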
When Two Wrongs Make a Right
Occasionally, so many things go wrong in a redesign (usually because of rigid launch schedules) that some positive things actually emerge. I've seen several instances of sites that had no time to properly redirect the vast majority of their URLs, coupled with a new 404 error page that erroneously produced a 302 redirect to an error page that returned a 200 code.
Neither of these examples is ideal, but together they bought the client time. Under ideal circumstances, the site's old URLs would produce a 404 error after relaunch and drop out of the index. The faulty 302 error page, however, gave them extended life, since engines were technically informed that the pages' content was temporarily located elsewhere (in this case, at an error page). Once the redirects were implemented and the 302 redirect changed to a 301, the URL switch went relatively smoothly.
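To make the corrected end state concrete, here is what it might look like under Apache; the paths and hostname are illustrative:

```apache
# Permanent (301) redirects mapping old URLs to their new homes.
Redirect 301 /old-products/widgets http://www.example.com/products/widgets
Redirect 301 /old-about.html       http://www.example.com/about

# A local path here preserves the original 404 status. Note that a
# full URL (http://...) would instead cause Apache to issue a 302
# redirect -- exactly the mistake described above.
ErrorDocument 404 /errors/not-found.html
```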
This sort of serendipity is rare. I don't encourage developers to purposely introduce errors in hopes of a similar outcome.
While some techniques are new or recently updated, the SEM fundamentals behind relaunch remain the same: provide engines a way to find the new content as quickly as possible, and associate old and new content when applicable. This won’t ensure a relaunch period that is completely error-free, but it will minimize any issues you may encounter while engines process your new content.
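One way to sanity-check that old-and-new association after relaunch is to request each old URL without following redirects and confirm it answers with a 301 and the right Location. A minimal Python sketch; the URLs you feed it would come from your own redirect list:

```python
import urllib.request
import urllib.error

class NoRedirect(urllib.request.HTTPRedirectHandler):
    """Stop urllib from silently following redirects, so we can
    observe the status code the server actually sent."""
    def redirect_request(self, req, fp, code, msg, headers, newurl):
        return None

def check_redirect(url):
    """Return (status_code, Location header or None) for one request."""
    opener = urllib.request.build_opener(NoRedirect)
    try:
        with opener.open(url) as resp:
            return resp.status, None
    except urllib.error.HTTPError as e:
        return e.code, e.headers.get("Location")
```

Run it over your prioritized URL list after launch: anything that comes back (404, None) or (302, ...) instead of (301, new-url) still needs attention.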