Search engine marketers have plenty to worry about. Just like we can never be too rich or too thin, our sites can’t be too optimized, content too prolific, or high-quality incoming links too numerous. But some problems are better left untreated, because they’re not problems at all. Following is a list of maladies whose effects live only in our minds and seldom harm our search performance.
Terrible Sites Are Linking to Mine
It’s a little disconcerting to look at your site’s incoming links and see site after site of little redeeming social value. It’s even scarier that these sites frequently scrape little pieces of your site (along with content from other sites) to build a unique blend of content, typically focused around a specific topic.
Fortunately you have little to fear from such sites, because algorithms are pretty good at identifying them and stripping their ability to perform well. Engines long ago realized that while links are an algorithmic economy’s currency, you can’t control who links to your site. Consequently, don’t expect to be punished based on who links to you. Anything else would be chaos, because competitors could hurt each other’s sites with impunity by purchasing nefarious links to them.
(I wrote about one rare but potentially harmful exception to this rule last month. If your on-site search function is set up improperly, some sites might link to you in ways designed to make their own pages more popular.)
In terms of links, worry instead about sites you link to. If you link to sites that link to nasty link neighborhoods, you’re more likely to be guilty by association.
Something Is Wrong With My Site Map
Whether we’re talking about an on-site HTML site map or an XML site-map feed, it’s tempting to blame the file itself if engines aren’t crawling and indexing your site’s deep pages. Chances are, however, it’s not the site map’s fault. In my experience, the main obstacles to significant deep indexing are navigational problems and a shortage of links to deeper portions of your site.
Navigational issues can include script-based navigation menus, Flash navigation, dynamic links with too many parameters, and links that impose session IDs on URLs. Each is an impediment to crawlers, which often either can’t penetrate the code deeply enough to find a URL or simply choose to abandon the process altogether.
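To illustrate the last two items, compare these hypothetical URLs (the domain and parameters are placeholders, not examples from any real site):

```
# Crawler-friendly static URL
http://www.example.com/widgets/blue-widget.html

# Dynamic URL with too many parameters -- crawlers may truncate or skip it
http://www.example.com/product.asp?cat=12&sub=43&item=987&ref=nav&sort=2

# Session ID appended to the URL -- each crawl sees a "new" URL,
# risking duplicate content or abandoned crawls
http://www.example.com/product.asp?item=987&sessionid=8A3F0C21B
```

The fewer parameters a deep URL carries, and the more stable it is from visit to visit, the more likely a crawler is to follow and index it.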
Remember, the purpose of an XML site-map feed is to supplement, not replace, a coherent on-site navigation structure. If engines can’t eventually find pages by crawling your site, don’t expect them to index them simply because they exist in your XML feed.
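For reference, a minimal XML site-map feed following the sitemaps.org protocol looks like this (the URL and date are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/deep/page.html</loc>
    <lastmod>2008-01-15</lastmod>
  </url>
</urlset>
```

Listing a deep page here tells engines the URL exists, but it doesn't substitute for crawlable links pointing to that page from within your site.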
My Pages Have No PageRank
While PageRank is just one of a hundred or more factors that help determine how a site ranks at Google, it’s the only one with its own button on the Google Toolbar. Consequently, people often assign it more weight than is probably wise. (Many seasoned search marketers advocate ignoring the bar altogether, which is fine, although I believe it’s possible to glean some good information from it.)
People sometimes panic when they look at Toolbar PageRank (TPR) and their sites show solid gray or white where they expect to see green. This reflects good intentions, but it's time poorly spent. First, Google typically updates its public display of PageRank only every three to six months, so what you see may be badly out of date. Second, depending on the query and the content of the two pages, a page with a TPR of zero can easily outrank a page with a higher TPR.
Determine instead whether your page is indexed, not whether it has toolbar PageRank, because only an indexed URL can perform for a query, regardless of its PageRank. The fastest way to see if a URL is indexed is to search for it in the Search box of any major engine. At Google, you’ll need to precede the URL with “info:” (no quotes) and with no space between the colon and the URL (e.g., info:www.clickz.com).
My Site Doesn’t Validate
Many SEM firms preach the gospel of validation, testifying that without valid, W3C-compliant code, your site will wither under search engine scrutiny. That advice isn't complete nonsense, but don't take it at face value either.
In reality, an engine just needs to see the copy and crawl the links on the page. How can you tell whether it can? Check the engine's cached version of the page and confirm that the text and links show up appropriately.
As an online marketer, your to-do list is full enough without having to chase down ghosts. Don’t be fooled into thinking that everything is a problem just because it doesn’t meet an arbitrary benchmark of success.
Want more search information? ClickZ SEM Archives contain all our search columns, organized by topic.