Now we’ll look at the remaining metrics and suggest ways to correct the traffic drops that accompany their decline.
Monitoring index counts on a daily basis can cause a great deal of stress and anxiety. Even when site performance is normal, search engine index counts can fluctuate wildly. As with the other metrics I’ve discussed, I would consider a sharp drop in index count a concern only when it accompanies a drop in traffic, and then only if you can establish beyond a reasonable doubt that the index count drop is causal, not coincidental, to the traffic drop. In other words, if your index count drops by 30 percent at Google but the newly missing pages never brought in search traffic anyway, a plummeting index count isn’t the source of your traffic problem.
That said, index count drops are particularly vexing because without an encyclopedic knowledge of your site’s content and a solid “theoretical” index count in your head (i.e. the number of pages your site “should” have, based on your calculations), it can be hard to know exactly which pages were previously indexed but no longer are.
Canonical filtering is a possible reason behind an index count drop. If you have pages indexed under both the www and non-www version of your site, make sure one redirects (via 301) to the other. In the short term, this can halve your index count overnight and cause a brief lag in search traffic (especially if pages from both the www and non-www version are performing well). In the long term, your site is stronger for it.
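On an Apache server, one common way to consolidate the www and non-www versions is a mod_rewrite rule in your .htaccess file. This is a sketch only; example.com is a placeholder for your own domain, and your host’s configuration may differ:

```apache
# Permanently (301) redirect all www requests to the non-www hostname.
# Replace example.com with your own domain.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^www\.example\.com$ [NC]
RewriteRule ^(.*)$ http://example.com/$1 [R=301,L]
```

To consolidate in the other direction (non-www to www), invert the condition so it matches the bare hostname instead. Either way, pick one canonical version and stick with it.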
If you can identify particular “veins” of pages that dropped out of the index, consider altering your site’s navigational structure to ensure crawlers have more opportunities to find the missing pages. In addition, examine the content of pages that dropped from the index to see if it’s similar to other content, either on your site or on others. This is a particularly difficult situation for catalog sites, as their primary content consists of displaying manufacturer-generated sales copy. Overall, unique, user-focused content with multiple, engine-friendly (i.e., href-based) links to it (both inbound and possibly from third-party sites) shouldn’t have problems being indexed.
Last week, the major search engines sponsoring Sitemaps.org provided users with an additional way to show their content to crawlers: if you add the URL of your XML sitemap feed to your robots.txt file, engines will now discover its location automatically. With the recent addition of Ask.com to the list of participating engines, an XML sitemap is critical for sites whose pages aren’t fully indexed by a regular crawl.
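The robots.txt addition is a single line. A minimal sketch, with example.com standing in for your domain:

```
# robots.txt — the Sitemap directive tells participating engines
# where to find your XML sitemap. The URL must be fully qualified.
User-agent: *
Disallow:

Sitemap: http://www.example.com/sitemap.xml
```

The Sitemap directive is independent of any User-agent block, so it can appear anywhere in the file.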
Inbound Link Counts
Each engine measures (and reports) links differently, so expect constant minor fluctuation and numbers that disagree from engine to engine. The disappearance of 500 links from a single dynamic site may say more about how an engine sees that site’s pages than about how it measures links to your site.
If traffic takes a hit and you believe a corresponding drop in inbound links is related, it could mean engines aren’t recognizing certain pages: either the pages that used to link to you, or the pages on your site being linked to. In either case, check to see if those URLs are still indexed.
Another point seems simple, but is worth mentioning: If external sites link to a page on your site that no longer exists, you no longer benefit from any link popularity. Before you remove a page, make sure to implement a 301 redirect to its new version, even if only one external link points to it. (And if you’re pulling a page that has no new version, redirect it to the root.)
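On Apache, these page-level redirects can be a pair of mod_alias directives. The paths below are hypothetical placeholders, assuming one page has a direct replacement and another does not:

```apache
# Preserve inbound link value when removing pages.
# /old-product.html has a direct replacement; /retired-page.html
# has none, so it falls back to the site root.
Redirect 301 /old-product.html /new-product.html
Redirect 301 /retired-page.html /
```

A 301 (permanent) redirect, rather than a 302, is what signals engines to transfer the old URL’s link value to the new one.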
Also, remember search engines typically show links of all sorts, including links that help your site, links from bad neighborhoods, and “nofollow” links from social bookmarking sites and blog comments. Only the disappearance of high-quality, non-nofollowed links should concern you. If sites no longer link to you for what appears to be an editorial reason, find out why. The site may recently have been redesigned, and external links were dropped by mistake. In such a case, contact the other site’s Webmaster.
Another possibility is you might link to sites that make others uneasy about linking to you. Therefore, link only to high-quality, relevant sites. Make sure your site is just as relevant.
All the points above are subsets of organic traffic measurement, so there’s little more to say about drops in overall organic traffic.