A programmer friend recently contacted me about a site whose search traffic had nosedived. He works on the back end and has little influence on SEO. But his boss was feeling the pain, creating a ripple effect through the company. I posed several questions, which he forwarded to his boss. The boss couldn’t answer any of them quickly, which made me think about how important it is to understand some basic organic search benchmarks, even during periods of high search traffic. In the unfortunate event that organic search traffic declines from its norm, a clear, multidimensional definition of normal is critical when diagnosing and fixing the problem.
Following are the topics I inquired about, along with some additional explanation:
- Significant changes made to the site — with dates. My friend’s boss said no major changes had been made to the site in the last 48 hours. That’s not where I would have looked. It’s highly unlikely the engines would have crawled, processed, and re-ranked the majority of a site’s content in that timeframe. I was more concerned with significant changes to internal linking, URL structure, un-redirected URL changes, and major additions of new pages within the past four weeks or so. Whether these changes actually cause a traffic drop is very complex and a topic for another column, but you should keep a log of all major site changes together with implementation dates.
- Sample rankings for large and small traffic producers. Many site owners already know their money phrases: the terms that consistently produce organic search traffic. Without running position reports (which are a waste of time), have an idea of your rankings for a handful of terms, along with the search traffic they produce. Run these queries manually, because looking at a real SERP tells you much more than a position report: who else is on the page, whether Google uses any sort of OneBox for such queries, and whether any new sites have recently emerged as contenders for your terms.
- Total number of referring keywords from search engines. Not too many SEO professionals discuss this number, but I watch it closely. The specific number of referring phrases is a terrific way to measure the quality of your content and its ability to rake in long-tail queries. Keep a running monthly total of how many specific phrases bring in search traffic.
- Entry pages. Somewhat similar to referring keywords is the total number of pages used as entry points from search engines. This is another great way to measure how content quality matches up with user keyword behavior. It can also indicate how quickly new content is crawled and indexed.
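Both of these running totals can come straight out of your referrer logs. Here's a minimal sketch of the idea, assuming a simple two-column log of (referrer URL, landing page); the file layout and the sample rows are illustrative, and the query-string keys reflect how the major engines passed search phrases in their referrer URLs:

```python
# Sketch: count unique referring search phrases and unique entry pages
# from a referrer log. The log format here is a hypothetical example.
from urllib.parse import urlparse, parse_qs

# Query-string keys the major engines used for the search phrase.
QUERY_KEYS = {"q", "p", "query"}  # Google/MSN used q; Yahoo used p

def extract_phrase(referrer):
    """Return the search phrase from a search-engine referrer URL, if any."""
    qs = parse_qs(urlparse(referrer).query)
    for key in QUERY_KEYS:
        if key in qs and qs[key][0].strip():
            return qs[key][0].strip().lower()
    return None

def monthly_counts(rows):
    """Return (unique referring phrases, unique entry pages) for one month."""
    phrases, entry_pages = set(), set()
    for referrer, landing_page in rows:
        phrase = extract_phrase(referrer)
        if phrase:
            phrases.add(phrase)
            entry_pages.add(landing_page)
    return len(phrases), len(entry_pages)

# Made-up sample rows: one phrase arrives via two engines, another via one.
rows = [
    ("http://www.google.com/search?q=blue+widgets", "/widgets/blue"),
    ("http://search.yahoo.com/search?p=blue+widgets", "/widgets/blue"),
    ("http://www.google.com/search?q=widget+repair+tips", "/blog/repair"),
]
print(monthly_counts(rows))  # → (2, 2)
```

Logging both numbers each month gives you the trend lines the column recommends, without depending on any one analytics vendor.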
- Index counts. Always have an idea of how many of your site pages the major engines know about. Compare this number to the number of pages your site truly has. Watch monthly to see whether those numbers get closer or further apart.
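One simple way to watch that gap is to track index coverage as a percentage over time. A tiny sketch, with hypothetical figures:

```python
# Sketch: compare an engine's reported index count against the number of
# pages the site truly has, month over month. All figures are made up.

def coverage(indexed, actual):
    """Percentage of the site's real pages the engine reports as indexed."""
    return round(100.0 * indexed / actual, 1)

history = [("Jan", 1800, 2400), ("Feb", 1950, 2450), ("Mar", 1700, 2500)]
for month, indexed, actual in history:
    print(month, coverage(indexed, actual))  # e.g. Jan 75.0
```

A coverage number drifting downward while the site grows is exactly the kind of divergence worth jotting down before trouble starts.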
Also scroll through your Google index count SERPs to see if and when your supplemental results begin. While few people agree about exactly what characteristics relegate a page to the supplemental index, most agree pages in it have a more difficult time ranking for competitive phrases.
Additionally, some queries, such as “site:yourdomainhere.com *** -view,” allegedly filter out all but supplemental results. This is contentious, however; I’ve seen instances of URLs that show the supplemental label inconsistently.
- Inbound link counts. Yahoo Site Explorer and Google Webmaster Tools offer excellent reports showing inbound links to your site, both internal and external, down to the page level. Download these tables once a month or so, and watch the number of links pointing to your domains.
- Organic traffic. The gold standard of organic search metrics is, of course, organic traffic. Know raw search referral numbers, as well as roughly what percentage of overall traffic comes from search. This could vary wildly day to day, but over several months trends should emerge.
Also monitor engine traffic ratios. While no industry is represented the same way across every engine, a well-optimized site will generally pull in at least twice as much search traffic from Google as from Yahoo, and twice as much from Yahoo as from MSN (a 4:2:1 ratio). At the other extreme, a Google:Yahoo:Microsoft ratio of 10:3:1 wouldn’t surprise me. Keep a rough estimated ratio in mind as you watch monthly traffic.
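The ratio itself is simple arithmetic: normalize each engine's referral count against the smallest. A quick sketch, with invented monthly numbers:

```python
# Sketch: express monthly search referrals as a Google:Yahoo:MSN ratio
# normalized to MSN = 1, for comparison against your baseline.
# The referral counts below are hypothetical.

def engine_ratio(google, yahoo, msn):
    """Normalize three engines' referral counts so the MSN term is 1.0."""
    base = msn if msn else 1  # avoid division by zero on a dead month
    return (round(google / base, 1), round(yahoo / base, 1), 1.0)

# e.g. 8,400 Google, 4,100 Yahoo, and 2,050 MSN referrals in one month
print(engine_ratio(8400, 4100, 2050))  # → (4.1, 2.0, 1.0)
```

A month where the first term collapses while the other two hold steady points at a Google-specific problem rather than a sitewide one.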
Even if you haven’t been watching these numbers on a monthly basis, many of these metrics are available through a decent Web analytics program. Others, such as index counts, sample rankings, and inbound link counts, aren’t. Start monitoring these now. Don’t obsess, but do jot down the numbers for future reference.
Next, I’ll revisit these metrics and offer some guidance about what might be going on when specific numbers are dropping.