Over the past three weeks, I’ve read just about everything possible about how Google no longer treats the “nofollow” link attribute as it originally did. The attribute was designed to tell search engines not to pass authority through a link. The kicker is that Google quietly stopped honoring that behavior more than a year ago, and no one noticed.
This revelation unleashed the honking of a gaggle of SEO practitioners, all thrilled at how wise they’d been all along to avoid such an implementation.
A little later, I’ll go into why SEO practitioners might have used that technique. In the meantime, I thought of an imperfect analogy to explain why the focus of this issue should be Google, not the webmasters who listened to them.
Link Weight = Homes for Kittens?
Suppose, for example, that I opened a store. And when it opened, I told potential customers that I would donate a certain portion of each purchase to help homeless kittens find homes.
Well, people went crazy, and they bought a ton of stuff, because they really wanted to help those little kitties.
A year later, at a conference, I sat on a panel about homeless kittens and how to house them. Various techniques were bandied about, and someone reminded the crowd that any purchases from my store would go to help them.
“Well,” I said, “Not exactly. You see, about a year ago, I actually stopped donating to the homeless kittens.”
Naturally, the room went nuts. I had, after all, said that I would apply a certain amount of resources toward a certain goal, and without notice, I had discontinued that.
I felt like my defense was sound: “For crying out loud, look at all the streets! I assumed you all noticed all the homeless kittens still living out there, which should have told you that I’d stopped contributing!”
They weren’t buying it. Stupid kitten lovers.
My Own Experience with Nofollow
As an example of how many sites implemented nofollow, I’ll use one client from 2008 for whom an internal nofollow strategy seemed to make sense. The client is an aggregator of consumer electronics, similar to Amazon or PriceGrabber.
My analysis suggested that nofollow might be an effective method of controlling the duplicate pages created when users sorted the same products by price, customer rating, and so on. A competitive analysis showed that in this specific vertical, some of the successful sites were using a similar strategy and some weren’t. Similarly, among the less successful sites in that vertical, there was a mix of sites that did and didn’t use the attribute.
Still, I recommended proceeding with the strategy, and here’s my quote from the executive summary of the report:
We believe it’s too early to tell whether these techniques are fully or even partially responsible for competitors’ successes (or failures). We have made some recommendations concerning use of “nofollow”, robots, and meta tag exclusion in this document, and these recommendations complement other architectural recommendations that we’ve used successfully for years. We believe this combination is optimal for [client].
What’s critical here is that the nofollow component was part of a 31-page document that covered, in addition to duplicate content issues, about 25 other aspects of the client’s site. The client implemented the suggested changes and is quite happy with the results.
Isolated, single-variable client-based testing is great, but it’s extremely rare in the wild. And with engines tuning their algorithms dozens or hundreds of times per year, even when you have time to test a single technique at a time, you can’t always be 100 percent sure that the technique itself is responsible for any changes that you see.
That’s one reason many SEOs may not have noticed nofollow’s recent impotence. Real-world SEO frequently involves taking everything “good” you have and applying it to the client site, all at once.
More Than a Single Quote
But back to the point. You might wonder, given all the recent mockery of SEOs who employed the technique, what exactly made me think the “nofollow” usage was something for the “good” column.
Most people think that SEOs derived their justification for nofollow usage from an obscure Matt Cutts quote about “link-level granularity.” Actually, there’s quite a bit more justification, such as the transcript of this Google Webmaster chat from March 2008, which covers the exact usage I recommended to my electronics client:
Q: Would you recommend nofollowing the sorting links?
A: If those links lead to duplicate content, you can either add a nofollow to the links or block indexing through meta tags or a robots.txt entry.
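For readers unfamiliar with what those options actually look like, here is a rough sketch of all three mechanisms the answer names. The site, URLs, and sort parameter are invented for illustration:

```html
<!-- Option 1: add rel="nofollow" to the sorting links themselves -->
<a href="/tvs?sort=price" rel="nofollow">Sort by price</a>
<a href="/tvs?sort=rating" rel="nofollow">Sort by rating</a>

<!-- Option 2: a robots meta tag in the <head> of the sorted
     (duplicate) pages, keeping them out of the index while
     still letting the crawler follow their links -->
<meta name="robots" content="noindex, follow">
```

```
# Option 3: a robots.txt entry blocking the sort parameter
# (Google supports the * wildcard in robots.txt paths)
User-agent: *
Disallow: /*?sort=
```

Each approach trades off differently: nofollow works link by link, the meta tag works page by page, and robots.txt blocks crawling of the whole URL pattern.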
Another recommendation for this technique came a few months later in this August 2008 post from the Webmaster Central blog:
Another common scenario is websites which provide for filtering a set of search results in many ways. A shopping site might allow for finding clothing items by filtering on category, price, color, brand, style, etc. The number of possible combinations of filters can grow exponentially. This can produce thousands of URLs, all finding some subset of the items sold. This may be convenient for your users, but is not so helpful for the Googlebot, which just wants to find everything – once!
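To see the exponential growth the post describes, it helps to put invented numbers on it. A modest set of independent filters multiplies into a crawl burden far larger than the catalog itself:

```
category (12 values) × price band (8) × color (10) × brand (25) × style (6)
  = 12 × 8 × 10 × 25 × 6
  = 144,000 possible filter combinations
```

That’s 144,000 distinct URLs, each pointing at some subset of the same products — which is exactly why the Googlebot “just wants to find everything – once!”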
The post first recommends a robots.txt exclusion, then adds:
Another option is to block those problematic links with a “nofollow” link attribute. If you’d like more information on “nofollow” links, check out the Webmaster Help Center.
To suggest that SEOs took a single quote and extrapolated it to suit their own PageRank-sculpting desires is simply incorrect.
The SEO peanut gallery is a two-part chorus. When things appear to go south, the melody tells you that you should have listened to Google and followed their guidelines. Recently, the harmony chimed in, asking how you could have been dumb enough to listen to Google. You can’t please everyone, so just keep pleasing your clients. And if you get a chance, please think of the kittens.