This week, I’m in New York mentally downloading all the data I picked up at Search Engine Strategies (SES) San Jose. And if there’s one thing I picked up loud and clear, it was the word “signal.”
For years, we’ve been talking about links using terms such as “hubs and authorities,” “in degree,” and “connectivity data.” But these days, you’ll hear the search engine reps talk about links as a signal. During a link-building Q-and-A session, I must have heard the word “signal” about 100 times from the various panel members.
There was also a sense of déjà vu when search engine reps talked about links. I remember conferences three or four years ago when they were happy to downplay the importance of meta tags and on-page factors compared to the power of linkage data.
Now it appears they are downplaying the power of linkage data as just another (albeit important) signal among many others.
What’s a “Signal”?
In Chris Anderson’s must-read book, “The Long Tail,” there’s a section called “Is the long tail full of crap?” Here, he refers to the field of information theory and how it was built around the problem of pulling coherent signals from random electrical noise. This occurred first in radio broadcasts, then in any sort of electrical transmission.
The notion of a signal-to-noise ratio is now broadly used to refer to any instance where clearing away distraction is a challenge. In the long tail, noise is a huge problem. And search certainly has a very long tail. This is why it needs filters.
It’s also interesting how Anderson in that same section says Google taps into “the wisdom of the crowd itself and turns a mass of incoherence into the closest thing to an oracle the world has ever seen.”
Let’s be clear: links may now be just one signal among many to a search engine, but I find them to be a pretty loud, potent one. Links send a number of important signals. First, the number and quality of incoming links can signal your standing in the community, or your authoritativeness.
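To make that first signal concrete, here’s a minimal sketch in Python of its crudest form: counting inbound links per page (in-degree). The link graph here is invented for illustration; real engines weight links by quality rather than counting them equally.

```python
from collections import Counter

# Hypothetical link graph: each tuple is (source page, target page).
links = [
    ("blog1", "site_x"),
    ("news2", "site_x"),
    ("forum3", "site_x"),
    ("blog1", "site_y"),
]

# In-degree: the raw count of inbound links, the simplest authority signal.
in_degree = Counter(target for _, target in links)
```

With this toy graph, `site_x` has an in-degree of 3 and `site_y` of 1, so `site_x` would look more authoritative under this naive measure.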
Link anchor text can often tell a search engine more of what a target page is about than the page itself. Of course, this depends on the nature and class of the end-user query.
A lot of research has been done on the relationship between anchor text and query terms. When anchor text terms match a query, the target document will usually be very relevant to the query terms. (I usually find there’s a relationship between anchor text and both query and title tag terms. The closer they’re related, the better a document seems to rank for certain query classes.)
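As an illustration of that matching idea only (the pages, anchor texts, and query below are all invented), here’s a toy scoring function: a target page scores higher when its inbound anchor text covers more of the query’s terms.

```python
def term_overlap(anchor_terms, query_terms):
    """Fraction of distinct query terms that appear in the anchor text."""
    query_set = set(query_terms)
    if not query_set:
        return 0.0
    return len(set(anchor_terms) & query_set) / len(query_set)

# Hypothetical inbound anchor texts for two target pages.
anchors = {
    "page_a": ["cheap", "flights", "to", "paris"],
    "page_b": ["click", "here"],
}

query = ["cheap", "flights"]
scores = {page: term_overlap(terms, query) for page, terms in anchors.items()}
```

Here `page_a` covers both query terms (score 1.0) while `page_b`, linked with generic "click here" anchors, scores 0.0 — anchor text like "click here" tells the engine nothing about the target page’s topic.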
Although links provide excellent signals, if linkage data remained permanently static in a search engine’s database, the engine would regularly return incorrect results for certain queries. For instance, when a new president is elected, all the information on the White House Web site is updated to reflect this. But a whole lot of linkage data pointing at the site still refers to the old president.
So another signal that a link needs to send to a search engine is related to temporal analysis, that is, how old or new a link is. Some pages become very popular, so they attract many links. Yet popularity is often dictated by trends and fashions that change over time.
This is a particular area I’m keen to discover more about. Generally speaking, you can perform a rough temporal analysis of Web pages simply by monitoring the Last-Modified HTTP header field. But it’s a lot more difficult to discover how relevant a link is to a specific topic a year after it was placed on a page.
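A minimal sketch of that header-based check, using Python’s standard library to parse the RFC 1123 date format HTTP uses (the header value and reference date here are invented for illustration):

```python
from datetime import datetime, timezone
from email.utils import parsedate_to_datetime

def page_age_days(last_modified_header, now=None):
    """Age of a page in whole days, from its Last-Modified header value."""
    modified = parsedate_to_datetime(last_modified_header)
    now = now or datetime.now(timezone.utc)
    return (now - modified).days

# Example Last-Modified value, in the RFC 1123 format HTTP servers send.
header = "Mon, 21 Aug 2006 10:00:00 GMT"
reference = datetime(2006, 8, 28, 10, 0, 0, tzinfo=timezone.utc)
age = page_age_days(header, now=reference)  # 7 days old
```

Of course, this only tells you when the page last changed — not whether the links on it, or pointing at it, are still topically relevant, which is the harder problem.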
I feel certain, though, that search engines can glean a lot more temporal analysis related to linkage data when it’s combined with the other signals they talk about, such as burstiness, end-user behavior data, social search, and personalization.
Even though search engines may have new signals to monitor, I still believe link building is the most crucial area in SEO (define), particularly in highly competitive fields. Don’t expect its importance to diminish for a while yet.
On the subject of links and anchor text, I recommend a great tutorial from Dr. Edel Garcia, debunking many of the myths about latent semantic indexing (LSI).
And I’ve no idea how I missed this article from earlier this year, which I stumbled across recently. It’s a fascinating interview between my friend and former ClickZ columnist Fredrick Marckini and Stephen Arnold, one of the big brains in search, examining Google from a technical perspective.
Want more search information? ClickZ SEM Archives contain all our search columns, organized by topic.