A lot of industry jargon is inaccurate or flat-out wrong.
There's something wrong with much of the language we use in the SEO industry. Take the ubiquitous industry misnomer itself: "search engine optimization." Who in this industry has ever optimized a search engine, other than a search engine itself? And then there's "SEM." Who has ever marketed a search engine, other than a search engine?
Certainly, we optimize Web pages to ensure they can be more easily crawled and indexed by search engines. And we buy advertising from search engines to promote our clients' products and services. But our industry's titles don't accurately describe either practice.
Here's another phrase that gets bandied about a lot: "reverse-engineer." Potential clients often ask me whether I can reverse-engineer the algorithm the same way their previous or existing firm claims to.
Generally speaking, you reverse-engineer something to recreate it in some form or fashion. It's all about analyzing systems and components and their interrelationships to come up with something else that does the same thing.
Correct me if I'm wrong, but no one in this industry has reverse-engineered a search engine algorithm and recreated it. If someone had scientifically reverse-engineered Google's algorithm, all 100 or more factors and elements, and now claims to know exactly what the "secret sauce" for guaranteed ranking is, we'd hear about it.
Certainly, many in the industry are obsessed with search engine crawler activity. But you only need a basic log-file analyzer to get a handle on that. It's not difficult at all to figure out what a crawler does and recreate it. As I did a number of years ago, you can read Junghoo Cho's thesis on the subject. In fact, go ahead and build yourself a crawler; it's not too difficult if you're a programmer. Just read this book.
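To underline how simple the data-collection side really is, here's a minimal sketch of a breadth-first crawler. Everything in it is invented for illustration: the fetch function is pluggable, so the example runs against a toy in-memory "site" (a dict of URL to HTML) rather than the live Web. A real crawler would fetch over HTTP, respect robots.txt, and throttle itself.

```python
# Minimal breadth-first crawler sketch. The toy site and all names are
# hypothetical; swap in a real HTTP fetch to crawl actual pages.
from collections import deque
from html.parser import HTMLParser

class LinkExtractor(HTMLParser):
    """Collects the href of every <a> tag on a page."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def crawl(seed, fetch, max_pages=100):
    """Breadth-first crawl from seed; returns URLs in visit order."""
    seen, queue, visited = {seed}, deque([seed]), []
    while queue and len(visited) < max_pages:
        url = queue.popleft()
        html = fetch(url)
        if html is None:
            continue
        visited.append(url)
        parser = LinkExtractor()
        parser.feed(html)
        for link in parser.links:
            if link not in seen:
                seen.add(link)
                queue.append(link)
    return visited

# Toy three-page "site" standing in for the Web.
site = {
    "/index": '<a href="/about">About</a> <a href="/contact">Contact</a>',
    "/about": '<a href="/index">Home</a>',
    "/contact": '<a href="/index">Home</a>',
}
print(crawl("/index", site.get))  # → ['/index', '/about', '/contact']
```

That's the whole trick: a queue, a parser, and a visited set. Knowing how it works tells you nothing about how the results are ranked.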
However, this is hardly reverse-engineering an algorithm. It's simply identifying an active component in the data collection and indexing process.
When it comes to the ranking element of the algorithm, there are many factors that can't be identified because we simply don't have access to the pertinent information. The number and quality of links that point to your site are a very important factor in the ranking algorithm. To do a back-link check, go to the search box at Google and type in "link:www.yourdomainname.com." You'll get a list of links that point to your site. You can do that to find out who links to your competitor's site as well.
To a search engine, this count is known as "in-degree." A related signal is "co-citation," a concept borrowed from social network analysis: the algorithm detects two sites that don't link to each other, yet links from many other sites consistently mention the two close together. Check out these graphics to better understand the concept.
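A toy sketch makes the distinction concrete. The link graph below is entirely made up for illustration: in-degree counts how many pages link to a target, while co-citation counts how often two targets are linked from the same source, even if they never link to each other.

```python
# Hypothetical link graph: each key is a linking page, each value the
# set of pages it links to. All names are invented for illustration.
from itertools import combinations
from collections import Counter

links = {
    "blogA": {"siteX", "siteY"},
    "blogB": {"siteX", "siteY"},
    "blogC": {"siteX", "siteY", "siteZ"},
    "blogD": {"siteZ"},
}

# In-degree: how many pages link to each target.
in_degree = Counter(t for targets in links.values() for t in targets)

# Co-citation: how often two targets appear together on the same
# linking page -- siteX and siteY never link to each other, yet they
# look "close" because three sources cite them together.
co_citation = Counter()
for targets in links.values():
    for pair in combinations(sorted(targets), 2):
        co_citation[pair] += 1

print(in_degree["siteX"])               # → 3
print(co_citation[("siteX", "siteY")])  # → 3
```

Computing this is trivial once you hold the graph. The catch, as the column notes, is that only the search engine holds the graph.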
The point is, you can't get access to this information unless you have access to Google's entire database.
At the "Future of SEM" session at Search Engine Strategies in Chicago last week, expert search optimizer Greg Boser suggested he could see user behavior data becoming a much more important factor in the ranking algorithm. I couldn't agree more. In fact, I wrote a column that touched on the subject earlier this year.
And how do we get access to that data to factor into the so-called SEO reverse-engineering process? Only search engines have this data.
We may be able to discover components and clues about what's happening, and a little anecdotal evidence surfaces in the odd SEO forum. But this is hardly scientifically stripping an algorithm down to its core to recreate it.
More SEO Jargon
What about the "sandbox"? Whose idea was that? It's another nonissue that's reached near-hysteria level. And yet, for businesses that launch sites based on well-thought-through business models, it doesn't exist. The sandbox only seems to affect sites that are of no interest to humans or search engines.
You don't get "sandboxed" for that. You get what you deserve -- nothing.
"Proprietary tools." That's another phrase that crops up all the time. Every SEO firm has its own proprietary tools. The fact that most SEO shops have the same tools that do the same things rarely seems to be mentioned.
"We have a tool that reverse-engineers the algorithm by downloading thousands of Web pages and analyzing the text on them," a firm may say. This means it has a keyword-density analyzer. Pretty useless, but you can get one for yourself for $99.
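To underline how little magic is involved, here's roughly what such a "keyword-density analyzer" amounts to. The page text and function name are invented for illustration; this is a sketch of the technique, not any particular vendor's tool.

```python
# A keyword-density "analyzer" in a dozen lines: stock text processing,
# not reverse-engineering. All inputs below are hypothetical.
import re
from collections import Counter

def keyword_density(text, keyword):
    """Percentage of words in text matching keyword (case-insensitive)."""
    words = re.findall(r"[a-z']+", text.lower())
    if not words:
        return 0.0
    return 100.0 * words.count(keyword.lower()) / len(words)

page = "Cheap widgets! Our widgets are the best widgets money can buy."
print(round(keyword_density(page, "widgets"), 1))  # → 27.3
```

That's the entire "proprietary" capability: tokenize, count, divide.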
What about "rank checking"? You can get a tool to do that for $389. Let's throw in a back-link and anchor-text analyzer. You can pick one up for just $167. And so it goes. For about a grand, you can pretty much have an SEO toolbox.
It's no wonder I get so much feedback from people telling me how hard it is to cut through the SEO hype pitched to them. Wouldn't it be nice if SEO clients could hear what they really need to hear, such as, "I've analyzed your business model and have a depth of knowledge about you and your marketplace"?
Optimize, sandbox, reverse-engineer, and all the other industry jargon suggest there's a technical problem that needs to be solved. But more often, it's a business or marketing problem that needs to be addressed.
That's my final rant for this year. I sincerely hope you have a happy and peaceful time over the holiday.
Mike Grehan is currently chief marketing officer and managing director at Acronym, where he is responsible for directing thought leadership programs and cross-platform marketing initiatives, as well as developing new, innovative content marketing campaigns.
Prior to joining Acronym, Grehan was group publishing director at Incisive Media, publisher of Search Engine Watch and ClickZ, and producer of the SES international conference series. Previously, he worked as a search marketing consultant with a number of international agencies, handling global clients such as SAP and Motorola. Recognized as a leading search marketing expert, Grehan came online in 1995 and is the author of numerous books and white papers on the subject. He is currently writing his new book, From Search to Social: Marketing to the Connected Consumer, to be published by Wiley later in 2014.
In March 2010 he was elected to SEMPO's board of directors and after a year as vice president he then served two years as president and is now the current chairman.