Do you remember the first time you used Google?
To be honest, this is really a question more for you old-timers than you young whippersnappers out there who never knew anything but Google. But for you veterans of the Search Engine Wars who came to Google after clicking through Yahoo, Excite, or (shudder!) AOL, I’m betting that your first experience with Google was probably the same as mine: wow!
It may be hard to believe, but it was only in December 2006 that Google passed Yahoo in terms of overall users. Since then…well…everyone knows the story.
But what was the “wow” factor that drew users away from Yahoo, which had dominated Internet search since the mid to late ’90s? (And yes, I know that Yahoo is often considered a “directory,” but for most folks, it served the same purpose.) For me, there were two reasons.
First, it was unbelievably refreshing to experience Google’s simplicity. Compared to the in-your-face, everything-you-might-ever-do-online, pizza-with-everything-on-it experience that Yahoo had become, using Google was a dream. You went there, typed in your search, and got search results. And up until 2000, you didn’t even have to contend with ads! Compare that to Yahoo with its busy layout, banner ads, and seemingly unlimited choices. Ugh. Google shined in a world where everything had become a “portal” screaming for our attention.
The other reason that users gravitated to Google was because it…worked. Its page-ranking technology that looked at link popularity was a major breakthrough (though others had tried similar schemes before), and it was sheer info-junkie joy to be able to type in a couple of words and get results that were actually relevant. As early as 1997, companies had begun to understand that they could “game” the earlier search engines to enhance their search rankings, mainly by including insane amounts of spam-y keywords in the text of the site. Google’s algorithm seemed immune to manipulation, and therefore seemed to deliver much more relevant results.
Today, of course, the idea that a search engine could be immune to manipulation seems like a dream to many (like those of us who actually need to find relevant information) and a nightmare to those who make their livings by manipulating search engine results to drive traffic to useless, content-free (but keyword- and link-rich) sites.
The result, as many of us have experienced, is that it’s getting harder and harder to find good content with Google…even for those of us in the biz who understand where these sites come from and what the difference between paid and “organic” search is. And it’s worse for most users: a 2007 study found that over 75 percent of users experienced enough “search engine fatigue” to drive them away from their computers without finding what they wanted.
Obviously, there’s a problem. In fact, it’s become so obvious that Google had to adjust its algorithm in February, though honestly I can’t say that I have personally seen much of an improvement in my search results. Spam seems to still rule, especially in consumer categories such as travel and appliances.
Are the content farmers doing anything? Oh, Demand Media has now put up a site discussing how “content matters” to it. But it still keeps advertising for freelancers to crank out content in the same way as before. And even though it spends a lot of time discussing its “standards” on the new site, I can tell you, from the experience a close family member of mine had writing for the company, that “quality” gets measured in a lot of different ways depending on the desired outcome. Cough. Cough.
What’s a searcher to do?
Enter Blekko, a scrappy little startup search company that’s decided (much in the same way Google did way back when) to zag when everyone else in the industry is still zigging (even if they’re trying to alter their zigs a bit).
Blekko made a splash earlier this month when it blocked over a million websites that it had determined to be spammy content farms. And while Google’s tweaks have been algorithmic, Blekko reports that its decisions about blocking content are based on human input, not just on how a computer interprets a site.
It’s a good move because humans are (for the time being) the real answer to search engine relevance. It can’t just be done algorithmically, because computers don’t do a very good job determining intention and real relevance…at least not right now. And until we get a sophisticated AI that can understand content at a level that approaches a human’s understanding, any attempt at pure algorithmic search is going to fall victim to smart folks who figure out how to game the system. Heck, even if we get that kind of AI, it may be hard to tell…just ask anyone who’s been “taken” by misleading content.
Even so, while it’s possible to fool some of the people some of the time, it’s pretty tough to fool everyone all the time. The genius of Blekko is that through its use of “slashtags,” built-in spam reporting mechanisms, and an open approach to revealing its ranking methods, it’s able to apply crowdsourcing to search rankings, effectively distributing relevancy judgments across all of its users. And while it works well now, this design means that search results can only become more relevant as more people use the system. It’s a beautiful example of network-effect feedback loops applied to a problem that many of us were probably beginning to fear was unsolvable.
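To make the idea concrete, here is a toy sketch (my own illustration, not Blekko’s actual algorithm or any published API) of how crowdsourced feedback can fold into a ranking: each user vote nudges a site’s score up or down, so the ordering improves as more people participate.

```python
# Toy model of crowdsourced re-ranking. Purely illustrative:
# the class, scores, and site names are invented for this example.
from collections import defaultdict


class CrowdRanker:
    def __init__(self):
        # site -> net votes (upvotes minus spam reports)
        self.votes = defaultdict(int)

    def vote(self, site, up=True):
        """Record one user's judgment of a site."""
        self.votes[site] += 1 if up else -1

    def rank(self, scored_sites):
        """Blend each site's base (algorithmic) score with the crowd signal
        and return sites ordered from most to least relevant."""
        return sorted(
            scored_sites,
            key=lambda pair: pair[1] + self.votes[pair[0]],
            reverse=True,
        )


ranker = CrowdRanker()
# Two users flag the content farm as spam; one endorses the helpful site.
ranker.vote("content-farm.example", up=False)
ranker.vote("content-farm.example", up=False)
ranker.vote("helpful-site.example", up=True)

results = ranker.rank([("content-farm.example", 2.0),
                       ("helpful-site.example", 1.5)])
# The keyword-stuffed farm started with the higher base score,
# but the crowd signal pushes the genuinely useful site to the top.
```

The point of the sketch is the feedback loop: the algorithmic score alone can be gamed, but every additional honest user makes the blended ranking harder to manipulate.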
Can Blekko be “gamed”? Sure…there’s no reason why someone couldn’t hire a big group of people (perhaps via a mechanism such as Amazon’s Mechanical Turk) to “like” spam sites in order to affect search results. I suspect, however, that the real humans who are monitoring content at Blekko might catch on.
Will Blekko be the next Google? It’s way too early to tell, and it has a huge hill to climb to get even close to Google’s market share. Add to that Google’s dominance in mobile, Web apps, and just about everything else online, and it’s clear that Blekko’s got its work cut out for it.
Regardless of how Blekko fares in the future, the model that it is pioneering is one that we all need to watch. It’s about time humans got back in the mix again.