News flash! Humans are taking over the Internet! Their weapons are user-generated content and an overlay of human enhancements on top of existing content. An increasing use of social search, combined with the explosion of user-generated content, is changing the Internet’s balance of power yet again.
Search engines have always relied on humans to improve result relevancy. Yahoo started off as a directory, and it still employs people to review PPC (define) ad creative before allowing it to go live (since the Panama launch, though, the review isn't always 100 percent editorial; in some cases it's more of a spot check). Google continues to rely on the user-generated DMOZ open directory project, even though most of the online advertising and SEM industry feels DMOZ is in a state of disarray and needs an overhaul. This is old news.
The resurgence of humans taking over the Internet, and search along with it, is occurring in several areas, particularly social search, social networking, and blogs. Social search is primarily considered an organic SEO (define) phenomenon: tagging and recommendations become a big relevance factor. However, two major problems arise. The first is that spamming is possible. The second is a self-fulfilling prophecy: top results get voted on and recommended more often, locking them into place. Blogs operate in a similar way, with links, comments, and trackbacks (define) combining to create a voting system that takes advantage of the search engines' link-centric relevancy algorithms.
However, some of the voting and tagging information could easily find its way into Google’s quality score, given Google’s focus on relevance. Google knows which paid ad listings result in a return to the original SERP (define) and which result in a sticky site visit (defined as the length of time before either returning to the SERP or engaging in another search activity). I wouldn’t be surprised to see this information become one of the many quality score components.
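To make the "sticky site visit" idea concrete, here's a minimal sketch of how dwell time might be computed from a clickstream. The event names, timestamps, and the 30-second threshold are all hypothetical assumptions for illustration, not anything Google has disclosed:

```python
from datetime import datetime

# Hypothetical clickstream for one visitor: (timestamp, event).
# "ad_click" marks arrival from the SERP; "serp_return" or a new
# "search" event marks the end of the visit.
events = [
    (datetime(2006, 10, 2, 9, 0, 0), "ad_click"),
    (datetime(2006, 10, 2, 9, 0, 40), "page_view"),
    (datetime(2006, 10, 2, 9, 6, 15), "serp_return"),
]

def dwell_time(events):
    """Seconds between landing from the SERP and the next
    return-to-SERP or new-search event, or None if the visitor
    never went back (the stickiest outcome of all)."""
    start = None
    for ts, event in events:
        if event == "ad_click":
            start = ts
        elif event in ("serp_return", "search") and start is not None:
            return (ts - start).total_seconds()
    return None

seconds = dwell_time(events)
sticky = seconds is None or seconds > 30  # 30-second cutoff is an arbitrary assumption
```

In this made-up example the visitor stayed over six minutes before returning to the SERP, so the click would count as sticky.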
Social networking, on the other hand, will have a huge impact on PPC advertising, particularly contextual and behavioral inventory (which technically isn't PPC search). Already, social networking sites such as MySpace.com and Facebook are in the top 10 most-visited Internet properties. Google recently locked in a deal to become the exclusive provider of both search and text-based contextual advertising for MySpace. Facebook signed an ad deal with MSN but is also rumored to be in discussions with Yahoo for a $1 billion sale. Regardless of whose network the ad inventory ends up in, as marketers we need to understand whether that inventory is right for us and whether it's showing up in our PPC clickstream.
To tell where a click originates, Web analytics, tracking, and campaign management technologies rely on something called the HTTP referrer. Most clicks still carry one. Missing referrers have many causes, including visitors who navigate directly to a page through a bookmark or by typing in a URL. Some firewall software strips out the referrer because a referring URL can pose a security risk: if it contains personally identifiable information, that information becomes visible to the next site visited. Some spyware and adware pass no referrer because the click didn't originate from a Web page. But most browsers and computers do pass the HTTP referrer. By analyzing your PPC traffic, you can determine whether clicks originating at MySpace and other social networking sites are better or worse than the other clicks you buy.
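An analysis like this can be as simple as grouping logged clicks by referring domain and comparing conversion rates. The sketch below assumes a hypothetical click log of (referrer, converted?) pairs; your analytics package would supply the real data:

```python
from urllib.parse import urlparse
from collections import defaultdict

# Hypothetical PPC click log: (HTTP referrer, converted?). An empty
# referrer models bookmarks, typed-in URLs, or stripped headers.
clicks = [
    ("http://www.myspace.com/somepage", False),
    ("http://www.myspace.com/other", False),
    ("http://www.google.com/search?q=guitar+amps", True),
    ("http://www.google.com/search?q=fender", False),
    ("", False),  # no referrer passed
]

def performance_by_referrer(clicks):
    """Group clicks by referring domain and compute each
    domain's click count and conversion rate."""
    stats = defaultdict(lambda: [0, 0])  # domain -> [clicks, conversions]
    for referrer, converted in clicks:
        domain = urlparse(referrer).hostname or "(no referrer)"
        stats[domain][0] += 1
        stats[domain][1] += int(converted)
    return {d: (n, conv / n) for d, (n, conv) in stats.items()}

report = performance_by_referrer(clicks)
```

A report like this makes it immediately obvious when one referring domain in a contextual network converts far worse than the rest of the clicks you're buying.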
Many in the industry believe that as search engine contextual and behavioral traffic grows as a percentage of overall traffic, the engines will have to give marketers better control over the sites their ads run on. An opt-in system covering the top 20 sites within a network would provide a high level of control over the sites delivering a material level of traffic, but it would add significant complexity to an already complex marketplace. My suggestion to several of the top search engines is to at least provide an opt-out option for domains one doesn't want to buy traffic from. That way, one could minimize future damage from sites delivering traffic that isn't working well, without having to shut down a campaign segment entirely.
Here’s an example of how an opt-out system might work. Say I sell guitar amps and electric guitars and want to tap Google’s contextual network. The network may include a ton of really great sites and blogs, but my ads may also show up on MySpace, where highly interested readers may actually want my product but are 15 years old, have no credit card, and have parents who aren’t going to drop $500 on a Fender amp. By opting out of MySpace traffic, I’d be left with a smaller Google network, one that performs well enough to complement my search campaign. Similarly, I may decide LinkedIn traffic works particularly well, so I buy it through Google’s site-targeting option, bidding a CPM (define) for impressions.
It’s too early to determine exactly how the search engines will address the differing kinds of traffic originating from sites within their networks. But a smart-pricing approach similar to the one Google uses now to discount clicks from certain sources won’t be enough on its own.
In the spirit of experimenting with social networking, I invite you, my readers who may have met me or know me, to link to me in LinkedIn.