Some interesting things always come out at conferences, which is probably why so many folks still attend them in person rather than settling for some sort of virtual offering. It might be a big announcement from the search engines, an entertaining keynote speech, or a snippet of conversation over lunch, but some tidbit of information always makes the time and expense of participating worthwhile.
One interesting thing that came out of SES San Jose's Duplicate Content and Multiple Site Issues session in August was the sheer volume of duplicate content on the Web. Ivan Davtchev, Yahoo's lead product manager for search relevance, said "more than 30 percent of the Web is made up of duplicate content."
At first I thought, "Wow! Three out of every 10 pages on the Web consist of duplicate content." My second thought was, "Sheesh, the Web is one tangled mess of equally irrelevant content." Small wonder trust and linkage play such significant roles in determining a domain's overall authority and consequent relevancy in the search engines.
Three Flavors of Bleh
Davtchev went on to explain three basic types of duplicate content: accidental duplication, dodgy duplication, and outright abusive duplication.
Fortunately, Greg Grothaus from Google's search quality team had already addressed the duplicate content penalty myth, noting that Google "tries hard to index and show pages with distinct information."
It's common knowledge that Google uses a checksum-like method for initially filtering out replicated content. For example, most Web sites have a regular and a print version of each article. Google only wants to serve up one copy of the content in its search results, and which copy wins is predominantly determined by linking prowess. Because most print-ready pages are dead-end URLs sans site navigation, it's relatively simple to anticipate which page Google prefers to serve up in its search results.
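To make the checksum idea concrete, here is a minimal sketch of how such filtering can work, assuming a simple normalize-then-hash approach. This is an illustration of the general technique, not Google's actual algorithm; the function name and sample pages are hypothetical.

```python
import hashlib
import re

def content_fingerprint(html_text: str) -> str:
    """Strip markup and collapse whitespace, then hash the remaining text.
    A rough stand-in for checksum-style duplicate filtering; not Google's
    actual method."""
    text = re.sub(r"<[^>]+>", " ", html_text)           # drop HTML tags
    text = re.sub(r"\s+", " ", text).strip().lower()    # normalize whitespace and case
    return hashlib.sha256(text.encode("utf-8")).hexdigest()

# Hypothetical example: a regular article page and its print-ready twin
# carry the same body text, so their fingerprints collide.
page_a = "<p>How to canonicalize your Web site.</p>"
page_b = "<p>How   to canonicalize\n your Web site.</p>"

print(content_fingerprint(page_a) == content_fingerprint(page_b))  # prints True
```

Once two URLs collapse to the same fingerprint, the engine only needs to pick one to serve, which is where link popularity tips the scales.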
In exceptional cases of content duplication that Google perceives as an abusive attempt to manipulate rankings or deceive users, Google will "make appropriate adjustments" to the indexation and rankings of the sites involved, according to Grothaus. Even though Google doesn't consider dilution of link popularity a penalty, anyone who has been on the receiving end of this particular duplicate content issue might say otherwise.
Test and Tune
How do you know if duplicate content is an issue for your site? Simply run a couple of quick tests:
Because these are usually accidental duplicate content issues, remedies are relatively easy to apply to your site. Simply read and employ all the best practices delineated on the search engines' Webmaster blogs and forums. Here are the big three:
If you make certain that you properly canonicalize your site, 301 (permanent) redirect any duplicate home page URLs to your canonical domain, use robots.txt to eliminate site level content duplication, use meta robots tags to purge page-level duplicate content, and use canonical tags to indicate preferred content, you can readily eliminate much of the duplicate content that was accidentally created.
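Most of these remedies boil down to one-line configuration changes. As an illustrative sketch only (the domain, paths, and URLs below are hypothetical, and your server setup may differ), the 301 redirect, robots.txt exclusion, meta robots tag, and canonical tag might look like:

```text
# --- .htaccess (Apache mod_rewrite): 301 duplicate home page URLs to the canonical domain ---
RewriteEngine On
RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]

# --- robots.txt: keep crawlers out of print-ready copies (site-level duplication) ---
User-agent: *
Disallow: /print/

<!-- meta robots, in the <head> of a page-level duplicate -->
<meta name="robots" content="noindex, follow">

<!-- canonical tag, pointing duplicates at the preferred URL -->
<link rel="canonical" href="http://www.example.com/article">
```

Each fragment handles a different layer of the problem: the redirect consolidates domain-level duplicates, robots.txt and meta robots suppress crawling or indexing of known copies, and the canonical tag tells the engines which version to credit.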
It's really that simple to employ remedies for accidental content duplication. Build a site out of user-friendly URLs to optimize your branding efforts and usability while eliminating inefficient crawling, and your Web site will be well on its way to earning trust from the search engines.
If any of this is too complex for your circumstance, you might want to call in some professional assistance. If dodgy or abusive levels of duplicate content are issues for your Web site or network of sites, stop back here in a couple of weeks when we continue the conversation about duplicate content and the issues it creates in the search engines. Until then, keep testing and tuning your results.
P.J. Fusco has been working in the Internet industry since 1996 when she developed her first SEM service while acting as general manager for a regional ISP. She was the SEO manager for Jupitermedia and has performed as the SEM manager for an international health and beauty dot-com corporation generating more than $1 billion a year in e-commerce sales. Today, she is director for natural search for Netconcepts, a cutting-edge SEO firm with offices in Madison, WI, and Auckland, New Zealand.