This year saw a major new update to the Google algorithm. The update was originally called the Farmer update, but was later renamed the Panda update by Google itself. Google estimated that as much as 12 percent of search queries would be impacted by the change. After testing and tweaking the algorithm over a couple of months, Google rolled the update out to Google.ca in early April 2011. Subsequent refinements to the update have since been made, affecting an additional 2 percent of search queries.
Why Did Google Make the Update?
According to Google, “This update is designed to reduce rankings for low-quality sites: sites which are low-value add for users, copy content from other websites or sites that are just not very useful.”
The key to success for any search engine is its ability to deliver searchers to the information they seek, faster and more consistently than any other available option. Spam and low-quality content are an impediment to this objective, as they increase the noise within the search results. Anyone who has spent time searching via Google (or any search engine, for that matter) will almost certainly have noticed the results peppered with pages from sites offering little or no value. Many of these sites offer “thin content,” or content scraped and assembled from other sites. In a nutshell, they offer no real value-added content.
So What Really Changed?
These “low-value” pages are thus the target of Google’s Panda update. We can never be 100 percent certain, but observations suggest a few tell-tale signs associated with poor-quality content: excessive ads, too little content per page, few or no links to pages, duplicate content, and user-behavior data such as refined searches and site-block requests. Google has even confirmed that it cross-referenced its initial findings (i.e., the sites it identified as providing low-quality content) with data from its newly launched Chrome site-blocking extension, and that it is now using a little-known search feature that permits searchers to block all results from a given site.
How to Recover if You Were Impacted
If you’ve been affected by the Panda update, you’ll have noticed a drop in organic traffic from Google, and possibly an even more disturbing drop in rankings for most or all of the terms you were previously ranking for. The question then becomes: is there any way to recover these rankings?
The reality is, there have been very few reported cases of sites recovering from the impacts of this update, and none can be specifically attributed to recovery efforts the sites undertook. Google, however, has suggested that affected companies take the following actions:
- Add quality unique content.
- Move questionable content to another domain or prevent that content from being indexed by search engines.
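On the second suggestion, one common way to keep questionable content out of the index (an illustration of the mechanics, not a method Google prescribed in its statement) is a robots meta tag on the affected pages, or a rule in the site’s robots.txt file. For example:

```html
<!-- Placed in the <head> of each questionable page:
     tells crawlers not to index this page, while still following its links -->
<meta name="robots" content="noindex, follow">
```

```
# robots.txt at the site root: blocks crawling of a whole directory
# (the directory name here is purely illustrative)
User-agent: *
Disallow: /thin-content/
```

Note that a robots.txt Disallow only blocks crawling, so the meta noindex tag is generally the more reliable way to keep a specific page out of search results.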
How to Protect Yourself Going Forward
With this update, Google has made it known that it’s going to continue detecting and eliminating low-quality content from its search results, and it’s only going to get better at it. This has implications for everyone going forward, and suggests some “best practices” to protect your site in the future:
- Build good-quality, unique content that actually adds value. If you’re a reseller, add information beyond what the manufacturer provides. Get really good at soliciting user-generated content from clients.
- Do not generate large volumes of low-quality content, as this will jeopardize the rankings of your good-quality content.
- Get good links to each of your content pages, as this signals to Google that the content is of high quality.
- Ensure content is optimized for social media: display sharing buttons prominently on each page to encourage sharing and increase the likelihood of submission to social media sites.
- Be wary of too many ads (e.g., Google AdSense) on your pages…again, a signal of poor-quality content.
- Make sure users extract some value from each page and won’t wish to “block” the site from appearing in future search results.
Automated content is not the issue, so long as it’s of sufficient quality to satisfy users. The issue is low-quality content that does not provide users with the information they’re searching for. The message from Google is clear: if you’re going to add content to your site, make sure it’s quality content that helps searchers find answers to their questions as quickly as possible. Those who ignore this message will likely perish from the search results. Those who heed it have the best chance of succeeding long term. The key question then becomes “how can businesses produce quality content as efficiently as possible?”, but that, of course, is the subject of another article.
This column was originally published in SES Magazine, May 2011.