
Hit By Google Panda? Fixing These Site Issues Won't Help

June 15, 2011

Eight site changes that don't appear to matter. But we're getting closer.

Search Engine Roundtable recently had a post that stated, as far as the author Barry Schwartz could determine, no one had fully recovered from the Google Panda updates. As an SEO, I find this quite disheartening. Not because Google's quest for quality isn't admirable, but rather because this particular update is supposed to be one you, as a site owner, could do something about (i.e., improve the quality of your content and traffic would return to pre-Panda levels).

Site Changes I've Tested

I happen to have a bunch of hobby sites where I test out ideas. These sites are all based on WordPress and so share a common core along with a similar design, both of which make head-to-head testing somewhat easier. The commonalities mean that I can measure the impact of a change by applying it to one site and not to another. So when Panda rolled out and some of these sites were hit, I tried to figure out what tripped the filter(s). Unfortunately, while I'd like to report that I've identified what elements needed fixing, all I've got so far is a list of tactics that don't appear to matter.

Remove Q&A-style content: I had some pages that were in question-and-answer format, similar to Yahoo Answers. Compared to a coherent article by a single author, I figured this Q&A content might be flagged as low quality, so I removed it from the site.

Remove press releases: One site had republished hand-selected press releases that were all relevant to a particular topic. Not the usual company announcement fluff, but the research-style press releases like what you'd get from Forrester. Since these were clearly duplicated from other sites, removing them seemed like an important factor to test.

Eliminate duplicate content resulting from URL parameters: A plug-in I had installed a while back was creating URLs with parameters that triggered page functionality while not actually affecting the content. Just goes to show you that even after you've optimized your site, you need to keep an eye on it. In Google Webmaster Tools, I marked such parameters so that they'd be ignored.
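Besides the Webmaster Tools parameter settings, the same parameter-driven duplication can be handled directly in the page markup with a canonical tag. A minimal sketch, assuming the parameter version serves the same content as the clean URL (the URL below is a made-up example, not one of my sites):

```html
<!-- Placed in the <head> of both the clean URL and any parameterized
     variants (e.g. /some-article/?sort=asc), telling Google which
     version to index. -->
<link rel="canonical" href="http://www.example.com/some-article/" />
```

Either approach consolidates the duplicates; the canonical tag has the advantage of working for all search engines rather than just Google.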

Eliminate cross-site linking: I wasn't doing anything sneaky here - just promoting other sites from at most one page on any given site. I removed such promotion.

Fix broken links: I fixed broken internal and external links. Arguably, such links would result in a negative user experience, so correcting them should be seen as an improvement in quality.

Improve comment quality: I cleaned out low-quality comments such as those that were short or didn't actually say anything that wasn't previously said. I also corrected spelling and grammar.

Curiously, one of the sites that is very popular with visitors, based on the thousands of comments it has inspired, was still flagged as low quality.

Extend content with comments: Content length doesn't automatically equate to quality, and comments themselves aren't necessarily high quality. However, some Panda-related commentary highlights the concept of a ratio of content to advertising. My testing included displaying zero comments, six comments, and 10 comments.

Block comment sub-pages: Beyond the first page of comments, all comment pages were set to noindex using the meta robots tag.
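In WordPress this is typically done through an SEO plug-in or a conditional in the theme's header; the tag itself is simple. A sketch of the markup, where the "follow" directive is my assumption about a reasonable setting rather than something stated above:

```html
<!-- Emitted in the <head> of /some-article/comment-page-2/ and beyond.
     noindex keeps the sub-page out of the index; follow still lets
     link equity flow through it. -->
<meta name="robots" content="noindex,follow" />
```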

A Few Additional Observations

In addition to tweaking the sites, I also noticed a few things that I thought I'd share. As with the changes above, the items below existed on some sites and not on others, and their presence or absence didn't determine whether a site was hit by the Panda update.

Facebook "likes": "Likes" in the hundreds for multiple pages on a single site weren't sufficient to save that site. Another disconnect between what users and Google consider to be quality content.

Images: Using images vs. not using them in content doesn't appear to have any impact. I've got sites heavy with images that were hit while others weren't.

Page load time: Using caching and a content delivery network, I've got initial page load time down to about 5 seconds as reported by WebPageTest.org. Once again, with some sites hit by Panda and others not, I can eliminate page load time as a factor.


There are always caveats to any sort of SEO testing. First, there are hints from Google that the Panda update takes into account multiple signals. It could be that I've hit, for example, four important signals with the efforts described above, but that I need to hit five to cross back into Google's good graces.

Second, there's no indication from Google about how long it takes for a site to be reassessed, so it's still possible that my efforts will pay off in the future.

Third, a small number of sites doesn't necessarily provide statistical significance, especially when there remain multiple variables that I can't control.

What's Next?

So if nothing is working, why bother with this column? Since no one has really cracked the formula and shared their findings, I figured sharing what isn't working can help people zero in on tactics that have yet to be tested.

One item that stood out for me between sites affected by Panda and sites that weren't is that the ones spared didn't have any advertising. So a test I just initiated is the complete removal of advertising. I'll be curious to see if this one remaining test turns out to be the winner, since it would mean Google Panda is essentially targeting sites using Google AdSense. That would be the web equivalent of shooting yourself in the foot, no?




Marios Alexandrou

Marios Alexandrou is the East Coast Director of SEO for Steak's Search Marketing team and has a background in web development and project management. While he is loath to tell people just how long he's been working with computers, he will admit that his first computer had just 16KB of memory.

His SEO experience includes work with both in-house and agency teams ranging from one-man shows to 20+ dedicated SEO strategists. He has worked with organizations of all sizes and across multiple industries including hospitality, financial services, publishing, and healthcare. He particularly likes to use his combination of skills to identify ways to scale SEO activities through process standardization and automation.

In addition to writing about SEO for ClickZ, Marios also writes on the broader area of Internet marketing for Infolific.



