Search Engine Roundtable recently had a post that stated, as far as the author Barry Schwartz could determine, no one had fully recovered from the Google Panda updates. As an SEO, I find this quite disheartening. Not because Google’s quest for quality isn’t admirable, but rather because this particular update is supposed to be one you, as a site owner, could do something about (i.e., improve the quality of your content and traffic would return to pre-Panda levels).
Site Changes I’ve Tested
I happen to have a bunch of hobby sites where I test out ideas. These sites are all based on WordPress and so share a common core along with a similar design, both of which make head-to-head testing somewhat easier. The commonalities mean that I can measure the impact of a change by applying it to one site and not to another. So when Panda rolled out and some of these sites were hit, I tried to figure out what tripped the filter(s). Unfortunately, while I’d like to report that I’ve identified what elements needed fixing, all I’ve got so far is a list of tactics that don’t appear to matter.
Remove Q&A-style content: I had some pages that were in question-and-answer format, similar to Yahoo Answers. Compared to a coherent article by a single author, I figured this Q&A content might be flagged as low quality, so I removed it from the site.
Remove press releases: One site had republished hand-selected press releases that were all relevant to a particular topic. Not the usual company announcement fluff, but the research-style press releases like what you’d get from Forrester. Since these were clearly duplicated from other sites, removing them seemed like an important factor to test.
Eliminate duplicate content resulting from URL parameters: A plug-in I had installed a while back was creating URLs with parameters that triggered page functionality while not actually affecting the content. Just goes to show you that even after you’ve optimized your site, you need to keep an eye on it. In Google Webmaster Tools, I marked such parameters so that they’d be ignored.
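Marking parameters in Google Webmaster Tools is one fix; another is to canonicalize such URLs yourself by stripping the parameters that trigger functionality without changing content. Here’s a minimal Python sketch of that idea (the parameter names are hypothetical examples, not ones from my sites):

```python
from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

# Parameters that trigger page functionality but don't change the content.
# These names are hypothetical examples for illustration only.
IGNORED_PARAMS = {"replytocom", "print", "sort"}

def canonicalize(url):
    """Return the URL with content-irrelevant query parameters removed."""
    parts = urlsplit(url)
    kept = [(k, v) for k, v in parse_qsl(parts.query, keep_blank_values=True)
            if k not in IGNORED_PARAMS]
    return urlunsplit((parts.scheme, parts.netloc, parts.path,
                       urlencode(kept), parts.fragment))

print(canonicalize("http://example.com/article/?replytocom=42&page=2"))
```

The canonical URL produced this way could then be emitted in a rel=canonical link tag, so Google folds the parameterized duplicates into one page.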
Eliminate cross-site linking: I wasn’t doing anything sneaky here – just promoting other sites from at most one page on any given site. I removed such promotion.
Fix broken links: I fixed broken internal and external links. Arguably, such links would result in a negative user experience, so correcting them should be seen as an improvement in quality.
Improve comment quality: I cleaned out low-quality comments such as those that were short or didn’t actually say anything that wasn’t previously said. I also corrected spelling and grammar.
Curiously, one of the sites, which is very popular with visitors judging by the thousands of comments it has inspired, was still flagged as low quality.
Extend content with comments: Content length doesn’t automatically equate to quality, and comments themselves aren’t necessarily high quality. However, some Panda-related commentary highlights the ratio of content to advertising, and displayed comments do add content. My testing included displaying zero, six, and ten comments.
Block comment sub-pages: Beyond the first page of comments, all comment pages were set to noindex using the meta robots tag.
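For reference, the tag in question looks like this (a standard meta robots directive, emitted in the head of comment pages two and beyond):

```html
<!-- On comment sub-pages (page 2 and beyond) only: -->
<meta name="robots" content="noindex, follow">
```

The "follow" keeps link equity flowing through the sub-pages while keeping their near-duplicate content out of the index.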
A Few Additional Observations
In addition to tweaking the sites, I also noticed a few things that I thought I’d share. Similar to the above changes, the items below existed on some sites and not on others. And neither their presence nor their absence protected a site from the Panda update.
Facebook “likes”: “Likes” in the hundreds for multiple pages on a single site weren’t sufficient to save that site. Another disconnect between what users and Google consider to be quality content.
Images: Using images vs. not using them in content doesn’t appear to have any impact. I’ve got sites heavy with images that were hit while others weren’t.
Page load time: Using caching and a content delivery network, I’ve got initial page load time down to about 5 seconds as reported by WebPageTest.org. Once again, with some sites hit by Panda and others not, I can eliminate page load time as a factor.
There are always caveats to any sort of SEO testing. First, there are hints from Google that the Panda update takes into account multiple signals. It could be that I’ve hit, for example, four important signals with the efforts described above, but that I need to hit five to cross back into Google’s good graces.
Second, since there’s no indication from Google about how long it takes for a site to be reassessed, it’s still possible that my efforts will be successful in the future.
Third, a small number of sites doesn’t necessarily provide statistical significance, especially when there remain multiple variables that I can’t control.
So if nothing is working, why bother with this column? Since no one has really cracked the formula and shared their findings, I figured sharing what isn’t working can help people zero in on tactics that have yet to be tested.
One item that stood out for me between sites affected by Panda and sites that weren’t is that the ones spared didn’t have any advertising. So a test I just initiated is the complete removal of advertising. I’ll be curious to see whether this one remaining test turns out to be the winner, since it would mean Google Panda is essentially targeting sites using Google AdSense. That would be the web equivalent of shooting yourself in the foot, no?