Recovering from Google’s Panda Update

Posted by Wally Narwhal on Nov 14, 2013 2:47:00 PM


As more people use search engines to find businesses online, Google’s search algorithm grows increasingly sophisticated. As we know, Google started out way back in 1996 (ancient in terms of the Internet) as a simple search engine called BackRub running on Stanford’s servers. Fast-forward through 17 years of mind-boggling success and we arrive at the current state of Google: a massive search giant that dominates the world of Internet search.

Google has gone to great lengths to tweak its search algorithm to filter out irrelevant websites and businesses. Just a few years ago the landscape of search engine optimization was like the Wild West; keyword stuffing, link farming, and other “Blackhat” tactics were the norm for companies looking to improve their standing on SERPs (search engine results pages). To stay ahead of these tactics, Google periodically updates its algorithm to penalize the websites using them.

Two of the better-known updates to Google’s search algorithm were dubbed Panda and Penguin. The names may sound cute, but you wouldn’t think so if your website had been hit by one of these updates and penalized with a severe drop in search rankings, or de-indexed altogether.

The first Panda update rolled out in February 2011 and was aimed at demoting “low-quality sites.” The initial change reportedly affected almost 12% of all search results (a massive amount), and there have been several Panda refreshes since. The update used artificial intelligence in a more sophisticated and scalable way to analyze the quality of a site’s content, which means sites with duplicate content, scraped content, and plagiarized pages saw a dramatic drop on SERPs. The weight of PageRank in ranking sites also dropped significantly when Panda rolled out.

If your site was affected by the initial Panda update or subsequent Panda updates, here are some things to consider:

  1. Check for Duplicate Content: Any webpage that is recreated or cloned on your website may contain duplicate content, which Google now deems harmful. Make sure to scan your site’s content so that no page or block of content appears twice.
  2. Block Pages with Duplicate Content: If you’d rather not get rid of the duplicate content, the least you can do is block those pages from being indexed by Google. You can do this by adding a noindex robots meta tag to them (or disallowing them in robots.txt) so that Googlebot skips them in its crawl of the interwebs.
  3. Rewrite Duplicate Content Significantly: Google is smarter than you may think, and so is the artificial intelligence it uses. We’re not suggesting it’s the same artificial intelligence that renders Skynet self-aware in the Terminator universe, but maybe we are. The point is, if you choose to rewrite your duplicate content, make sure the result is original and high quality.
  4. Don’t Over-Optimize: Too many people think they can overuse keywords in their content and enjoy high rankings. What they end up with is forced language that doesn’t sound human, and possibly a penalty in search rankings. Include keywords in your content, but keep them at a natural level.
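For tip #1, here is a minimal sketch of a duplicate-content scan in Python. It assumes you have already exported each page’s body text (the URLs and page copy below are made up for illustration): pages whose normalized text matches exactly get grouped together.

```python
import hashlib
import re

def normalize(text):
    # Collapse whitespace and lowercase so trivial formatting
    # differences don't mask otherwise identical copy.
    return re.sub(r"\s+", " ", text).strip().lower()

def find_duplicates(pages):
    """Group page URLs whose body text is effectively identical.

    `pages` maps URL -> extracted body text (extracting the text
    from your site is out of scope for this sketch).
    """
    seen = {}
    for url, text in pages.items():
        digest = hashlib.sha256(normalize(text).encode("utf-8")).hexdigest()
        seen.setdefault(digest, []).append(url)
    return [urls for urls in seen.values() if len(urls) > 1]

pages = {
    "/services": "We build websites.  Call us today!",
    "/services-copy": "We build websites. Call us today!",
    "/about": "Founded years ago, we love the web.",
}
print(find_duplicates(pages))  # [['/services', '/services-copy']]
```

Exact-match hashing only catches verbatim clones; near-duplicate copy with light edits would need fuzzier comparison, but this is a cheap first pass.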
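And for tip #4, a rough sanity check on keyword usage. Google doesn’t publish a “safe” keyword density, so any threshold you pick is a judgment call; this little function just measures what share of your copy a single keyword makes up, so you can spot copy that reads stuffed.

```python
import re

def keyword_density(text, keyword):
    """Return the keyword's share of all words in `text` (0.0 to 1.0)."""
    words = re.findall(r"[a-z']+", text.lower())
    if not words:
        return 0.0
    hits = sum(1 for w in words if w == keyword.lower())
    return hits / len(words)

copy = ("Our web design team builds web design projects with web design "
        "tools for web design clients.")
print(round(keyword_density(copy, "design"), 2))  # 0.25
```

If a single keyword accounts for a quarter of your copy, as above, it almost certainly reads forced to a human, which is the real test Panda cares about.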

As the Google search algorithm becomes more and more sophisticated, you can bet the SEO industry will adapt to these changes. A general rule of thumb in SEO: once you’re feeling comfortable, you’re probably in trouble. Keeping up with these changes is definitely a full-time gig, and it’s vital to your visibility on the web. Don’t get left in the dust; be aware of changes and adapt smartly!



Written by Wally Narwhal

Wally overseas (get it?) fun and silliness at Tribute Media as the company's acting mascot and unicorn of the sea.
