Reverse Engineering Google’s Panda Slap: HubPages Seeing Improvements After Offloading Content to Subdomains

It has been over four months since Google’s famous Panda algorithmic update, also known as the Farmer update, went live globally. Panda is one of the most devastating algorithmic changes ever, crippling the traffic of thousands of sites in a flash. In countless forum threads, webmasters have reported that their sites simply disappeared from Google search. Publishing platforms, article directories, content sites, blogs, forums and other content-heavy web properties were hit hardest by this change.

Why the new algorithm? Because Google’s results were being overrun by content farms and spam aggregators, and the company had to do something about it.

To this day, most webmasters have no idea exactly which problem led to their site(s) being penalized. Here are some possible scenarios:

  • Is it just the content on the site that is considered thin and shallow?
  • Or have incoming links lost their weight post-Panda because the sites linking to you have themselves lost value (an assumption)?
  • Were the pages simply knocked off by stronger competition?
  • Duplicate content or canonical issues within the source code (see the sketch just below this list)?
  • Scraper sites outranking the original source for the very content it wrote?
  • Too many advertisements on the site, or an ad-to-content ratio that is way out of proportion?
  • A large volume of user-generated content that was hastily produced and serves no real value?

Many possibilities.
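
As an illustration of the duplicate content point above, here is a minimal sketch (my own, not from HubPages or Google, and the URL is hypothetical) that checks whether a page declares a rel="canonical" link – the tag that tells Google which URL to treat as the original when several URLs serve the same content:

    # Check whether a page declares a rel="canonical" URL - its absence on
    # duplicated pages is one of the issues listed above. Standard library only.
    from html.parser import HTMLParser
    from urllib.request import urlopen

    class CanonicalFinder(HTMLParser):
        def __init__(self):
            super().__init__()
            self.canonical = None

        def handle_starttag(self, tag, attrs):
            attrs = dict(attrs)
            if tag == "link" and attrs.get("rel") == "canonical":
                self.canonical = attrs.get("href")

    def find_canonical(url):
        html = urlopen(url).read().decode("utf-8", errors="ignore")
        finder = CanonicalFinder()
        finder.feed(html)
        return finder.canonical

    if __name__ == "__main__":
        page = "http://example.com/some-article"  # hypothetical page
        canonical = find_canonical(page)
        if canonical is None:
            print("No rel=canonical found - duplicate URLs may compete with each other")
        else:
            print("Canonical URL declared:", canonical)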

Since hardly anyone has fully recovered from Google’s Panda slap yet, it is reasonable to conclude that more than one factor is at play.
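
The move referenced in the title – offloading content to subdomains – essentially gives each batch of content its own hostname, so that a Panda penalty earned by one set of thin pages does not drag down the rest of the domain. As a rough illustration (my own sketch, not HubPages’ actual implementation; the per-author /u/<username>/ URL layout is an assumption), such a migration boils down to a 301 redirect map along these lines:

    # Sketch of mapping old "main domain" URLs to per-author subdomains.
    # The /u/<username>/<slug> layout is an assumption for illustration only.
    from urllib.parse import urlsplit, urlunsplit

    def to_author_subdomain(url):
        """Return the 301 target on the author's subdomain, or None if the
        URL does not match the assumed /u/<username>/... layout."""
        parts = urlsplit(url)
        segments = [s for s in parts.path.split("/") if s]
        if len(segments) < 2 or segments[0] != "u":
            return None
        username, rest = segments[1], segments[2:]
        new_host = username + "." + parts.netloc
        new_path = "/" + "/".join(rest)
        return urlunsplit((parts.scheme, new_host, new_path, parts.query, parts.fragment))

    if __name__ == "__main__":
        old = "http://example.com/u/cameron/my-travel-tips"
        print(old, "->", to_author_subdomain(old))
        # -> http://cameron.example.com/my-travel-tips

Whether the penalty actually stays contained after such a move is exactly the question raised in the comments below.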

Published by

Amit Banerjee

Amit has been writing for Techie Buzz since early 2009 and keeps a close eye on web apps, Google and all things tech. He also writes at his own tech blog, Ampercent. Follow him on Twitter @amit_banerjee

  • cameron

I am a freelance writer on the web, and I have been reading several posts about the pros and cons of using subdomains. Hubpages is such a big site that this seems like a good idea for them, since they run an open platform. It will be interesting to see how it pans out in the long run.

  • Stefan

Putting content on different subdomains may be a strategy for large content ‘farms’ – or whatever you want to call sites like Hubpages, Squidoo or Blogspot. But this only keeps the penalty from spreading across the entire domain; it doesn’t solve the root problem.
    It’s like a ship with a hole letting water in: shutting all the doors keeps the ship from sinking, but it doesn’t plug the hole.

What I have found is that, after Panda, what matters most is how visitors behave on your site: how long they stay, how many pages they visit, and how many of them bounce back to Google and search for something else. Basically, Google now lets the users decide what they like. This type of ranking factor cannot be manipulated as easily as incoming links.


  • kishore

Duplicate content and page loading time are the big factors behind the penalization.