Trying To Recover From Google’s Panda Algorithm? Some Mistakes You Should Avoid
By Amit Banerjee on April 17th, 2011

If you have a couple of MFA sites and you specialize in auto blogging software for producing content, please skip this article.

So here we are: Google's new wild animal (read: Panda) is out of its cage, and there has been a lot of speculation about the effects it has brought. As before, some webmasters are on the winning side while others have lost a significant proportion of their traffic and rankings. Google announced the new algorithm in an official blog post, explaining how it is geared towards improving user interaction and the quality of sites in Google search results.


Rewinding

Back in December 2010, we told you how spam sites were polluting Google search results: typically irrelevant content, thin information, scraping and aggregating material from other sources, and gaming search to grab ranks in SERPs.

Facing immense pressure from a wide variety of sources, Google had to do something about search spam.

They introduced a Chrome extension that let users blocklist specific websites from search results, and then released another feature that let users block sites directly from the search result pages. This move was presumably deployed to observe user behavior for specific queries and sites, and to cross-check whether the results produced by the upcoming algorithm matched the collected data.

Tip: You can read our earlier tutorial to check whether your site is a victim of Google's Farmer update.

After Google’s Farmer Update

There are two possible scenarios: either you're on the winning side or you're on the losing one.

I am not going to discuss the details of why a specific site was penalized: good content, trustworthy links, and the factors behind the sudden fall or rise of specific sites. If you're a blogger or web publisher, chances are you have already done your homework and know the basics.

Instead, I want to shed some light on things you should not do while trying to recover from Google's Farmer or Panda algorithm.

Some possibly wrong assumptions:

1. Google’s Farmer Algorithm Is Incorrect

Just because you're on the losing side does not necessarily mean that the entire algorithm is wrong. Google deployed this algorithm after months of testing and collecting user feedback. Why do you think the same engine that sent you thousands of visitors every single day would turn its back all of a sudden?

2. It's Not Just Me, Thousands Are Saying the Same

Yeah right.

Care to row the boat over to the opposite bank of the river? You will find people shouting that Google's Panda algorithm is wonderful and that they are getting three times more traffic than before. Thanks, Google!

3. I think I will register a new domain name and 301 redirect all the pages to my new domain. Someone told me that Google has penalized specific domain names and blacklisted all of them.

This is a crazy idea and should be avoided at all costs.

Do you think the search bots are so foolish that they won't recognize the new domain as related to the older one?

Let me assure you that the domain name is hardly a factor; it's the content, links, reputation, user-friendliness and overall reach that count.

4. Some scraper has copied my article and is ranking ahead of me. Doesn't that sound absurd?

This is a problem, and I have to say that Google is losing its edge here.

First, ensure that your website is not sending "content farm" signals to Google. Most webmasters either don't use canonical URLs in their theme or have the same content accessible via different URLs, which confuses the bots and annoys them over and over again.
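For reference, a canonical tag is a single line in the page's head; the URL below is a placeholder for your own post:

```html
<!-- In the <head> of every URL variant that serves the same article -->
<!-- (e.g. /my-post/?utm_source=feed, /my-post/print/), point the -->
<!-- bots at one preferred URL: -->
<link rel="canonical" href="http://www.example.com/my-post/" />
```

Most blogging platforms can emit this automatically once a canonical plugin or theme option is enabled.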

The only thing you can do here is file a DMCA notice and take down the scraped content yourself. This is indeed very difficult for large sites with thousands of pages, but you have to keep an eye on the scrapers and take them down if they are ranking ahead of you in SERPs.

5. Google says that low content pages are no good for users. Hence, I will delete all those blog posts that are less than 200 words; I don't want low quality pages to hurt my high traffic pages that have valuable content.

Okay, this one is a bit tricky, but first you need to define a few things.

1. What is good content?

2. Does word count play any role in deciding the usefulness of a page?

There is hardly a proper definition of what good content is all about. It varies from one perspective to another, and every website has its own way to benchmark the quality, usefulness and overall usability of the content it creates.

And word count is no measure either.

If one page has 2,000 words on a topic and another has only 300, it does not automatically follow that the former is richer in content.

Enter word padding, keyword stuffing, sentence repetition, citation, irrelevant opinions and user comments.

What I am trying to convey is that the same information, masked under the label of a 1,000-word article, could often have been said in far fewer words.

6. What's the harm in deleting low content pages? I hardly get any traffic to those blog posts, and I think they are hurting my pillar content.

Yes, but at the same time you will lose indexed pages and the PageRank that used to flow through them. I agree that your pillar content is the cannon, but at the end of the day it needs those tiny little matchsticks to fire a shell.

Removing all the low content pages will also result in a good number of 404s, which might break your site architecture, and you will lose the Google juice flowing through those pages. Don't make this mistake right now; you can always hammer down the tree later.

Instead, noindex, follow the archive pages of your blog, which are nothing but collections of content that actually resides on your single post pages.
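A noindex, follow directive is a single meta tag in the head of the archive templates; a minimal sketch:

```html
<!-- In the <head> of tag, category and date archive templates only: -->
<!-- keep these duplicate-heavy pages out of the index, but let bots -->
<!-- follow the links through to the single post pages where the -->
<!-- content actually lives. -->
<meta name="robots" content="noindex,follow" />
```

Most SEO plugins expose this as a per-template checkbox, so you rarely need to edit theme files by hand.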

7. I have linked to a lot of external sites in the past. Since the Farmer algorithm is live for everyone, I must remove all those links as they are draining the PageRank out of my blog.

If you have sold text link ads or linked to untrusted sites that don't produce original content, you might want to remove those links. But don't overdo it and start removing every link from scratch.

Remember, linking to external sites does not necessarily reduce your site's PageRank or authority, nor does it drain the Google juice from your pages in that sense.

8. I didn't pay enough attention to link building earlier. I will contact an SEO guy and buy 50 dofollow text links from authority pages to win back my lost traffic.

I doubt it, and I won't recommend this either.

The search bots can determine whether a link is natural or forced, so you might get an initial thrust, but believe me, if your main product isn't right, every other effort will fail.

Things I Would Recommend Doing

I would recommend performing a routine check on the technical aspects first.

1. Log in to your Google Webmaster Tools reports and fix the crawl errors. 301 redirect the bad links to the actual pages and let the Google juice flow smoothly across your site.
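On an Apache host, a permanent redirect is one line in .htaccess; the paths here are placeholders, and this sketch assumes mod_alias is enabled:

```apache
# Send visitors and bots from a dead URL to the live page with a 301,
# so the link equity pointing at the old URL is passed along.
Redirect 301 /old-broken-post/ http://www.example.com/actual-post/
```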

2. Use Google's URL removal tool and remove the previously crawled 404s.

3. Check your robots.txt file: look for unnecessary directories deeper in your site which you forgot to exclude in robots.txt.
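As a sketch, a robots.txt that keeps crawlers out of thin or duplicate sections looks like this; the directories shown are placeholders, so substitute the ones on your own site:

```
# Block crawling of sections that add no search value.
User-agent: *
Disallow: /wp-admin/
Disallow: /print/
Disallow: /search/
```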

4. Check your code, not just the theme files but also the corresponding output in the browser.

5. Noindex, follow the tag, category and archive pages of your site.

6. Be patient and don't do anything silly just because some problogger wrote a possible fix on his MMO website.

7. There is gold in the archives: log in to your web analytics program and find the pages whose traffic has dropped considerably. Compare these pages with past data and try to find a pattern.
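That comparison can be scripted. A minimal sketch, assuming you have exported two CSV files of visits per page (columns `page,visits`) from your analytics program, one from before the update and one from after:

```python
import csv
from io import StringIO

def traffic_drops(before_csv, after_csv, threshold=0.5):
    """Return (page, old_visits, new_visits) for every page whose
    visits fell by more than `threshold` (a fraction; 0.5 = 50%)."""
    def load(text):
        return {r["page"]: int(r["visits"]) for r in csv.DictReader(StringIO(text))}
    before, after = load(before_csv), load(after_csv)
    drops = [(page, old, after.get(page, 0))
             for page, old in before.items()
             if old > 0 and (old - after.get(page, 0)) / old > threshold]
    # Biggest absolute losses first -- those pages deserve attention first.
    return sorted(drops, key=lambda t: t[1] - t[2], reverse=True)

# Two small example exports, inlined here instead of read from files:
before = "page,visits\n/pillar-post,1000\n/short-note,200\n"
after = "page,visits\n/pillar-post,950\n/short-note,40\n"
print(traffic_drops(before, after))  # [('/short-note', 200, 40)]
```

Swap the inline strings for `open("before.csv").read()` and the like, and adjust the threshold to taste; the point is to surface the hardest-hit pages so you can look for a shared pattern among them.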

8. Remember that this is an algorithm, and it works the same way mathematics does. Even if your current score is only 40, you can still crack 100 out of 100 in the next exam.

At the end of the day, it's you who has to find what's wrong with your site and fix the problems. My saying that this works and that doesn't is just a light in the darkness. You have to analyze your current situation and make decisions. Some of these decisions will look hard and unjustified in others' eyes, but remember that no one else can judge your site and its internal behavior as well as you can.

Author: Amit Banerjee Google Profile for Amit Banerjee
Amit has been writing for Techie Buzz since early 2009 and keeps a close eye on web apps, Google and all things Tech. He also writes at his own tech blog, Ampercent. Follow him on Twitter @ amit_banerjee

Amit Banerjee can be contacted at amit@techie-buzz.com.
  • http://www.webenthu.com webenthu

    Amazing article. Although most of my sites haven't dropped in this update, I am trying to research some points to improve on them. Thanks!

  • Pingback: @WinObs Tweeted Links for 18 April 2011 | WindowsObserver.com

  • http://www.dealpocket.com/ rayan

    At the time of writing, Panda is only hitting US results.

  • http://ashirwaadholidayaptsgoa.blogspot.com/ Rooms in goa

    Amit, I liked the way you closed your article. Even if one of our blogs has been affected by the Panda update, we can still rectify it. Thanks for the insights.

  • Abhijeet

    None of my websites is affected by Panda; here is the secret.

    The secret of not getting bitten by Panda is not to use Google Analytics or Webmaster Tools. When Google Analytics launched, I put it on one client site which was ranking well for some niche keywords; after one month of running Google Analytics, AdWords ads started displaying above my client's top ranking SERP. Since then I have never used Google Analytics or Webmaster Tools for any of my sites or my clients' sites, and none of them are affected by Panda.

    Here is how to recover from Google Panda: remove Google Analytics and Webmaster Tools from your site, remove AdSense for some time and switch to another ad network; slowly your site will start recovering, since Panda cannot sniff your traffic data anymore.

    • http://www.ampercent.com Amit

      @Abhijeet: I can't agree with you, Abhijeet, on the point that Google Analytics and Google Webmaster Tools data is used as a ranking factor in the Panda update. The Google search quality team does not have access to AdWords, AdSense, Google Analytics, Google Webmaster Tools and other Google services; they can't see anything because they don't have access to the data. Instead, the search quality team works universally to improve the quality of search results.

      In your case it might be a coincidence, but I don't think removing Google Analytics and Google Webmaster Tools from your site is going to help, for that matter.

    • Swapan

      What @Abhijeet says makes sense, since a lot of the time we forget that Google is not just the largest search engine; they are also the world's largest ad broker.

      Anyone can try this simple experiment: if you have a web page getting more than 1,000 uniques monthly that ranks well for a niche keyword combination, try adding Google Analytics to the site; within one month, AdWords ads will start appearing above your top ranking SERP keyword combinations.

      Google collects traffic data through Google Analytics, Google AdSense, the PageRank toolbar, the Google Chrome browser and Google APIs. Google does collect lots of user data and uses it against ignoramus webmasters.

      • http://www.ampercent.com Amit

        @Swapan: I agree that Google collects traffic data from the Google toolbar, Chrome and other Google products, but the search quality team has NO ACCESS to Google Analytics and Google Webmaster Tools site specific data.

        Google is a data driven company and they need user data to improve their systems, services and products. But think: if they used Google Analytics and other site specific data to devise an entire algorithm, their results wouldn't be that accurate.

        I agree Google has messed things up in this new algorithm and there have been a lot of consequences of the "Panda effect". But remember that Google's algorithm has to work universally for each and every site, whether they are using Google Analytics or not.

 
Copyright 2006-2012 Techie Buzz. All Rights Reserved. Our content may not be reproduced on other websites. Content Delivery by MaxCDN