Google Starts Indexing Facebook Comments

Facebook launched its commenting system for websites and blogs a few months back, which allows website owners to add a commenting system to their blog posts. This enables Facebook users to post comments on blogs and websites using their Facebook accounts.

Similar third-party comment management systems, like Disqus and IntenseDebate, make it easier for visitors to post their comments and points of view. However, these commenting engines are implemented in JavaScript, so the comments may not be indexed by search engines.

According to Digital Inspiration, comments added using the Facebook commenting system or equivalent services are now searchable, giving website owners additional SEO juice from the added content, which could play an important role in a website’s search ranking.

Googlebot, the spider that crawls web pages, now reads Facebook comments on websites just like any other text content, and the more interesting part is that you can also search the text of these comments using a regular Google search.

This means that, just like your web content, the comments on your blog or website will be indexed by Google. Searching for a commenter’s name along with a phrase from their comment will surface the comments that person has written on different websites using Facebook, Disqus or other commenting services.

This comment by Robert Scoble, posted on a TechCrunch page using the Facebook comments system, appears to be indexed by Google:

Google Indexes Facebook Comments

Well, the same was confirmed by Google’s Matt Cutts in a tweet: “Googlebot keeps getting smarter. Now has the ability to execute AJAX/JavaScript to index some dynamic comments.”

Although this may help blogs and websites rank better, site owners must be extra careful and should consider moderating comments before they are published. Since comments will now be considered part of the content, website owners must watch out for foul language, spam, hate speech and the like, which would directly violate Google’s Webmaster Guidelines and might negatively affect the site’s search ranking.

Prior to this tweak by Google, Facebook gave website owners an option to pull comments using the Facebook Graph API and render them in the body of the blog post behind the comments box. This led to the creation of a WordPress plugin, “SEO Facebook Comment”, which inserts all Facebook comments into your WordPress database based on the Open Graph tags.

Many website owners avoided using the Facebook comment box because of the SEO factor; however, after this little “tweak”, I’m sure they will reconsider their decision.

How to Add a Facebook Comment Box?

Adding a Facebook comment box is pretty easy. Just head over to Facebook’s Social Plugins page and provide the URL of your site. Enter the number of posts to display and the width of the comment box. Once done, hit the “Get Code” button. Copy the generated code and place it appropriately on your website.
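
The generated snippet looks roughly like the sketch below; the URL, post count and width here are placeholder values, so always use the exact code Facebook generates for your own site:

<div id="fb-root"></div>
<script src="https://connect.facebook.net/en_US/all.js#xfbml=1"></script>
<!-- placeholder URL and settings; replace with the values from the Get Code dialog -->
<div class="fb-comments" data-href="http://www.example.com/my-post/" data-num-posts="10" data-width="500"></div>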

Facebook Comment Box

You can also moderate comments, blacklist words and ban users. To moderate comments, you need to list yourself as an admin. To do this, simply include the following Open Graph meta tag on the URL specified as the href parameter of the plugin.

<meta property="fb:admins" content="{YOUR_FACEBOOK_USER_ID}"/>

You can then moderate all comments from Facebook’s comment moderation tool.


Trying To Recover From Google’s Panda Algorithm? Some Mistakes You Should Avoid

If you have a couple of MFA (made-for-AdSense) sites and you specialize in auto-blogging software for producing content, please skip this article.

So here we are: Google’s new wild animal (read: Panda) is out of its cage, and there has been a lot of speculation on the effects it has brought. As before, some webmasters are on the winning side while others have lost a significant proportion of their traffic and rankings. Google did announce the new algorithm in an official blog post, explaining how it is geared towards improving user interaction and the quality of sites in Google search results.



Back in December 2010, we told you how spam sites were polluting Google search results: typically irrelevant content, little information, scraping, aggregating stuff from other sources and gaming the game of search to get ranked in SERPs.

Facing immense pressure from a wide variety of sources, Google had to do something about search spam.

They first introduced a Chrome extension which allowed users to blocklist specific websites from search results, and then released another feature which allowed users to blocklist sites directly from the search result pages. This move was most likely deployed to observe user behavior for specific queries and sites, and to cross-check whether the results produced by the upcoming algorithm went hand in hand with the collected data.

Tip: You can read our earlier tutorial to check whether your site is a victim of Google’s Farmer update.

After Google’s Farmer Update

There are two possible scenarios: either you’re on the winning side or you’re on the losing one.

I am not going to discuss the details of why a specific site was penalized, or go over good content, trustworthy links and the other factors influencing the sudden fall or rise of specific sites. If you’re a blogger or web publisher, chances are that you have already done your homework and know all the basic stuff.

Instead, I want to shed some light on the things you should not do while trying to recover from Google’s Farmer or Panda algorithm.

Some possibly wrong assumptions:

1. Google’s Farmer Algorithm Is Incorrect

Just because you’re on the losing side does not necessarily mean that the entire algorithm is wrong. Google deployed this algorithm after months of testing and collecting user feedback, so why do you think the same guy who sent you thousands of visitors every single day would turn its back on you all of a sudden?

2. It’s Not Just Me, Thousands Are Saying the Same

Yeah right.

Care to row the boat and go to the opposite end of the river? You will find people shouting, “Google’s Panda algorithm is wonderful, we are getting 3 times more traffic than before. Thanks, Google!”

3. I think I will register a new domain name and 301 redirect all the pages to my new domain. Someone told me that Google has penalized specific domain names and blacklisted all of them.

This is a crazy idea and should be avoided at all costs.

Do you think the search bots are so foolish that they won’t recognize the new domain as being related to the older one?

Let me assure you that the domain name is hardly a factor; it’s the content, links, reputation, user-friendliness and overall reach that count.

4. Some scraper has copied my article and is ranking ahead of me. Doesn’t that sound absurd?

This is a real problem, and I have to say that Google is losing its edge here.

First, ensure that your website is not sending “content farm” signals to Google. Most webmasters either don’t use canonical URLs in their theme or have the same content accessible via different URLs, which confuses the bots and annoys them over and over again.
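
As a minimal sketch (the URL below is a made-up example), a canonical tag goes in the head of every duplicate version of a page and points search engines at the preferred URL:

<!-- tells search engines which URL is the original, preferred version of this content -->
<link rel="canonical" href="http://www.example.com/original-post/" />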

The only thing you can do here is file a DMCA notice and take down the scraped content yourself. This is indeed very difficult for large sites that have thousands of pages, but you have to keep an eye on the scrapers and take them down if they are ranking ahead of you in SERPs.

5. Google says that low-content pages are no good for users. Hence, I will delete all those blog posts that are less than 200 words; I don’t want low-quality pages to hurt my high-traffic pages that have valuable content.

Okay, this one is a bit tricky, but first you need to define a few things.

1. What is good content?

2. Does word count play any role in deciding the usefulness of a page?

There is hardly a proper definition of what good content is all about. It varies from one perspective to another, and every website has its own way of benchmarking the quality, usefulness and overall usability of the content it creates.

And neither is word count a reliable measure of it.

If one page has 2,000 words on a topic and another has only 300 words, that does not automatically guarantee that the former is richer in terms of content.

Enter word padding, keyword stuffing, sentence repetition, citation, irrelevant opinions and user comments.

What I am trying to convey here is that the same information can be masked under the label of a 1,000-word article when it could have been said in far fewer words.

6. What’s the harm in deleting low-content pages? I hardly get any traffic to those blog posts, and I think they are hurting my pillar content.

Yes, but at the same time you will lose the number of indexed pages and the PageRank which used to flow through those pages. I agree that your pillar content is the cannon, but at the end of the day it needs those tiny little matchsticks to fire a shell.

Removing all the low-content pages will also result in a good number of 404s, which might break your site architecture, and you will lose the Google juice flowing through those pages. Don’t make this mistake right now; you can always hammer down the tree later.

Instead, apply noindex, follow to the archive pages of your blog, which are nothing but collections of content that actually resides on your single post pages.
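
As a rough sketch, the robots meta tag placed in the head of an archive page would look like this; it tells search engines not to index the page itself while still following the links on it:

<!-- keep the page out of the index, but let bots follow its links -->
<meta name="robots" content="noindex, follow" />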

7. I have linked to a lot of external sites in the past. Since the Farmer algorithm is live for everyone, I must remove all those links, as they are draining the PageRank out of my blog.

If you have sold text link ads or linked to untrusted sites that don’t produce original content, you might want to remove those links. But don’t overdo it and start removing every link indiscriminately.

Remember, linking to external sites does not necessarily reduce your site’s PageRank or authority, nor does it drain the Google juice out of your pages in that sense.

8. I didn’t pay enough attention to link building earlier on. I will contact an SEO guy and buy 50 dofollow text links from authority pages to make up for my lost traffic.

I doubt it, and I won’t recommend this either.

The search bots can determine whether a link is natural or forced, so you might get that initial thrust, but believe me, if your main product ain’t right, every other effort will fail.

Things I Would Recommend Doing

I would recommend performing a routine check on the technical aspects first.

1. Log in to your Google Webmaster Tools reports and fix those crawl errors. 301 redirect the bad links to the actual pages and let the Google juice flow smoothly across your site.

2. Use Google’s URL removal tool to remove the previously crawled 404s.

3. Check your robots.txt file and look for unnecessary directories deeper in your site which you forgot to exclude from crawling.

4. Check your code, not just the theme files but also the corresponding output in the browser.

5. Noindex, follow the tag, category and archive pages of your site, using the same robots meta tag shown earlier.

6. Be patient and don’t do anything silly just because some problogger wrote about a possible fix on his MMO website.

7. There is gold in the archives: log in to your web analytics program and find the pages whose traffic has dropped considerably. Compare these pages with past data and try to find a pattern.

8. Remember that this is an algorithm, and it works the same way mathematics does. You can score 100 out of 100 in the next exam even if your current score is only 40.

At the end of the day, it’s you who has to find what’s wrong with your site and fix the problems. Me saying that this works and that doesn’t is just a light in the darkness. You have to analyze your current situation and make decisions. Some of these decisions will look hard and unjustified in others’ eyes, but remember that no one else can judge your site and its internal behavior as well as you can.

Facebook Comments Plugin Could Deprive Your Site Of Google SEO Juice

Facebook recently launched a new commenting system for websites and blogs which allows website owners to add a commenting system to their blog posts.

No SEO Facebook Comments

The idea is very good considering that Facebook now has around 600 million users, and it allows a user to comment with ease while also letting them post those comments to Facebook itself. The Facebook comments plugin also has a lot of other nice features, like automatic comment ranking based on likes and comments, among other things.

However, the new Facebook commenting system also comes with a catch: the comments are only accessible to Facebook and cannot be indexed by Google or any other search engine for that matter.

Facebook Comments Bad for SEO and Webmasters

Commenting plays a big part on any blog, allowing users to discuss a topic or add their own opinions to it. In addition to that, comments are also an important part of a website’s SEO, because they add value to it and Google often uses them while ranking a webpage in its search results.

A very insightful post on Blind Five Year Old discusses this issue in depth and talks about how users’ comments are owned by Facebook. In fact, I don’t think a webmaster even has a way to export their comments from Facebook and integrate them into their own backend. This is a definite put-off for me, since I would like to have control over the data on my site.

Of course, Facebook will more likely than not address these issues in the future, but it might take time. As of now, I feel that the Facebook comments plugin makes more sense on static websites which do not have a commenting system. I would definitely have loved to try out the Facebook comments plugin on this site, but these two pitfalls seem too big to ignore.

Do you use the Facebook comments plugin? Would you use it in the future, considering that you might lose out on Google rankings or not be able to re-import the comments to your blog? Do let me know through your comments.

Why Commenting On Do Follow Blogs Or Allowing Do Follow Comments Is Not A Good Idea

“Listen, we need to increase PageRank, and to do that we will need incoming links to our blog posts. If no one links to us, let’s build a giant list of blogs which have the CommentLuv plugin installed or which allow dofollow comments.

Once we have them bookmarked, let’s publish a blog post and start commenting on all the sites one by one.”

This is a crazy idea, and there are reasons why you should avoid commenting on “dofollow” blogs to “try” to build up PageRank, or allowing dofollow comments on your own website in order to make it “sticky”, so to speak.

When You Allow Do Follow Comments On Your own Site

Let’s first think from the webmaster’s perspective and analyze what happens when you allow dofollow comments on your own blog. Frankly, most bloggers who allow dofollow comments think that doing so will encourage their site visitors to post more comments and spread the word, which hardly ever happens.

When you allow dofollow comments, you are passing PageRank from your pages to one or more external sites which may or may not have any relation to the context of your blog post, product, service, etc.

So if one of your blog posts receives 20 comments and you are using the CommentLuv plugin or something similar, you are actually passing PageRank to those 20 external pages. Out of these 20 pages, chances are that only a handful are related to the actual content of your article.
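
To illustrate with a made-up example (the URL and anchor text are placeholders), this is the difference between a dofollow comment link and the standard nofollow link WordPress normally outputs for commenters:

<!-- dofollow: no rel attribute, so PageRank flows to the commenter's site -->
<a href="http://www.example.com/latest-post/">Joe's Blog</a>

<!-- nofollow: tells bots not to pass PageRank through this link -->
<a href="http://www.example.com/latest-post/" rel="nofollow">Joe's Blog</a>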

When Googlebot crawls through those “dofollow” links, the Google juice of your current page is diluted. In the worst case, the bots may have this reaction:

“Hey, this guy is linking to 20 different sites which have no relation to the content of the page in question. Moreover, he has 350 blog posts, and all these 350 posts link to unrelated sites and blog posts. Examples: cheap hotels, best student courses and scuzzy links to MFA sites.

Why is this site suddenly linking to so many external sources, without any relevancy or value to the user? What’s going on?”

The human reaction is of course a bit different.

“Hey Mike, you might want to bookmark this site and call Joe to post 2-3 comments on every single page whenever we have a new blog post up and running. You see, we get incoming links for free!

Just pay Joe some coffee money and ask him to comment in any case; whether or not he knows what he is saying, I don’t care. I just need those links.”

So typically, your blog post got 52 comments but only a few of them are legitimate.

Remove that dofollow system and all the so-called “popularity” will vanish in a flash. You’re living under the false impression that your site is getting love from the community. Wrong! They are scavengers who can’t hunt; they only feed on the leftovers.

I have seen some bloggers doing this and raising their collars: “Whoa! I have 200 posts and 8,000 comments. Surely the dofollow system worked for my site.”

Mr. Forced Dofollow Blogger, that just doesn’t make any sense. You’re just inviting the link grabbers and screwing yourself in the bigger picture.

When You Comment On Do Follow Blogs Trying to “Win” PageRank

Now this is where you think that you are the sole gainer. You “Think”, that is.

You comment on Do follow blogs, get a few backlinks to your “linkbait” article and patiently wait to see whether your blog post gets more traffic because of increased rankings.

When I started out two years back, I did the same thing. I bookmarked a whole bunch of sites which allow dofollow comments or have CommentLuv installed. Then I posted comments on some of these sites to see whether or not they had any impact on my rankings.

One month, two months, 4 months. Nothing happened.

Oh yeah, I used to get notifications in the WordPress dashboard under “Incoming links”, but this didn’t have any effect on my site’s search traffic.

You’re Not the Only One Getting Linked From The Do Follow Site

You’re not the only one getting linked from the target site because of the CommentLuv plugin. Chances are that the same page will receive more than 50 or so comments and your comment will lie somewhere in the middle.

So the same page is linking to 49 other sites, and the overall PageRank that passes through each of these 50 external links is negligible. And with time, it will deteriorate, of course.

So the effective result is that only a negligible amount of link juice is passed to your site’s latest blog post.

Relevancy Is Almost Impossible To Achieve

So Mr Problogger posted about a topic and you rushed in to comment because his blog is a do follow one. His page is all about “iPhone news” while your latest post is about “soccer”.

Does getting a link from that target page matter? The thing is, Googlebot can determine whether this link is worth anything or not.

What’s the Certainty?

When the target site removes the dofollow comments and all the CommentLuv links, you lose all those incoming links.

If someone is linking to you from the body of the article, he won’t remove it unless of course your site goes down after a few months. But with these Plugins, you never know.

The Position Of The Link Matters

A link placed within the body of an article carries much more weight than a link in the comments or in the blogroll. Rand Fishkin from SEOmoz has come up with a great study involving dofollow links, anchor texts, alt attributes and how the position of a link matters when you’re talking PageRank and Google juice.

So you get the idea.

One good link from the body of the article is much more valuable than 100 do follow links from the comment form. At least the former one is legitimate while the latter is forced.

Matt Cutts On “Do Follow Comments”

I am sure many will disagree with my point, but here is Google engineer Matt Cutts’ explanation of dofollow comments and their overall effect.

How Google Was Gamed To Get Traffic To JC Penney

The NYTimes has uncovered a pretty significant link scheme that was used to make JC Penney the number 1 search result for a number of keywords, including “bedding”, “dresses” and other household products. Even though JC Penney claims no knowledge of it, the scheme has been going on for the last 3-4 months and worked very well during the peak holiday season.

One of the things that Google bases search engine ranking on is the number of incoming links to your website. However, the links should be genuine and come from genuine websites; even a slight deviation from these guidelines can land you in a lot of trouble, even if you are as big a brand as BMW. There are certain SEO strategies that have been termed Black Hat, and it is not worth getting involved in them, as JC Penney has recently learned.

According to the investigative report, over 2,000 pages with little or no content related to clothing posted links pointing to the dresses section of the JC Penney website, including many sites that had no other content at all. Owners of some of these websites were paid rewards or even cash by link exchange companies. Once this scheme was discovered by a researcher hired by the NYTimes, the results were forwarded to Google’s webspam team, headed by Matt Cutts. Since there is no direct proof that JC Penney was actually involved in the shenanigans, its website was not removed from Google’s index; however, corrective action was taken that pushed the JC Penney website down from #1 to the 5th or 6th page in many instances.

More interestingly, JC Penney has already been penalized a couple of times by Google in the past few months for violations of its guidelines. Even though Google has a team of experts continuously monitoring and putting solutions in place to stop such schemes, it is clear that, given their limited resources, they can only stop so much. This gives encouragement to many Black Hat SEO experts who are paid big bucks to land a premium spot on Google.

How To Find Backlinks To A Website

Anybody who knows about search engine optimization understands how important high-quality backlinks are to a website: the more backlinks you get, the higher you rank for a search engine query. But how does one go about seeing what backlinks a specific web page or website is receiving?

Here is a little tool to help you with that. Backlink Watch is a crappy-looking but very useful little tool that gives you detailed information about the backlinks for any website. Just enter the URL and wait for the report to generate. The tool will tell you the total number of backlinks and then give details of each link, including the PageRank of the linking website as well as the number of outbound links the linking website has.

The tool also raises a flag if any special attribute, such as nofollow, is used on a link. The report also displays the exact anchor text used by the referring website. You can click on any of the links to go to the actual referring web page.

Techie-Buzz Verdict:

Even though Backlink Watch has a poor interface, it is a highly functional and useful tool to get backlink information for any website. It can be used as a first step to analyze the number and quality of backlinks a website is getting so appropriate steps can be taken to optimize the search engine ranking.

Rating: 3/5

Comment Spam Can Hurt Your Search Engine Ranking

If you go to any large blog, you will see comments that don’t say anything useful but have one or two links pointing to the commenter’s own website. If you are not sure what I am talking about, a typical example is a one-line “Great post!” comment with a couple of keyword-rich links pointing back to the commenter’s site.

People think that these sorts of spammy comments will actually increase their ranking in the search engines. Google explains that this is not the case. Google in fact has a lot of algorithms in place that can identify these spam comments and devalue the links posted through them. Webmasters should instead spend that time on genuine ways to increase traffic to their website. If you have previously posted such comments on other blogs, you can always try to delete them and submit your website for reconsideration.

In short, only post a comment if you have something worthwhile to say. It is also wise for webmasters to protect their site or blog from spam by using comment moderation, CAPTCHAs or other strategies.

Optimizing Images for Organic SEO Traffic

Images can speak a thousand words. This quote is so true. However, there is a condition to it: images can only speak a thousand words if someone can see them. That, though, is not the case with Google Image Search robots, or for that matter any other image search engine robot.

Optimizing images for SEO involves adding proper metadata and giving the images proper names. So naming images as 1.jpg or 2.jpg will not earn you any points from these blind robots.

According to Matt Cutts, Google Image Search considers the metadata in images as well as the URL of the image. However, as always, he also advises against using black-hat SEO techniques and spamming the URL with too many keywords.

Watch a short video to see how Google considers the URL of an image to include it in organic image search results.

In addition to that, also consider using alt attributes on images. Search engines like Google look at the alt text to understand and get more information about an image.
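
A quick sketch (the file name and alt text here are made up) of an image that gives the search robots something to work with, instead of a bare 1.jpg:

<!-- descriptive file name plus alt text (hypothetical example) -->
<img src="golden-gate-bridge-sunset.jpg" alt="Golden Gate Bridge at sunset" width="600" height="400" />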

Are you optimizing your images for search engines? Do let us know, and don’t forget to check our huge archive of SEO articles to enhance and optimize your website.

Google Says Meta Keywords Are Useless

If you are an SEO, or have at least been doing some SEO for your website or blog for a long time, you might already know that Google ignores meta keywords.

The Google Webmaster blog today made an official announcement stating that they do not consider meta keywords, doing away with a very old SEO myth that keywords can help you gain more traffic.

This was done for obvious reasons, as many SEO consultants started spamming search engines by adding spam keywords to webpages. If you have been thinking that adding meta keywords is going to help you, you now know that it is practically useless.
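
For illustration, here is a hypothetical head snippet showing the tag Google ignores next to one it still reads; the meta description is not a ranking factor either, but it is often used as the snippet shown in search results:

<!-- ignored by Google for ranking -->
<meta name="keywords" content="seo, traffic, backlinks, pagerank" />
<!-- not a ranking factor, but often shown as the search result snippet -->
<meta name="description" content="A short, accurate summary of what this page is about." />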

Watch the video below, in which Google’s webspam head talks about meta keywords and why Google does not consider them.

For a complete list of meta tags that Google supports visit the meta tags webmaster help page.

Rank Buzz: Domain Information and SEO Tool

Rankbuzz is a web-based domain information and SEO utility which provides you with a number of useful pieces of information about a website: for instance, PageRank, Alexa rank, social links, Twitter mentions, number of indexed pages, site value and so on.

Just enter the URL of a website and hit the ‘Go’ button to get useful information about it. For each statistic displayed, the tool also provides useful suggestions and comments.

Rankbuzz can serve as a handy way to quickly look up domain information about any website. Its key features include:

  • Social links chart
  • Indexed pages chart
  • Social links
  • Twitter mentions
  • Alexa graph
  • On-site information
  • Site value
  • Quantcast, Technorati and Compete rankings.