Matt Cutts: New Algorithm Will Reduce Google Search Spam, Expect Better Results Soon

A few days ago Google gave some hints on how its web spam team is changing the way web pages are ranked, and on a redesigned "document-level classifier" being implemented in Google's search algorithm. This whole saga of Google search spam was ignited by Jeff Atwood's post at Coding Horror; if you have missed the details, read that article and our response article on the issue.

Earlier today, Google engineer Matt Cutts announced that a new algorithmic change has been launched which will rank content-scraping sites lower in search results. As a result, users are more likely to see the sites that wrote the original content higher in search results.

Matt said that this change was very much targeted and geared towards improving the overall experience of users.

Note that Matt says "lower in search results", not that scraped content will never appear on a search result page.

Let’s quickly take an example.

We announced an Apple iPad giveaway on 23rd December 2010, which received 254 responses and was pretty successful.

Some websites thought it wise to scrape that content and add their own "mix" to the scraped material. They changed the title, modified the URL (thinking that adding the author's name might just work) and repeated the same sentences and keywords over and over in the meta description. The result is something like this:

The first search result points to this site (thanks Google for your algo).

My point is: some users will still click the second and third links and arrive at that scraper website, which has no original content. The scrapers are not organizing the giveaway and can never give away an Apple iPad on behalf of Techie Buzz. (Those who need proof can read this research article on "Eye tracking in search results" (PDF), learn the facts and then comment on this post.)

So what happens is that some users arrive at the scraper website, can't find what they are looking for and quit. They will simply go elsewhere.

The result: we lose those prospective readers who are searching for our website, just because Google showed the scraper sites on search result pages. Agreed, the number of such readers is small, but it's never zero.

Our Suggestion To Google Web Spam Team

Don't show the scraper sites at all. Never. I mean, why should they appear?

Google engineers can easily judge whether content is an exact photocopy of the source website or not, so there is no point in showing these sites on any of the search result pages. Not even on the 99th page.

Learn from Bing

Surprisingly, the same search on Bing shows only one spam link (the third one).

This is just an example, and I am not saying that Bing is better than Google. But as you can see, Bing shows fewer scraper sites when you consider the long tail of search.

@Google: we are praying for better search results.

Facebook Removes Suggest Fan Page To Friends Feature

You might see a considerable drop in the number of fan page requests from Facebook friends the next time you log in to your Facebook account. It turns out Facebook has recently removed the "fan page suggestion" feature for regular members of fan pages.

If you are the administrator of a Facebook fan page, you can still see the "Suggest page to friends" link, while normal members can't. It's gone!

Sadly, this shouldn't stop those who befriend random strangers on Facebook just for the sake of suggesting their website's official fan page to their entire friend list. Page admins can still suggest their fan page to all friends, which should have been removed too.

However, this should put an end to cross fan page promotions, which are often spammy in nature. Mike suggests John's fan page to his friends, John suggests Mike's, and eventually a large group of page admins join hands to spam their friend lists with unnecessary fan page promotion.


For non-bloggers and Facebook addicts, this is a huge loss. You have to manually email the link of the fan page to friends or post it on your Facebook wall.

Note: the removal of the "Suggest" button from fan pages may also be part of a bug fix, as detailed in this post.

What about you? Can you still suggest any fan page to your Facebook friends using the “Suggest” button? Share your thoughts in the comments below.

Google Talks About Search Quality Spam: Improved Results Underway

A few days back we told you how Google search results were being invaded by content farms and "aggregators" who try to manipulate search results with black-hat techniques. Recently, there has been much speculation about the way Google search is populated with content aggregators and scraped websites.

Earlier today, Google Engineer Matt Cutts published a blog post saying “We are listening as more algos and spam detection techniques are on our radar”.

Here is what Matt said:

We have seen a slight uptick of spam in recent months, and while we’ve already made progress, we have new efforts underway to continue to improve our search quality.

Prior to this, Matt explained that Google's search quality is better than before in terms of relevance, freshness and comprehensiveness. Web spam in English is now less than half what it was five years ago, and spam in other languages is even lower than in English.

It's interesting to note that Google rolled out a PageRank update earlier today, and the blog post also discusses a redesigned "document-level classifier" being introduced. This classifier will detect on-page spammy content, junk and the repetitive words that typically appear in blog comment forms, thus making it difficult for spam sites to rank ahead of the original source in SERPs.

In addition, Google has also improved the way hacked sites are shown in search results. Matt says “We’ve also radically improved our ability to detect hacked sites, which were a major source of spam in 2010. And we’re evaluating multiple changes that should help drive spam levels even lower”.

The blog post didn't give any clue about why the scraper sites were able to manipulate search results. However, Matt makes it "crystal clear" that Google takes "action" against sites that violate its webmaster quality guidelines, regardless of whether they are running AdSense ads.

Indirectly, Google clears the air: although it takes a 32% share of all AdSense money from publishers, it is strict about banning folks who try to degrade the overall search experience by running AdSense on scraped sites with no original content.

As webmasters, we can only hope. That’s all.

Oh, and if anybody from Google is reading this, we have a suggestion: showcase the spammers in a new "spam bar" to the right of the search results. Way better!

Hey Google, Can We Have Some Meaningful Results? Please

You don’t use Google because it’s “Google”. You use “Google” because it offers relevant and meaningful results.

AdSense Scraper Sites Joke


Back in the year 2000, web search was largely dominated by short queries and the results were very "generalized"; users had to click through multiple links until they found the exact webpage they were looking for. Search volume was fairly low, so the need for a polished algorithm had not yet arisen.

Also Read: Hey Google, Here is How You Can Make Google News Spam Free

Then came Google, and web search took a significant turn in its "relevancy", "exactness" and "value". Google changed the way folks found content: people started writing blogs and creating information, and Google organized all that information in its "index".

The sad part – Newton’s third law of motion holds true universally.

The reaction: Google's way of ranking webpages was so influential that it gave birth to a giant search marketing industry. The welcome mat for SEO firms was laid out, which consequently gave birth to content farms, aggregators and scraper websites. Their entire approach to producing content works from the outside in: find the phrases that are profitable, judge the competition, gain a good number of links and make your way into the SERPs.

In recent times, there has been much speculation about the way Google search is populated with content aggregators and scraped websites. TechCrunch, VentureBeat and other tech blogs have put down their observations earlier.

It's not only we who are saying this. Jeff Atwood from Coding Horror recently nailed down his observations:

In 2010, our mailboxes suddenly started overflowing with complaints from users: complaints that they were doing perfectly reasonable Google searches, and ending up on scraper sites that mirrored Stack Overflow content with added advertisements. Even worse, in some cases, the original Stack Overflow question was nowhere to be found in the search results!

Why Do Content Aggregators Beat The Source In Search Results?

Let's take a look at the factors that influence the ranking of a webpage in search results.

Google says there are more than 200 signals, but practice shows that the most important factors determining whether your page is going to rank well are:

1. The quality of the content.
2. The number and authority of the links to your content.
3. The title of the page.

Let's say you wrote an informative article on "iPhone cases" and published it on your blog. Since "iPhone cases" is a profitable phrase, this will alert the content aggregators, link farms and scraper sites running AdSense ads. These folks have set up Google Alerts and other instant notifications for whenever a phrase they are targeting gets indexed by Google.

They come to your blog, copy an excerpt from your article, and publish a new post on their "aggregation channels" with a link back to your post.

You think it's cool? "Hey Mike, I just got a backlink from them and they have 60,000 RSS readers. I am famous!"

Yes, a lot of people will read your thoughts, but not on your blog. The majority will read them on the "aggregation channel".

The Result

Since a large portion of the blogosphere is paying attention to that Aggregation channel, who do you think is going to attract more links?

Social Media Influence Aggregators

As soon as the scraped post hits an "aggregation channel", a huge number of blogs will start linking to them (not you). They have an enormous number of social media subscribers, Twitter followers and newsletter readers, and getting six dozen backlinks to their own article is child's play.

Yes, you wrote that 900-word article, but that alone doesn't qualify your post to be linked from other sites. Social media influences search rankings, and these content aggregators use their huge social media influence to dominate SERPs.

Conrad Saam from Search Engine Land performed an internal study of profitable keywords to find out which search engine (Google or Bing) returns more relevant results. The SEL team followed this convention:

  • 5 points were awarded for a good quality result ranking first, 3 for second and 1 for third.
  • 2 bonus points were added for top 3 results being on a highly authoritative site.
  • 5 points were subtracted if the entire first page didn’t contain any good results.
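For concreteness, the scoring convention above can be sketched as a small function (a hypothetical helper for illustration; SEL's actual tally was presumably done by hand):

```python
def sel_score(good_results, authoritative_top3=False):
    """Score one search results page using the SEL convention.

    good_results: list of booleans for positions 1..N on the first page,
    True meaning the result at that position was good quality.
    """
    position_points = {0: 5, 1: 3, 2: 1}  # points for a good result at positions 1-3
    score = sum(points for pos, points in position_points.items()
                if pos < len(good_results) and good_results[pos])
    if authoritative_top3:
        score += 2   # bonus: top 3 results sit on a highly authoritative site
    if not any(good_results):
        score -= 5   # penalty: no good result anywhere on the first page
    return score
```

So a page with good results in positions one and three, for example, scores 5 + 1 = 6 points.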

Search Engine Ranking Authority

It's no wonder that Google is losing its ground on relevancy and exactness. Richard MacManus from ReadWriteWeb offers an interesting comment on why blogs and search engines need to embrace a "change":

I can only hope that Google and other search engines find better ways to surface quality content, for its own sake as well as ours. Because right now Google is being infiltrated on a vast scale by content farms.

That pretty much sums it up. Google has been promoting the leechers more than the actual publishers, and we ourselves have been victims of this problem. You can read more about it in this post.

What is your opinion of the quality of search results in Google? Are you happy with them? Do you think that they need to improve? Are you a victim of RSS scrapers? Don’t forget to tell me your thoughts through your comments.

Google Releases API For URL Shortener

Some good news for developers and coders who use Google's URL shortening service and always wanted an API for their web applications.

Google has recently released an API for goo.gl which allows users to integrate Google's URL shortener into their web applications, blogs or websites. You can use simple HTTP methods to create, inspect, and manage short URLs from desktop, mobile, or web applications.

Google's URL shortening service, goo.gl, is one of the fastest URL shorteners out there. Its only near competitor is bit.ly, which also provides analytics for shortened URLs apart from providing its own API to developers. It looks like Google wants to level the playing field in the URL shortening market by giving coders and developers the ability to integrate the URL shortening service into their products and apps.

Getting Started With The goo.gl API

The getting started page at Google Code lists all the step-by-step details for developers who want to use goo.gl in their web properties. First, you will need to get your API key from the API console page; the key identifies your application when you pass different arguments or parameters. Here is how the API page looks:

Scroll down to the bottom of the page, find the URL Shortener API section and hit the "Activate" button. All done; you will be given a unique authentication URL as shown below:

After the authentication part is complete, you can head over to the Actions page to learn how to use the URL shortener API and choose the different actions required by your application.

For development purposes, you can issue API calls without a developer key, but using a key will grant you much higher usage limits. The advantage of using the API is that apart from shortening and expanding long URLs within your application, you can also fetch the history and analytics of shortened URLs. Common examples include auto-shortening URLs from a custom Twitter client or shortening the long link of a blog post; the possibilities are endless.
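As a rough sketch of what an API call looks like: the shortener's v1 REST endpoint accepts a JSON POST to create a short URL and a GET with a `shortUrl` parameter to expand one. The helper functions below are our own illustration, not official client code; only the endpoint and parameter names come from Google's documentation.

```python
import json
from urllib.parse import urlencode

API_ENDPOINT = "https://www.googleapis.com/urlshortener/v1/url"

def build_shorten_request(long_url, api_key=None):
    """Build the target URL and JSON body for a 'shorten' POST request."""
    url = API_ENDPOINT
    if api_key:  # the key is optional during development, but raises usage limits
        url += "?" + urlencode({"key": api_key})
    body = json.dumps({"longUrl": long_url})
    return url, body

def build_expand_request(short_url, projection=None):
    """Build the GET URL that expands a short URL; projection='FULL'
    also requests the click analytics for that link."""
    params = {"shortUrl": short_url}
    if projection:
        params["projection"] = projection
    return API_ENDPOINT + "?" + urlencode(params)
```

You would POST the body with a `Content-Type: application/json` header using any HTTP library, then parse the `id` field of the JSON response to get the short link.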

Do give the goo.gl API a try and let us know your ideas in the comments section. [via Google Code blog]

MiniBin Lets You Manage The Windows Recycle Bin From The System Tray

Need to quickly empty the Recycle Bin and can't find the Recycle Bin icon on the Windows desktop? MiniBin is a freeware utility which lets you manage the Windows Recycle Bin through a nifty little system tray icon.

Whenever you delete files and folders in Windows, they are moved to the Windows "Recycle Bin" folder so that you can recover them at a later point in time. If you are the type of person who deletes a whole bunch of files every single hour and can't find the "trash" folder among the 52 icons on your Windows desktop, MiniBin might just save the day.

There aren't many options though: you can either empty the Recycle Bin by right-clicking the system tray icon or choose to open the Recycle Bin folder, as shown below:

The application is useful when you are working with a lot of open applications and hate minimizing each of them just to find the Recycle Bin icon on the desktop, right-click it and empty the trash, and then restore all the minimized applications to their previous state. Keyboard ninjas might make good use of "Win+D", but folks who never want to learn a keyboard shortcut will find MiniBin useful. [via]

Also read: iBin – a portable recycle bin for your USB drive

Techie Buzz Verdict

MiniBin is a portable app and thus will work on any computer from a removable drive without any installation. If you are a coder who always deletes a lot of code only to restore it at a later point in time, get MiniBin today. Useful!

Techie Buzz rating: 3 (Good).

CloudShopper Lets You Gather Shopping Advice And Gift Ideas From Facebook Friends

Planning to buy some gifts this New Year? How about searching for gift items at Amazon, preparing an online wish list and asking your Facebook friends for their ideas and suggestions?

Meet CloudShopper

CloudShopper is a fairly new website which attempts to fill the gap between building wish lists of online gift items and gathering suggestions from your social circle. It works quite simply: go to the website, connect your Facebook account, search for a product and post the link on your Facebook wall.

Your friends and followers can immediately see your wall post and comment on it as they usually do. CloudShopper aggregates all these comments to your CloudShopper dashboard, so that you can read the suggestions posted by your Facebook friends and archive the recommendations for future use.

Think of CloudShopper as a personal review page where you can aggregate the suggestions, comments and reviews made by your Facebook friends, without having to search for an older wall post that attracted 42 comments a month ago.

Another neat thing about CloudShopper is that you can find new as well as used products from Amazon and compare their prices directly from your CloudShopper account. The site also allows you to share updates with selected Facebook contacts, so if you want to buy a gift for your parents, you can include only your family members' Facebook profiles.

The "Friends" tab shows a series of wall posts and links posted by your Facebook friends on their profiles, so you always know who is buying what and can offer your geekiest suggestions. Want to buy a group gift for a common Facebook friend? Create a CloudShopper list, invite all the group members, post your product links and comments on the list page and let the conversation begin!

Overall, CloudShopper is a nice way to plan and gather feedback about a specific product from your social circle. The site is only a month old, so friend recommendations, likes and other Facebook goodness are expected to roll out soon. Give it a try!

Fake Facebook Profile? Maybe It’s Your Real Life Facebook Look Alike

Facebook has an official page where you may report an account that is impersonating your Facebook profile by stealing your photos, profile information and other personal assets.

In case your Facebook profile has been impersonated, go to the imposter's Facebook profile page, scroll down and click the "Report Abuse" link. Then add all the necessary information and the reason for reporting the profile as "spam", and wait for the Facebook authorities to do their job.

But before you go ahead and hit the "Report Abuse" button, be sure that the profile you are reporting is not your real-life Facebook look-alike. A Facebook double, literally.

This is exactly what happened to two photographers, Graham Comrie (left) and Graham Cormie, who found their real-life doubles on Facebook.

The story goes like this:

1. Graham Comrie feared that someone was impersonating him online when a friend emailed him saying an unknown person was pretending to be "him" on Facebook.

2. The impersonator was even using his photos and the profile image Comrie had on his own Facebook account.

3. Graham Comrie started investigating the impersonator's account until he found out that the person was real, and actually his Facebook look-alike.

4. Comrie contacted his Facebook look-alike and, to his amazement, found out that the two of them shared quite a few things in common apart from their names, faces and profession.

Here are the similarities between Graham Comrie and his Facebook look-alike, Graham Cormie:

  • Both men are the same age.
  • Their names contain exactly the same letters, with a slightly different arrangement in the surname: Comrie and Cormie.
  • Both are photographers by profession.
  • Both men have red-headed wives, and both couples are due to celebrate their silver anniversaries next year.
  • Both have two daughters who own Lhasa Apso dogs as pets.

Too many coincidences. Just too many!

Graham Cormie told the Daily Record:

It's all very confusing. I too thought someone was impersonating me. We were both getting emails from the wrong people, asking "When are you coming in to take my photo?" or "When are we doing this shoot?", and I'd never heard of them before.

When I saw Graham's picture, I was completely gobsmacked. We could've been separated at birth. I call him my nemesis.

If you ask me, the photos do not appear to be exact look-alikes. The facial attributes are close, but the other details are really very surprising. I just hope there ain't any Facebook double of mine elsewhere on the planet. :-)

Packarti Automatically Bookmarks Your Tweets On Delicious, So You Can Find Them Later

Do you tweet a lot of links all day long?

Ever wondered whether there is any service which can automatically bookmark your Tweets that contain links?

Enter Packarti, a brilliant web service which makes bookmarking tweets as easy as child's play. Using Packarti, you can bookmark all of your tweets that contain links, convert Twitter hashtags to Delicious tags and add a unique Delicious tag to each tweet. Everything on autopilot.

Wait… isn't Delicious shutting down? No, it isn't; the service will continue to run as before.

How To Bookmark Tweets That Contain Links In Your Delicious Account

Following are the steps involved in getting started with Packarti and connecting the service to your Delicious account:

1. Head over to the Packarti homepage and sign in with your Twitter account. After you have granted permissions, you will be asked to enter an email address where the service can send you future notifications.

2. Enter your email address and hit "Save and Continue". Go back to your email inbox and click the verification link to continue to the next step.

3. Now you will be asked to connect your Delicious account with Packarti. There are two types of Delicious accounts: the independent Delicious account and the older Delicious account, which works with a valid Yahoo ID.

Make sure you choose the correct type of Delicious account, or else the service won't be able to bookmark your tweets. As for me, I have a Yahoo-linked Delicious account, hence I chose the first option.

4. On the next page, hit the “Authorize” button.

5. All done; now you will be redirected to the settings page, where you can tweak certain options as shown underneath:

There are basically five options which you may configure:

  • Expand URLs: Selecting this checkbox will automatically expand any shortened links embedded in your tweets before posting them to your Delicious account.
  • Replace Bookmarks: Self-explanatory; selecting this checkbox will automatically remove duplicate bookmarks from your Delicious account.
  • Convert Hashtags: This setting controls whether Twitter hashtags are converted to Delicious tags.
  • Remove These Tags: If you use a lot of hashtagged tweets, you may define some hashtags to be ignored by the service.
  • Add A Particular Tag: You may define a tag in this text box to make sure each link sent via Packarti is accessible through that particular bookmark tag.
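The hashtag-related options boil down to a simple transformation. Something along these lines (our own illustrative sketch, not Packarti's actual code) would convert a tweet's hashtags into Delicious tags while honoring an ignore list:

```python
import re

def hashtags_to_tags(tweet, ignored=()):
    """Pull hashtags out of a tweet and turn them into Delicious-style
    tags (lowercase, no '#'), skipping any tags in the ignore list."""
    hashtags = re.findall(r"#(\w+)", tweet)
    ignored = {t.lower() for t in ignored}
    return [t.lower() for t in hashtags if t.lower() not in ignored]
```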

All done! But don't forget to hit that "Save" button once you are finished with the settings.

Now you can continue using Twitter and Delicious as before. Any of your tweets which contains a link will be automatically saved as a bookmark in your Delicious account.

For example, I posted this tweet about an Easter egg in the Google Cr-48 netbook, and it was immediately sent to my Delicious account.

Techie Buzz Verdict

Lifesaving !

Using Packarti, I can automatically bookmark and archive all the links I tweet every day. Later, I can simply export the bookmarks from Delicious and import them into Google Bookmarks or my preferred browser.

Techie Buzz rating: 5/5 (Perfect).

Customize Firefox's Default Download Manager With Download Manager Tweak

Firefox’s default download manager is great but it has a few limitations.

There are only a few options provided, e.g. the list of downloaded items, a clear-list button and search. If you are a power user looking for a better, enhanced download manager for Firefox, try Download Manager Tweak.

Using Download Manager Tweak, you get more options like "download them all", a status bar for file downloads, a customized toolbar with launch, "Remove from list" and "Delete files" buttons, download queues, etc.

You can keep an eye on download progress, pause specific items, and open the directory where downloaded files are saved. After installing Download Manager Tweak, Firefox's default download manager will look like the following:

As you can see, there are more advanced options provided at the top of the download window, as well as to the right of each download item. You can open the download list in a new tab, in Firefox's sidebar, in the tab bar or in a new browser window.

The biggest advantage of the Download Manager Tweak extension is the ability to re-download items which were deleted by mistake. Simply select the name of the file or item from the list and click the download button which appears automatically when the file is selected.

The Settings tab provides a couple more options which you may tweak. You can choose a custom delay before file downloads start, limit the number of download items shown by days, and so on. Should you want to show only specific buttons on the download window, simply select them in the add-on options, as shown below:

Techie Buzz Verdict

If you download a lot of files but don't use a dedicated download manager or accelerator, it's a good idea to install this extension and add more features to Firefox's default download manager. [via]

Techie Buzz rating: 3 (Good).