Google eBooks Gets New Features – Translate, Define And Search

Google has integrated a contextual menu into Google eBooks that lets you translate, define, and search a selected word in ebooks available from the Google eBookstore.

Derek Lie, a software engineer at Google, states:

When bookworms stumble across a word we don’t know, we face the classic dilemma of whether to put the book down to look up the word or forge ahead in ignorance to avoid interrupting the reading experience. Well, fret no more, readers, because today you can select words in Google eBooks and look up their definitions, translate them or search for them elsewhere in the book from within the Google eBooks Web Reader—without losing your page or even looking away.

By adding these features, Google certainly has made reading ebooks easier than ever before.

Here’s how you can use these features. Double-click or highlight the word that you want to look up and a pop-up menu appears with the following option: Define, Translate, Search Book, Search Google and Search Wikipedia.


Clicking Define makes the pop-up display a definition of the word via Google Dictionary.

Google eBook Define


You can translate a single word or several sentences to any of the languages Google offers.

Google eBook Translate


You can also search for the selected text within the ebook itself or across the web.

Google Displaying Promoted Tweets on Real-time Search

Twitter has been in the news for all the wrong reasons this past week. After the #dickbar fiasco, and the recent update of the Twitter API terms to discourage developers from building new Twitter clients, Twitter, under the leadership of Dick Costolo, seems to be charging ahead to turn profitable, at least enough to justify its sky-high valuation, which recently jumped to $7.7 billion in a private auction.

Unlike Facebook, which seems to have found a way to make money without irritating its users, every time Twitter makes a move towards monetizing the millions of tweets its users create, it somehow seems to go down badly with either the users or the developers.

In its latest move, Twitter seems to have convinced Google to display promoted tweets in real-time search results. It appears that if you search Google for any keyword for which a promoted tweet exists, the promoted tweet is displayed at the top of the results, highlighted as "Ads by Twitter". Twitter likely plans to raise the price of Promoted Tweets based on how much additional traffic they gain from real-time search users.

Bing, Google’s closest competitor in the search space, isn’t displaying the promoted tweets in the real-time search results though. We wonder if this is because Google has only the Twitter firehose as a source for real-time search results, while Bing has access to both Twitter and Facebook data streams. Maybe, that’s how Twitter coerced Google into display promoted tweets in results. At this point, it’s still speculation though.

Google Promoted Tweets Ads by Twitter

Google Rolls Out An Algorithmic Update To Remove Spam Sites From Search Results

In recent times, Google has been pretty busy trying to save its face amid raised eyebrows over search quality, spam sites infiltrating Google search results, and content aggregators dominating the SERPs. We told you how messy the entire Google experience can get; even Google News is not free from junk sites and content aggregators.

It’s not that Big G wasn’t listening to the appeals, they announced a big algorithmic change in January 2011 followed by announcing domain filtering in search results.

In fact, Google also released a Chrome extension which allows the user to block specific sites on Google search result pages.

Today, Google announced that its search quality team has rolled out a major algorithmic change to its ranking system. The change will impact almost 11.8% of search queries made on Google and is designed to reduce rankings for low-quality sites that don't add value to a subject or that scrape content from other sites.

At the same time, Google confirms that the new algorithm will make sure that readers can find more original content in search results. Sites that post original stories, write detailed reports, and publish thoughtful analysis or research are surely going to rank higher in the search results for a given query.

We can’t make a major improvement without affecting rankings for many sites. It has to be that some sites will go up and some will go down. Google depends on the high-quality content created by wonderful websites around the world, and we do have a responsibility to encourage a healthy web ecosystem. Therefore, it is important for high-quality sites to be rewarded, and that’s exactly what this change does.says Google in an official blog post.

Google says that the algorithmic update does not rely entirely on feedback received from the Personal Blocklist Chrome extension. However, the web spam team did consider the data received from the extension and compared it with the sites the new algorithm marked as spam. In fact, the algorithmic change addresses 84% of the spam sites that users had also blocked in their Personal Blocklist preferences.

What are those sites? Google hasn't disclosed the list.

The algorithmic change has launched in the U.S. only and will roll out to other locations over time.

Matt Cutts: New Algorithm Will Reduce Google Search Spam, Expect Better Results Soon

A few days ago, Google gave some hints on how its web spam team is changing the way web pages are ranked and trying to implement a redesigned "document-level classifier" in the Google search algorithm. This whole saga of Google search spam was ignited by Jeff Atwood's post at Coding Horror; if you missed the details, read that article and our response article on the issue.

Earlier today, Google engineer Matt Cutts announced that a new algorithmic change has launched which will rank content-scraping sites lower in search results. As a result, users are more likely to see the sites that wrote the original content higher in search results.

Matt said that this change was very much targeted and geared towards improving the overall experience of users.

Note that Matt says "lower in search results", not that scraped content will never appear on a search results page.

Let’s quickly take an example.

We announced an Apple iPad giveaway on 23rd December 2010, which received 254 responses and was quite successful.

Some websites thought it wise to scrape that content and add their own "mix" to the scraped material. They changed the title, modified the URL (thinking that adding the author's name might just work), and repeated the same sentences and keywords over and over in the meta description. The result is something like this:

The first search result points to this site (thanks Google for your algo).

My point is: some users will still click the second and third links and arrive at the scraped website, which has no original content. Those sites are not organizing the giveaway and can never give away an Apple iPad on behalf of Techie Buzz. (Those who need proof can read this research article on "Eye tracking in search results" (PDF), learn the facts, and then comment on this post.)

So what happens is that some users arrive at the scraped website, can't find what they are looking for, and quit. They will simply go elsewhere.

The result: we lose those prospective readers who are searching for our website, just because Google showed the scraped sites on search result pages. Granted, the number of such readers is small, but it's never zero.

Our Suggestion To Google Web Spam Team

Don’t show the scraped sites at all. Never. I mean “Why ?”

Google engineers can easily judge whether content is an exact photocopy of the source website, so there is no point in showing these sites on any search results page. Not even on the 99th page.

Learn from Bing

Surprisingly, the same search on Bing shows only one spam link (the third one).

This is just an example, and I am not saying that Bing is better than Google. But as you can see, Bing shows fewer scraped sites when you consider the long tail of search.

@Google: We are praying for better search results.

Google Talks About Search Quality Spam: Improved Results Underway

A few days back, we told you how Google search results were being overrun by content farms and "aggregators" who try to manipulate search results with black-hat techniques. Recently, there has been much speculation about the way Google search is populated with content aggregators and scraped websites.

Earlier today, Google Engineer Matt Cutts published a blog post saying “We are listening as more algos and spam detection techniques are on our radar”.

Here is what Matt said:

We have seen a slight uptick of spam in recent months, and while we’ve already made progress, we have new efforts underway to continue to improve our search quality.

Prior to this, Matt explained that Google's search quality is better than before in terms of relevance, freshness, and comprehensiveness. Web spam in English is less than half what it was five years ago, and spam in other languages is even lower than in English.

It’s interesting to note that Google rolled out a PR update earlier today and the blog post also discusses about a redesigned “document-level classifier” being introduced. This classifier will detect on page spammy content, junk and repetitive words that typically appear in blog  comment  forms, thus making it difficult for spam sites to rank ahead of the original source in SERP’s.

In addition, Google has also improved the way hacked sites are shown in search results. Matt says “We’ve also radically improved our ability to detect hacked sites, which were a major source of spam in 2010. And we’re evaluating multiple changes that should help drive spam levels even lower”.

The blog post didn’t gave any clue on why the scraped sites were able to manipulate search results. However, Matt makes it “crystal clear” that Google takes “action” against sites that violate their webmaster quality guidelines regardless of whether they are running Adsense ads on their sites.

Indirectly, Google clears the air: although it takes a 32% share of all AdSense money from publishers, it is strict about banning folks who try to degrade the overall search experience by running AdSense on scraped sites with no original content.

As webmasters, we can only hope. That’s all.

Oh, and if anybody from Google is reading this, we have a suggestion: showcase the spammers in a new "Spam bar" to the right of the search results. Way better!

Hey Google, Can We Have Some Meaningful Results? Please

You don’t use Google because it’s “Google”. You use “Google” because it offers relevant and meaningful results.

AdSense Scraper Sites Joke


Back in the year 2000, web search was largely dominated by short queries and the results were very "generalized"; users had to click through multiple links until they found the exact webpage they were looking for. Search volume was fairly low, so the need for a polished algorithm had not yet arisen.

Also Read: Hey Google, Here is How You Can Make Google News Spam Free

Then came Google, and web search took a significant turn in its "relevancy", "exactness", and "value". Google changed the way folks found content: people started writing blogs and creating information, and Google organized all of it in its "index".

The sad part – Newton’s third law of motion holds true universally.

The reaction: Google's way of ranking webpages was so influential that it gave birth to a giant search marketing industry. The welcome mat for SEO firms was laid out, which in turn gave rise to content farms, aggregators, and scraped websites. Their entire approach to producing content is outside-in: find the profitable phrases, judge the competition, gain a good number of links, and make your way up the SERPs.

In recent times, there has been much speculation about the way Google search is populated with content aggregators and scraped websites. TechCrunch, VentureBeat, and other tech blogs have published their observations earlier.

It’s not that only we are saying this. Jeff Atwood from Coding Horror recently nailed down their observations

In 2010, our mailboxes suddenly started overflowing with complaints from users that they were doing perfectly reasonable Google searches, and ending up on scraper sites that mirrored Stack Overflow content with added advertisements. Even worse, in some cases, the original Stack Overflow question was nowhere to be found in the search results!

Why Do Content Aggregators Beat The Source In Search Results?

Let’s take a look on the factors that influence the ranking of a webpage in search results.

Google says that there are more than 200 signals, but practice shows that the most important factors determining whether your page ranks well are:

1. The quality of the content.
2. The number and authority of links to your content.
3. The title of the page.

Let’s say you wrote an informative article on “iPhone cases” and published it on your blog. Since “iPhone cases” is a profitable phrase, this will alert the content aggregators, link farms and scraper sites running AdSense ads. These guys have set up Google Alerts and other ways to get instant notifications whenever a phrase they are targeting, gets found by Google.

They come to your blog, copy an excerpt from your article, and publish a new post on their "aggregation channels" with a link back to your post.

You think it’s cool? ” “Hey Mike, I just got a backlink from they have 60,000 RSS readers. I am famous !”

Yes, a lot of people will read your thoughts, but not on your blog. The majority will read them on the "aggregation channel".

The Result

Since a large portion of the blogosphere is paying attention to that Aggregation channel, who do you think is going to attract more links?

Social Media Influence Aggregators

As soon as the scraped post hits an "aggregation channel", a huge number of blogs start linking to them (not you). They have an enormous number of social media subscribers, Twitter followers, and newsletter readers, so getting six dozen backlinks to their own article is child's play.

Yes, you have written that 900-word article, but that alone doesn't qualify your post to be linked from other sites. Social media influences search rankings, and these content aggregators use their huge social media influence to dominate the SERPs.

Conrad Sam from Search Engine Land performed an internal study on profitable keywords to find out which search engine (Google or Bing) returns more relevant results. The SEL team followed this convention:

  • 5 points were awarded for a good quality result ranking first, 3 for second and 1 for third.
  • 2 bonus points were added for top 3 results being on a highly authoritative site.
  • 5 points were subtracted if the entire first page didn’t contain any good results.
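To make that convention concrete, here is a small Python sketch of the scoring scheme (the function and the sample page are our own illustrative assumptions, not SEL's actual code; the ambiguous "top 3 authority bonus" is applied per qualifying result here):

```python
# Illustrative sketch of the SEL scoring convention described above.
POSITION_POINTS = {1: 5, 2: 3, 3: 1}  # points for a good result at ranks 1-3

def score_first_page(results):
    """Score one first page of results. `results` is a list of dicts in
    rank order, each with 'good' (good-quality result?) and
    'authoritative' (highly authoritative site?) flags."""
    score = 0
    any_good = False
    for rank, r in enumerate(results, start=1):
        if r["good"]:
            any_good = True
            score += POSITION_POINTS.get(rank, 0)
        if rank <= 3 and r["authoritative"]:
            score += 2  # bonus for an authoritative site in the top three
    if not any_good:
        score -= 5  # penalty: no good result anywhere on the first page
    return score

# A page with a good, authoritative first result and a good second result:
page = [
    {"good": True, "authoritative": True},
    {"good": True, "authoritative": False},
    {"good": False, "authoritative": False},
]
print(score_first_page(page))  # 5 + 2 + 3 = 10
```

Summing such scores over many profitable keywords gives each engine a comparable relevance total.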

Search Engine Ranking Authority

It's no wonder that Google is losing its ground on relevancy and exactness. Richard MacManus from ReadWriteWeb offers an interesting comment on why blogs and search engines need to embrace "change":

I can only hope that Google and other search engines find better ways to surface quality content, for its own sake as well as ours. Because right now Google is being infiltrated on a vast scale by content farms.

That pretty much sums it up. Google has been promoting leechers more than the actual publishers, and we ourselves have been victims of this problem. You can read more about it in this post.

What is your opinion of the quality of search results in Google? Are you happy with them? Do you think that they need to improve? Are you a victim of RSS scrapers? Don’t forget to tell me your thoughts through your comments.

How to Perform a Location Specific Google Search

There are some situations in which you may want to perform a location-specific search on Google. For example, you are on vacation and want to search for restaurants that serve Thai food in that city. Or you may want to look for historic places in a specific country using Google search.

Google makes it very easy to perform a location specific search. For some search queries, Google will automatically detect your location and serve local search results, as shown in the following example:

In the above example, the search query returns location-specific results on a country-specific Google domain; the same query performed elsewhere does not detect the location.

Google has recently moved the location preferences to the left sidebar on Google search result pages, making it easier to control them. If you are using a country-specific Google search, your present location will be detected automatically; it can be changed by clicking the locations dropdown.

When you set the new location, the Google search result page will refresh and show local results from the same city (if available). This is very useful when you are performing special searches, some examples are shown below:

Knowing the sunrise time of a destination city

Finding restaurants in a specific city

Finding movie showtimes in a specific city

The Google Public Policy blog says:

We do our best to automatically detect the most useful location, but we don’t always get it right—so in some cases you’ll want to change the setting. At other times, you may want to change your location to explore information relevant to another area. Meanwhile, Google has become much better at presenting this locally relevant content—so it felt like the right time to make this setting easier to find.

JantaKhoj – India’s First “People Search Engine”

After the Google revolution, we have seen many desi search engines coming up from time to time. Well, here is a search engine with a difference. Called JantaKhoj, this website is a people search engine for India that manually collects data from different publicly available sources. This data is then compiled and put together on the web to make it easily searchable.

The main sources of data are records of various govt. departments (which the company chooses not to disclose). The website has data on over 50 million Indians, including addresses, phone numbers, and information about relatives and neighbors.

The company mainly provides background verification services such as matrimonial verification, domestic help verification, driver verification, tenant verification, and employee background verification. The services are charged between Rs. 1475 and Rs. 2225 per verification.

You can visit the website and search for your name too. In case you want your data to be removed from the website, you can see this page.

via Pi.In

Microsoft Bing Offers “integrated search and mapping” for Android. Is Awkward.

Well, well, well, isn't this awkward? The Microsoft Bing Community has launched an app for Android users on Verizon that offers "integrated search and mapping" for all the Android users who lack this super-special and super-rare service that Google's phones seem to lack (at least according to the folks at Bing, we are assuming).


It also has voice-search capabilities, which are hard to find on Google phones.

Evidently, they are trying their luck at this "gap" in the market. No pun intended.

Google Testing Real Time Search Results as You Type

Google has been experimenting with search results in the past, and they continue to run several more tests to enhance the search experience for the user. Earlier this year, Google launched real-time search, displaying real-time results alongside regular ones.

However, they are now taking real-time search to another level by displaying search results as you type letters into the search box. Results appear as soon as you type characters, and they continue to change as you type more characters or use the autocomplete feature.
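Conceptually, search-as-you-type just means re-running the query on every keystroke and caching what was already fetched. Here is a toy Python sketch of that idea; the tiny in-memory index and the class below are purely illustrative assumptions, and Google's real system is of course vastly more sophisticated:

```python
# Toy sketch of search-as-you-type: issue a fresh query on each keystroke,
# caching results so repeated prefixes are not fetched twice.

INDEX = {
    "real": ["Realism in art", "Real-time search"],
    "real-time": ["Real-time search", "Real-time analytics"],
}

class IncrementalSearch:
    def __init__(self, index):
        self.index = index
        self.cache = {}

    def on_keystroke(self, query):
        """Called with the full box contents after each keystroke;
        returns the current result list immediately."""
        q = query.lower()
        if q not in self.cache:
            self.cache[q] = self.index.get(q, [])  # "fetch" only once
        return self.cache[q]

searcher = IncrementalSearch(INDEX)
print(searcher.on_keystroke("real"))       # results update mid-typing
print(searcher.on_keystroke("real-time"))  # and change as you keep typing
```

A production system would also debounce rapid keystrokes and cancel in-flight requests, but the core user experience is the same: results refresh while the query is still being typed.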

The new real-time search results as you type were discovered by SEO consultant Rob Ousbey and posted on his blog. Here is a video of the feature in action.

(Source: TechCrunch)