All posts by Chinmoy Kanjilal

Chinmoy Kanjilal is a FOSS enthusiast and evangelist. He is passionate about Android. Security exploits turn him on and he loves to tinker with computer networks. He rants occasionally at Techarraz.com. You can connect with him on Twitter @ckandroid.

RapidShare Ordered to Filter User Uploads in Germany

After the takedown of the Megaupload empire, piracy watchdogs have become more active worldwide. Recently, a German court ordered RapidShare to filter all user uploads. It is a well-known fact that no file-sharing service is free of illegal content. The ruling confirms three verdicts from a lower court, all in cases brought by book publishers and music rights groups. Although RapidShare has made early efforts to comply with anti-piracy laws, those efforts in no way guarantee safe passage in cases like these.
The case leading up to this ruling was started by the music rights group GEMA, along with some book publishers, and it resulted in a ruling that none of the plaintiffs’ books or songs may appear on RapidShare. To comply, RapidShare has to deploy massive filtering and signature-checking mechanisms for files uploaded by its users. There are over 4,000 files in the filter list.
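
Hash-based blocklisting is the simplest way a file host could implement this kind of filter. The sketch below is a hypothetical Python illustration, not RapidShare’s actual mechanism; the blocklist file, paths and hashing scheme are all assumptions made for the example.

```python
import hashlib
from pathlib import Path

# Hypothetical illustration of a hash-based upload filter.
# "blocklist.txt" (one SHA-256 hex digest per line) stands in for the
# roughly 4,000-entry filter list mentioned in the ruling.
BLOCKLIST = {
    line.strip()
    for line in Path("blocklist.txt").read_text().splitlines()
    if line.strip()
}

def sha256_of(path: Path, chunk_size: int = 1 << 20) -> str:
    """Hash the file in chunks so large uploads never sit fully in memory."""
    digest = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()

def is_blocked(upload: Path) -> bool:
    return sha256_of(upload) in BLOCKLIST

if __name__ == "__main__":
    print(is_blocked(Path("incoming/example_upload.bin")))
```

A real-world filter would also have to catch re-encoded or slightly altered copies, which plain hash matching cannot do; that is what makes filtering at this scale genuinely hard.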

The chief executive of the German booksellers’ association, Alexander Skipis, celebrated the victory, saying,

Internet sites can no longer avoid their responsibilities, and profit from copyright infringing uploads of anonymous users.

However, this verdict runs contrary to the one passed by the European Court of Justice in a similar case last month, which rejected blanket filtering of copyrighted content in order to protect the privacy of users. RapidShare will almost certainly appeal this ruling, given that the precedent in that similar case is in its favor.

RapidShare has issued a press release on GEMA’s premature celebrations.

Baar/Switzerland, 16 March 2012. On 14 March 2012, the Hanseatic Higher Regional Court of Hamburg affirmed injunctive relief sought by GEMA and the publishers Campus and De Gruyter against RapidShare in three separate judgments. Both the German Publishers and Booksellers Association and GEMA issued “jubilant statements” immediately afterwards. In doing so they are conveniently ignoring the fact that it is considered unprofessional to evaluate a judgment before the written reasons for the judgment are on hand. Only then will it become apparent which party can truly celebrate a judgment as a success.

However, the Higher Regional court of Hamburg has issued a press release indicating a possible reason for the plaintiffs’ hectic actions: in the present cases the Court has amended its previous position, according to which RapidShare’s business model was not approved by the legal system. For the first time, the Court has also acknowledged that files only become “publicly accessible” when users publish the links in the Internet. In the past the previous diverging assessment had resulted in extensive obligations, already when uploading a file. Accordingly, the Court now sees the duties of RapidShare in particular in fighting the issue of piracy where illegal files are actually distributed, namely on the respective link pages. That is exactly what RapidShare has already been engaged in for years.

Alexandra Zwingli, CEO of RapidShare: “Of course, we will only make a detailed statement once we have the complete text. However, I am convinced that our tried-and-tested actions against copyright infringements are the right way to go, and I am pleased that the Higher Regional Court of Hamburg has confirmed as much in its press release. This demonstrates that we are not only technological, but also legal pioneers in cloud storage.”

Linux Kernel 3.3 Released with Merge of Kernel Code from Android

The Linux kernel recently reached version 3.3. The latest release includes multiple feature improvements and some major changes, the most awaited being the inclusion of Android kernel code. This merge marks the first step towards a unified Android and Linux kernel, making it possible for future devices to run a mix of both. While the Android merge made big news, there were other equally remarkable behind-the-scenes changes in this release.

As always, Linus Torvalds announced the availability of the latest kernel release on LKML.org. The release was pushed back by a week because of the surprise RC7 release. The RC7 was unplanned, and Linus explained it, saying,

I had been hoping that -rc6 would be the last -RC, but no such luck. Things just haven’t calmed down sufficiently for me to feel comfy doing a final 3.3 release without another -rc, so here we are: 3.3-rc7 is out.

With the latest version of the Linux kernel, things are finally looking good for the Btrfs file system. In addition, support for the Texas Instruments C6X architecture has been merged, and as always, there are new drivers and bug fixes.

The next version of the Linux kernel, 3.4, is expected to bring better power management for the Android code in the kernel, which is a mess right now. It should also improve Sandy Bridge performance.


SourceForge Takes Down the Dubious Anonymous OS Linux Distro

The popular online source code repository SourceForge has taken down a project that likes to call itself the Anonymous OS. The project was uploaded to SourceForge almost a week ago and grabbed nearly 5,000 downloads in that short span of time. While the official channels of Anonymous reject any link with the distro, a Tumblr page has been created to promote it, and it makes some bold claims.

The Anonymous OS distro is based on Ubuntu 11.10 and comes with the MATE desktop environment. It ships with well-known hacking and security tools such as the High Orbit Ion Cannon, Tor’s Hammer, John the Ripper, Wireshark, Slowloris and Vidalia. The total size of the distro is 1.5 GB, and it is still available on BitTorrent.

The official Twitter channel of Anonymous @AnonOps has rejected any affiliation of Anonymous with the Anonymous OS Linux build on SourceForge.

The Anon OS is fake it is wrapped in trojans. RT

— AnonOps (@anonops) March 15, 2012

With over 300,000 projects under its belt and millions of registered users, SourceForge has a responsibility towards both the user community and the developer community. This has led SourceForge to take down the project for now and pursue some answers from the project admin. As dubious as the name and nature of the project are, it also ridicules the ideology of Anonymous. The very fact that Anonymous is anonymous can so easily be used against the group, and every time that happens, it has to fulfill the social obligation of rejecting dubious affiliations like this one.


Did the Internet Really Kill Business Cards?

Very recently, I read an interesting article in the LA Times about the demise of business cards. However, the views expressed in the article were centered on the tech savvy, and the entire world is not going tech savvy anytime soon.

Business cards have served as an important contact artifact for decades. The exchange of a business card establishes a contact beyond momentary business. Back in the day, business cards were extremely popular and important, and a whole industry was built around printing them. Even today, companies like Google and Facebook run offers to create vanity cards from a Google search of your name and from your Facebook profile, respectively. The leading companies in the Internet space would definitely not do something that is out of line with the current times.

The exchange and utility of business cards need to be examined from a different perspective. In my opinion, the utility of a business card depends entirely on the kind of work you do. Not everything happens online (I wish the world were that way!), and most of the things that do not happen online have legacy systems. These legacy systems are deeply rooted in society and require face-to-face conversation. Take, for instance, the law business or the agent-driven model for insurance. Yes, lawyers and agents do have websites. However, businesses like these are built on trust and human contact. For legacy businesses like these, the exchange of a business card serves as a touch point for creating a contact.

However, we cannot overlook the ground reality. Business cards are exchanged in abundance, all right. However, they really do end up, as Matt Stevens of the LA Times puts it, “in a shoe box”. I have a number of business cards from various people and businesses, but whenever it comes to getting back in touch with them, I go and do a quick search online. Maybe I do not deal enough with legacy businesses. It really seems that business cards are facing a slow and tragic death, and the better availability of information online is responsible for it. Someday, the glorious days of business cards will be over, and we will only see them in movies, as objects of philosophical ridicule like in American Psycho. However, that day is not close. Not yet.

Twitter Gives Refuge to Posterous, Prevents an Untimely Death

There was a time when Tumblr and Posterous were locked in a tough battle for the short-form blogging space. That battle is long over. Tumblr has emerged as the undefeated champion, and Posterous has barely managed to stay afloat. In the last year, Posterous shifted its focus from competing with Tumblr to group conversation. The shift has worked well for Posterous and kept it in business.
The latest development, however, is great news for them: Twitter has just acquired Posterous. The Posterous blog announced the acquisition, saying,

The opportunities in front of Twitter are exciting, and we couldn’t be happier about bringing our team’s expertise to a product that reaches hundreds of millions of users around the globe. Plus, the people at Twitter are genuinely nice folks who share our vision for making sharing simpler.

Along with the technology, Twitter is also taking on the Posterous team, which includes product managers and engineers.

The Posterous team has promised that all the existing services at Posterous (like Posterous Spaces) will remain live and that the withdrawal of any service will be announced well in advance. Additionally, instructions for backing up data and other artifacts stored in Posterous accounts will be published soon. This is a bit confusing: on one hand, the Posterous team suggests it will keep all current offerings live; on the other, it is promising prior withdrawal notices.

It is unclear whether Posterous will meet its fate eventually, but the Posterous team will definitely be a valuable addition at Twitter.

Google Working on Kinect like Technology for Android

It goes without saying that Microsoft has done a wonderful job with the Kinect. Technologically, it has attracted a lot of developer love; from a business perspective, it has broken sales records worldwide; and for end users, it has delivered the next-generation input system for home-computing devices.


However, it is interesting to note that Microsoft is not the only one working on a gesture- and motion-controlled input system. Google filed a patent ten days ago for a similar touch- and gesture-controlled system, and it clearly states that the patent is for “portable electronic devices”, which for now can only be taken to mean cellphones. The abstract of the patent says,

Systems and methods are provided for controlling a portable electronic device. The device includes a built-in image-capturing device. The system detects, through the image-capturing device, motions of user finger over the image-capturing device. The system determines a pattern of the detected motions using timing information related to the detected motions, and controls the portable electronic device based on the determined pattern. The system also receives inputs from other input devices associated with the portable electronic device, and controls the device based on combination of the determined pattern and the received inputs.

The patent speaks extensively about detecting motion patterns, and it seems these motions will be used as gestures. There are references to two distinct motion types in the patent filing. The first is “touch motion”, the typical touch interface on a touchscreen device. The second, however, is “release motion”, the gesture-controlled input system Google has in mind. The filing also talks about capturing and reading hovering motion alongside other gesture mechanisms.
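
Purely to illustrate the general idea, and not the method described in the patent, here is a small Python sketch that classifies finger events over a camera from timing alone; the frame format and the thresholds are invented for this example.

```python
from dataclasses import dataclass
from typing import List

# Illustrative only: the frame representation and thresholds are
# invented for this example and are not taken from Google's filing.

@dataclass
class Frame:
    timestamp: float        # seconds
    finger_covering: bool   # True if a finger covers the camera lens

def detect_gestures(frames: List[Frame], tap_max_s: float = 0.25) -> List[str]:
    """Turn a stream of frames into gesture labels using timing information."""
    gestures = []
    press_start = None
    for frame in frames:
        if frame.finger_covering and press_start is None:
            press_start = frame.timestamp            # finger arrived
        elif not frame.finger_covering and press_start is not None:
            duration = frame.timestamp - press_start
            # A short covering reads like a tap; a longer covering that
            # ends with the finger lifting away reads like a "release".
            gestures.append("tap" if duration <= tap_max_s else "release")
            press_start = None
    return gestures

frames = [Frame(0.00, False), Frame(0.10, True), Frame(0.20, True),
          Frame(0.30, False), Frame(0.50, True), Frame(1.20, False)]
print(detect_gestures(frames))  # ['tap', 'release']
```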

More details on the patent filing can be found at this page.

(via: slashdot)

After Nevada, California Gives Google’s Autonomous Car a Thumbs Up

Google’s autonomous car took the tech world by storm when it was first spotted in October 2010. The state of Nevada approved Google’s self-driving cars last June, and recently, California joined the list of states willing to have self-driving cars on their roads, proposing rules for autonomous vehicles.

Nevada was the first state of choice for Google’s driverless cars because it has ample open space. However, the state of California will not be an easy drive for Google. Apparently, Google is not the only one trying out autonomous cars in California: researchers at Caltech and Stanford are also working on autonomous car technologies of their own, so the competition is tough. Although Google has been prompt enough to secure a patent for driverless cars, Volkswagen has already showcased an autonomous VW Passat with in-house self-driving technology. From a holistic perspective, multiple tech companies are working on autonomous driving, and Google and Volkswagen are leading the race.

For Google’s autonomous cars to work in California, some necessary traffic standards have to be set. Kurt Ernst at Motorauthority writes on this development, saying,

A bill proposed by California Senator Alex Padilla would set guidelines for the testing and operation of self-driving vehicles within the state. If passed, Padilla’s bill would require the California Highway Patrol to establish standards and performance requirements for autonomous vehicles operated on the state’s roads.

The idea of having an AI control cars is marvelous in itself, and it will only get better as more and more cars use the technology. At some point, it would be of significant business value for Google to have these cars communicate with each other, which would probably ease traffic problems in many cities.

SPDY Gains More Acceptance with Twitter and Firefox

SPDY is an open alternative to the HTTP protocol and is being seen as a potential replacement for it. Google has already implemented SPDY across its servers, and if you are on the Google Chrome browser, you are already using SPDY to access Google services. SPDY must be supported by both the browser and the web server for the speed improvements to kick in. The current version of HTTP, HTTP 1.1, is more than a decade old and was built for the requirements of the websites of that era. With SPDY, the web will get faster and will cater to the needs of modern web applications.

SPDY is at its core an application-layer protocol for transporting content over the web. It is designed specifically for minimizing latency through features such as multiplexed streams, request prioritization and HTTP header compression.

SPDY was announced two years ago and has been put forward as a basis for the HTTP 2.0 standard. With that acceptance, it is finally getting the attention it needs. Recently, FOSS enthusiast and Google employee Ilya Grigorik spotted that Twitter is using SPDY on its servers and provided ample proof of it.

It is interesting to note that Twitter is not the only one going for SPDY. Mozilla has started shipping SPDY support in Firefox, and it can be turned on in the Firefox nightlies of versions 11 and 12 via the “network.http.spdy.enabled” preference.
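
One hands-on way to see whether a server advertises SPDY is to offer it during the TLS handshake via NPN, the negotiation mechanism SPDY relies on. The following is a minimal Python sketch; it assumes Python 3.3 or later built against an OpenSSL with NPN support, and the host name is only an example.

```python
import socket
import ssl

HOST = "twitter.com"  # example host; any SPDY-capable server will do

# Offer SPDY and HTTP/1.1 via NPN and print whichever the server picks.
context = ssl.SSLContext(ssl.PROTOCOL_SSLv23)
context.set_npn_protocols(["spdy/3", "spdy/2", "http/1.1"])

with socket.create_connection((HOST, 443)) as raw_sock:
    with context.wrap_socket(raw_sock, server_hostname=HOST) as tls_sock:
        # Prints e.g. "spdy/2" if SPDY was negotiated, "http/1.1" if not,
        # or None if the negotiation did not take place.
        print(HOST, "negotiated:", tls_sock.selected_npn_protocol())
```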

Google is working to speed up multiple layers of the network protocol stack: SPDY speeds up the application layer, there are plans for tweaking TCP to speed up the transport and internet layers, and Google Fiber speeds up the underlying physical medium. Slowly but surely, the next internet upgrade is coming, and Google is clearly driving it.

Apple Using OpenStreetMap in iPhoto Without Due Credit

OpenStreetMap has been in the news more than once over the last few days. Foursquare, the de facto name in location-based services, switched from Google Maps to MapBox (which builds on OpenStreetMap) a week ago. In January this year, StreetEasy switched to OpenStreetMap after some dissatisfaction with Google Maps and its pricing. These were not the only ones opting for OpenStreetMap; many other services have adopted it for its excellent worldwide maps. Importantly, all of them made the transition enthusiastically and openly.

Recently, another major tech giant started using OpenStreetMap for one of its products. This time, however, the adoption was hushed, with no mention of or credit given to OpenStreetMap. This has slightly upset the OpenStreetMap community, which has written a sarcastic piece that in essence says “don’t mention it”, something Apple did quite literally.

OpenStreetMap writes on its blog, saying,

The desktop version of iPhoto, and indeed all of Apple’s iOS apps until now, use Google Maps. The new iPhoto for iOS, however, uses Apple’s own map tiles – made from OpenStreetMaps data (outside the US).

The OSM data that Apple is using is rather old (start of April 2010) so don’t expect to see your latest and greatest updates on there. It’s also missing the necessary credit to OpenStreetMap’s contributors; we look forward to working with Apple to get that on there.

On one hand, it is good to see more and more products using OpenStreetMap; on the other, it is disheartening, because Apple apparently intends to use OpenStreetMap only for as long as it is in transition from Google Maps to its own mapping service.

Slowloris DDoS Tools Used by Anonymous Infected with Zeus Trojan

The arrest of Megaupload’s Kim Dotcom has upset Anonymous greatly, and the group has been busy ever since the Megaupload takedown. In protest, Anonymous took down the US Department of Justice website, the Federal Bureau of Investigation website and a number of record label websites, in what was its single largest attack ever.
However, a lesser-known fact has surfaced recently. Symantec studied the DDoS tools used by Anonymous and found that the version of Slowloris being circulated was, in fact, infected with a Trojan itself.

Slowloris was written by Robert Hansen, who goes by the alias RSnake. It is extremely effective for denial-of-service attacks even from a low-bandwidth connection, because it ties up a web server by holding many connections open with partial HTTP requests.

After Megaupload was shut down, Anonymous circulated a list of tools to use for hacktivist operations. However, the list (seemingly unintentionally) linked to a tampered version of the Slowloris tool. On discovery of the infection, Symantec said,

Not only will supporters be breaking the law by participating in DoS attacks on Anonymous hacktivism targets, but may also be at risk of having their online banking and email credentials stolen.

Elaborate efforts have gone into shutting down Zeus, but it always comes back. Riding on the public rage against the Megaupload shutdown, the Zeus command-and-control servers gobbled up bank account information, email credentials, cookies and a lot more.

After the matter became public, the link to the infected Slowloris was removed, which has definitely alerted the victims to the situation. Over the next few days, we will see many fresh OS installs and plenty of bank and email credential changes. Will Anonymous take revenge? Will we get to see Zeus vs. Anonymous now?