Tag Archives: Hardware

Magnetic Bacteria To Build Computers of the Future

Doctors and aeronauts drawing inspiration from nature are an old story; the latest converts are hardcore microchip manufacturers. They are looking to magnetic bacteria to manufacture the smallest of chips. Magnet-making bacteria are the next big thing.

The Magnetic bacteria (Photo Courtesy: University of Leeds)

Microbe with a magnetic personality

A team comprising researchers from the University of Leeds and the Tokyo University of Agriculture and Technology has been studying microbes that ingest iron, becoming magnetic in the process.

The attractive personality belongs to Magnetospirillum magneticum, a predominantly anaerobic bacterium living at the bottoms of ponds and lakes. There, where oxygen is scarce, it derives its energy by ingesting iron and producing magnetite, the most magnetic naturally occurring mineral. Furthermore, the bacteria follow the Earth’s magnetic field lines, just like a compass.

The Magnetospirillum magneticum

The idea is this – you can steer the bacteria by changing the local magnetic field lines. Using this, you can grow magnets of very specific shapes by arranging the magnetic bacteria into those shapes.
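The steering idea can be sketched as a toy physics model (this is an illustration, not the researchers' method): a magnetotactic bacterium behaves roughly like a compass needle, feeling a torque proportional to the sine of the angle between its heading and the applied field, so it gradually relaxes toward the field direction.

```python
import math

# Toy model (not from the study): a bacterium modeled as a compass needle
# whose heading relaxes toward the applied field direction under a torque
# proportional to sin(angle difference). All angles are in radians.
def align_to_field(heading, field_angle, rate=0.5, dt=0.1, steps=200):
    """Relax a bacterium's heading toward the applied field angle."""
    for _ in range(steps):
        # Wrap the angular difference into [-pi, pi) before applying torque
        diff = (field_angle - heading + math.pi) % (2 * math.pi) - math.pi
        heading += rate * math.sin(diff) * dt
    return heading

# A cell initially pointing east (0 rad) aligns with a field pointing north
final = align_to_field(0.0, math.pi / 2)
```

Redirecting the field mid-run would redirect the cell the same way, which is the handle one would exploit to arrange the bacteria into a chosen shape.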

All for Moore’s Law

In a bid to keep Moore’s law alive in microchips, scientists are thinking of building on this simple idea to create hard drives and even wires using these critters. Despite advancing at breakneck speed, today’s nanotechnology is still struggling to keep up with the pace that Moore’s law demands.

Lead researcher Dr. Sarah Staniland of Leeds University says:

We are quickly reaching the limits of traditional electronic manufacturing as computer components get smaller. The machines we have traditionally used to build them with are clumsy at such small scales.

The pioneering study appears in the nanotechnology journal Small.

PS3 Firmware Update 4.00 Removes Support for the PSP. Oh Dear.

I have this theory that in the Tokyo headquarters of Sony Computer Entertainment, a Gríma Wormtongue-like figure ominously whispers into the ears of Kaz Hirai, who reclines like King Théoden. (If you do not understand what I am talking about, then you really must read The Lord of the Rings. Please do so immediately!) Otherwise there is no explanation as to why some of the best features of the PlayStation 3 (PS3) console have been serially diluted over the years. The original PS3 was a brilliant console, with backwards compatibility and hardware emulation for the PS2 and older consoles, the ability to install a custom OS (Linux, to be precise), and support for connecting a PlayStation Portable (PSP) and transferring compatible games to it. In the next sub-iteration of the PS3, the PS3 Slim, the hardware emulation was replaced by software emulation, and eventually a firmware update permanently removed the ability to install Linux on the system.

PS3

Now, it seems that firmware update 4.00 adds support for the PS Vita (due to be released on December 17th in Japan and parts of Asia) and, surprisingly, removes support for the legacy PSP.

Shacknews reports that the glitch (we are giving Sony the benefit of the doubt here) removes the Copy function for every compatible game from the XrossMediaBar (XMB, the UI of the PS3 and the PSP) on the PS3’s screen, effectively disallowing the copying of a legally downloaded game onto a legally allowed secondary platform.

We await more news on this, but I really hope for Sony’s sake that it’s a glitch and not a feature.

Microsoft Touch Mouse: How Microsoft Took a Concept from Research to Product

It is fascinating to learn that it took a collection of prototypes, collaboration between transatlantic teams, and a lot of user testing to bring the Microsoft Touch Mouse to market. The Touch Mouse project was unusual compared with other hardware-development projects because it combined multiple disciplines in a tightly integrated way.

Microsoft Touch Mouse

The joint research effort between Microsoft Research Redmond, Microsoft Research Cambridge, and Microsoft’s Applied Sciences Group introduced five different multi-touch mouse prototypes. The research paper, titled Mouse 2.0: Multi-touch Meets the Mouse, presents novel input devices that combine the standard capabilities of a computer mouse with multi-touch sensing. Each prototype explored a different touch-sensing strategy, which in turn shaped the mouse form factors and their interaction possibilities.

Humans are naturally dexterous and use their fingers and thumbs to perform a variety of complex interactions with high precision. The traditional computer mouse design, however, makes little use of this dexterity, reducing our hands to a single cursor on the screen. Our fingers are often relegated to relatively simple actions such as clicking the mouse buttons or rolling the mouse wheel. With the emergence of multi-touch, we now have the opportunity to manipulate digital content with increased dexterity.


FTIR Mouse applies the principle of frustrated total internal reflection to illuminate a user’s fingers, and uses a camera to track multiple points of touch on its curved translucent surface.


Orb Mouse is equipped with an internal camera and a source of diffuse IR illumination, allowing it to track the user’s hand on its hemispherical surface.


Cap Mouse (short for capacitive mouse) employs a matrix of capacitive touch-sensing electrodes to track the position of the user’s fingertips over its surface.


Side Mouse rests under the palm of the hand, allowing the fingers to touch the table surface directly in front of the device; these touches are sensed using an internal camera and an IR laser.


Arty Mouse is equipped with three high-resolution optical mouse sensors: one in the base, which rests under the user’s palm, and two under the articulated extensions that follow the movements of the index finger and thumb.
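As a rough illustration of the sensing side, here is a hypothetical sketch (not Microsoft's actual algorithm) of how a matrix of capacitance readings, like Cap Mouse's electrode grid, can be reduced to a fingertip coordinate using a simple weighted centroid:

```python
# Hypothetical sketch: estimate a fingertip position from a 2D grid of
# capacitance readings by taking the weighted centroid of all cells whose
# reading exceeds a noise threshold. Units and values are made up.
def touch_centroid(readings, threshold=10):
    """Return (row, col) centroid of active cells, or None if no touch."""
    total = weighted_row = weighted_col = 0.0
    for r, row in enumerate(readings):
        for c, value in enumerate(row):
            if value > threshold:
                total += value
                weighted_row += r * value
                weighted_col += c * value
    if total == 0:
        return None  # nothing above the noise floor
    return (weighted_row / total, weighted_col / total)

# A single touch concentrated around cell (1, 2) on a 3x4 electrode grid
grid = [
    [0,  0,  20,  0],
    [0, 30, 100, 30],
    [0,  0,  20,  0],
]
pos = touch_centroid(grid)  # lands on (1.0, 2.0)
```

Because the centroid is weighted by signal strength, the estimated position can sit between electrodes, giving finer resolution than the electrode pitch itself.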

 

The research team intended to refine their prototypes, both ergonomically and in terms of sensing capabilities. Microsoft Hardware decided to get behind the research, and a team was formed to bring a multi-touch mouse to market. The close collaboration between the hardware team and Microsoft Research in Cambridge and Redmond went beyond mere technology transfer.

The design of the final form factor required sculpting and testing hundreds of models. The team also examined user interactions and evaluated which kinds of gestures made sense. The multi-touch gestures are designed to enhance your experience with Windows 7 and are optimized for window management. A further design challenge was ensuring that users could operate the device with classic point-and-click interactions as well as the newly developed set of multi-touch gestures.

The delightful, fluid desktop experience of the Microsoft Touch Mouse is a testimony to the value of quality research in exploring new possibilities.

Microsoft Hoping To Reduce Xbox Production Costs With New Motherboards

Until a few months ago I used to look through new job openings at Microsoft to find information about their upcoming products; now I am searching for a job. Going through the new positions, I came across one in which Microsoft talks about updates to the Xbox.

Little is known about what Microsoft is planning. The Xbox 360 has so far received incremental hardware upgrades, and many are wondering how Microsoft will push the boundaries of innovation in entertainment with a new Xbox – the Xbox 720. A recent job opening describes Microsoft’s plans to reduce the production costs of the Xbox. The Xbox team hopes to do this by designing a new motherboard for the console.

For those keeping tabs, with the Xbox 360 Slim (the 2010 revision of the Xbox 360) Microsoft introduced a new motherboard design codenamed Vejle. Vejle’s single-chip design (CPU+GPU+memory) allowed engineers to improve the console’s cooling, and the on-chip memory let users access more features without an HDD or memory card. Microsoft is looking to improve this design. The job description explains:

 The team is responsible for the design and aggressive cost reduction of the console throughout the life of the product as well as expanding the market for the console in derivative products.

The responsibilities of this position are focused on specifying, designing (schematic capture, PCB layout, BOM, cost analysis), implementing and verifying subsystems on the Xbox motherboard. This includes development of subsystem requirements by working with team members evaluating different solution options for functionality, cost and risk, developing the solution, implementing it, verifying it and supporting it in production. […]

Computer hardware design has seen radical changes, and keeping systems cool, reducing their size, and improving performance all at once is now a serious engineering challenge. It will be interesting to see how Microsoft’s engineers achieve this.

Canonical Releases A Component Catalog To Help You Build A Computer Which Just Works With Ubuntu

When we build a PC in which Linux is to be the main OS, we generally have to watch out for hardware that does not work well with Linux. This means searching Google and digging through forums, blog posts, etc.

Knowing that this is generally a difficult task for most users, Canonical started the Ubuntu Certification Program for hardware last year. Originally the program certified only complete machines – laptops, desktops and servers – as Ubuntu-ready, and did not cover the components that go into them.

Today, though, Canonical has decided to extend the program to components as well and has released a database of over 1,300 components – from processors to keyboards – which just work with Ubuntu. Canonical compiled this list using its list of Ubuntu-certified laptops, desktops and servers, and its experience working on enterprise servers.

This is indeed a very useful list, not only for Ubuntu users but for Linux enthusiasts everywhere. We now have a centralized database with which we can make sure that our next machine will work well with Ubuntu (or Linux in general).

This is what Victor Palau, Platform Services Manager at Canonical, said:

There has not been a comprehensive, up-to-date freely available catalog like this for a long time. By making this open and easily searchable we want to speed the component selection for Ubuntu machines, and allow us and our partner manufacturers to focus on the value-added user experience.

While the list is great from the normal user’s point of view, some proponents of free software may complain that Canonical has made no distinction between hardware with open drivers and hardware for which only proprietary drivers are available. The recent decision by the Debian community to remove proprietary firmware from the kernel of Debian 6.0 “Squeeze” indicates that people still care about this issue.

Personally, I too think it would have been better to separate the hardware with proprietary drivers from that with open drivers. But Canonical has done a good job with this database, and we should not let that one omission spoil the mood.

You can view the components catalog here.

In case you want to see the list of certified machines, you can find it here.

Nvidia’s Tegra 3, With a Quad-Core GPU, Might Be Announced at the MWC

At the recently concluded CES 2011, Nvidia’s Tegra 2 wowed us with its impressive computing capabilities. Unsurprisingly, Nvidia is already working on its next chip for handheld devices – Tegra 3. Nvidia’s general manager of mobile business, Michael Rayfield, hinted to Hexus that Tegra 3 might be announced at the Mobile World Congress, which is scheduled for next month.

Nvidia-Tegra

We have already seen the huge performance boost that Tegra 2 delivers. Although the specs for the upcoming Tegra 3 chips are not available, expect similar leaps in performance. The rumors going around suggest that it will have a quad-core GPU and a dual-core CPU (ARM Cortex-A9).

Modern chip fabrication processes have reached a level of sophistication that makes it possible to pack a tremendous amount of processing power onto even the tiniest of boards. Tegra 3 will probably use a 28nm fabrication process. However, the biggest bottleneck for mobile computing is likely to be power storage. Batteries have simply not kept up with the rapid improvements in other hardware components. The challenge for manufacturers is to ramp up performance without requiring additional power. Ultimately, this is the factor that will determine how quickly quad-core and octa-core chips appear in our mobile phones and tablets.
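The power-versus-runtime trade-off is easy to see with back-of-the-envelope numbers. The battery and chip figures below are illustrative assumptions, not specs from the article:

```python
# Back-of-the-envelope sketch: battery capacity divided by chip power gives
# hours of sustained compute. All numbers here are illustrative guesses.
def runtime_hours(battery_wh, soc_watts):
    """Hours a battery of battery_wh watt-hours can feed a chip drawing soc_watts."""
    return battery_wh / soc_watts

single = runtime_hours(5.0, 1.0)  # a ~5 Wh phone battery driving a 1 W chip
double = runtime_hours(5.0, 2.0)  # doubling chip power halves the runtime
```

This is why adding cores only pays off if performance per watt improves at the same time.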

Rogers Releases Palm Pre 2 for $99

The Palm Pre 2 is now available on Rogers Wireless in Canada, for the low price of $99 on a 3-year contract. The Pre 2 is the newest device from HP and Palm running webOS. This iteration of the hardware comes with a 1GHz processor, 512MB of RAM and the usual connectivity options, including GPS, WiFi and Bluetooth. The Pre 2 is the successor to the Pre Plus and, before that, the original Pre. Rogers adds a heavy subsidy when going from a 2-year up to a 3-year term. The off-contract price is $449.99, $399 for a 1-year term and $349 for a 2-year term.

The Pre 2 is also available online, unlocked and directly from Palm for $449 USD.

AMD India Develops Ontario – New Fusion Chip for Netbooks and Tablets

The Indian wing of AMD (Advanced Micro Devices) has developed a sophisticated fusion chip called Ontario, which is claimed to be three times more powerful and more economical than its competition. A fusion chip is one that includes both a CPU (Central Processing Unit) and a GPU (Graphics Processing Unit).

These super-small Ontario chips are targeted at tablets and netbooks, and are the first in a series of fusion chips planned by AMD. The chipmaker claims that Ontario will deliver 90% of the performance of today’s chips in less than half the silicon area, which should theoretically lead to less heat generation. The Bobcat-based Ontario chips are extremely power efficient, with a rating of 9W; each core is capable of running on less than one watt of power.

AMD-Fusion-Ontario
AMD Ontario (Img Source: The Hindu)

Ontario was designed by an 86-member team in India that worked on it for two years. Ontario-powered devices, including a Windows 7 tablet from Acer, will begin appearing in 2011.

Adios ATI, It Will Be All AMD from Here On

AMD has announced that it will retire the ATI brand later this year. Four years ago, AMD shocked everyone by acquiring ATI for $5.4 billion. The merger almost cost AMD dearly. The massive expense was financed through $2.5 billion in loans and 56 million shares of AMD stock. Most pundits believed that AMD had blundered by coughing up so much for ATI while it was suffering major losses in the chip business. However, the merger also brought tangible benefits to the chipmaker. ATI’s strong performance in the recent past has helped AMD boost its brand name, and now AMD is preparing to ship Fusion APUs (CPU+GPU on a single die).

AMD-ATI-Branding

Nevertheless, AMD obviously believes that it stands to gain significantly by retiring the ATI brand name. Come 2011, all products from the Sunnyvale based semiconductor company will bear only AMD branding. According to AMD, an internal survey revealed the following key points:

  1. AMD brand preference triples when the person surveyed is aware of the ATI-AMD merger.
  2. The AMD brand is viewed as stronger than ATI when compared to graphics competitors (presumably NVIDIA).
  3. The Radeon and FirePro brands themselves (without ATI attached to them) enjoy very strong recognition as is.

The last point suggests that ATI products are well recognized even without the ATI branding. Even more crucial, however, is the revelation that most consumers consider AMD a more significant competitor to Nvidia than ATI. The GPU market has long been dominated by ATI and Nvidia.

Much like 3dfx Voodoo, ATI will always be remembered by geeks and hardware aficionados around the world. Hardcore ATI fans might be disappointed with AMD’s announcement, but at least ATI is going out with respect.

World’s Fastest CPU, IBM z196, Clocks 5.2 GHz

I purchased my own computer back in 2000-something, and that gorgeous beauty had an Intel Pentium III 1.1 GHz processor. Since then, I have gone through several computers, each with more speed and processing power than the last.

Using that same 1.1 GHz processor today would definitely make my work feel like a chore. However, thanks to hardware manufacturers like Intel and AMD, I am assured of a better experience every time I use a PC. But here is something that will eclipse all the CPUs currently available in the market: an IBM processor named z196, which clocks in at 5.2 GHz.

The IBM z196 processor can execute 50 billion instructions per second, which makes the current generation of desktop processors look like paupers. The CPU packs 1.4 billion transistors, along with a 64KB L1 instruction cache, a 128KB L1 data cache and a 1.5MB L2 cache.
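For scale, the quoted figures can be sanity-checked with a little arithmetic. The core count used below is an assumption (the z196 is widely described as a quad-core chip, but the figure is not given above):

```python
# Sanity check of the quoted numbers. The core count is an assumption
# (four cores per chip), not a figure stated in the article.
clock_hz = 5.2e9               # 5.2 GHz clock
instructions_per_sec = 50e9    # 50 billion instructions per second
cores = 4                      # assumed quad-core chip

ipc_per_core = instructions_per_sec / (cores * clock_hz)
# works out to roughly 2.4 instructions completed per cycle on each core
```

A sustained rate of more than two instructions per cycle per core is what a deep, aggressively superscalar pipeline buys you, on top of the raw 5.2 GHz clock.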

However, don’t expect this CPU to make it to desktops anytime soon; it costs several hundred thousand dollars and was designed for IBM’s mainframes.

(Source: Tom’s Hardware)