Tag Archives: Supercomputer

IBM Supercomputer Starts Swearing

The IBM-developed artificial intelligence system named Watson is capable of answering questions posed in natural language. It was developed specifically to answer questions on the TV quiz show Jeopardy! In 2011, the supercomputer competed on the show against two former champions and won the $1 million prize.

Eric Brown, a research scientist with IBM, is responsible for the creation and tutoring of Watson. Its purpose, as an artificial intelligence, was to pass the Turing test: if Watson were to chat with a human and that person could not tell whether the correspondent was man or machine, Watson would pass.

Watson caused havoc by making obscene outbursts after memorizing the contents of the Urban Dictionary. The website catalogs slang that is part of informal conversation today but is not considered appropriate for polite company. So great was the damage that Watson’s programmers had to wipe the offending material from its memory after they could find no way of stopping the supercomputer from swearing profusely.

As Watson formulates replies by drawing on everything stored across its servers, it put some of these words together and started responding to many of the questions posed to it with “bulls***”.

The original article reported it as follows:

Watson couldn’t distinguish between polite language and profanity – which the Urban Dictionary is full of. Watson picked up some bad habits from reading Wikipedia as well. In tests it even used the word “bulls***” in an answer to a researcher’s query.

Ultimately, Brown’s 35-person team developed a filter to keep Watson from swearing and scraped the Urban Dictionary from its memory. But the trial proves just how thorny it will be to get artificial intelligence to communicate naturally. Brown is now training Watson as a diagnostic tool for hospitals. No knowledge of OMG required.
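The article does not say how Brown’s team actually built that filter. Purely as a toy illustration of the idea, a crude word-list filter might look something like the sketch below; the blocklist, function name and behaviour are all invented here and have nothing to do with IBM’s real system.

```python
import re

# Toy profanity filter, purely illustrative -- NOT IBM's implementation.
# BLOCKLIST is a hypothetical set of terms the system should never emit.
BLOCKLIST = {"bullshit", "omg"}

def filter_answer(answer: str, replacement: str = "[censored]") -> str:
    """Mask any blocklisted word in a candidate answer before it is output."""
    def mask(match):
        word = match.group(0)
        return replacement if word.lower() in BLOCKLIST else word
    return re.sub(r"[A-Za-z']+", mask, answer)

print(filter_answer("That question is bullshit"))  # -> "That question is [censored]"
```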

For the IBM programmers working on Watson, that was certainly a very interesting day at work.

Presenting Deus: A Full Blown Simulation of the Entire Universe

Scientists have been able to recreate the entire Universe inside a computer for the first time ever. A simulation running on a supercomputer, tracking a mind-boggling 550 billion particles as they evolve, has been able to recreate the structure of the Universe right from the Big Bang to the present day.

Simulating the Standard Model of Cosmology

This is the first in a series of three simulations to be carried out on GENCI’s new supercomputer CURIE at CEA’s TGCC (Très Grand Centre de Calcul), performed by researchers from the Laboratoire Univers et Théories (LUTH). It takes into account the standard model of cosmology with the cosmological constant built in. Successive runs will improve upon this result with more data, especially about the distribution of dark matter and dark energy. The project, called DEUS: Full Universe Run, will seek answers to cosmological questions in a way similar to how the LHC goes about getting its answers.

Comparing Deus' size to previous simulations. Bottom line: Deus is HUGE!

Why simulation?

The physics at the LHC is massively complicated by the presence of so many particles and so many possible end states for a given collision. It is impossible to solve analytically for the end state, so scientists run simulations before they begin an experiment. These simulations reveal the most likely outcome of a certain collision given certain parameters and bounds on certain numbers. The actual run then either confirms the simulation or rules it out. This is far more efficient than reconstructing the interaction by working backwards from the end states, which is the other alternative.

The Deus simulation does something similar. The researchers let the 550 billion particles evolve and see what the end state is. This has enabled them to count the number of galaxy clusters more massive than a hundred thousand billion solar masses (that’s VERY heavy, by the way), and the number comes out to be 144 million. The first galaxy cluster formed about 2 billion years after the Big Bang, according to the simulation. It also shows the most massive galaxy cluster – with a mass of 15 quadrillion (or 15 thousand trillion) solar masses!
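To get a feel for what that counting step looks like, here is a minimal sketch run on made-up halo masses (random numbers, not the actual Deus output): it counts the clusters above the 10^14-solar-mass threshold and picks out the heaviest one, which is exactly the kind of census quoted above.

```python
import numpy as np

# Toy cluster census on FAKE data. In the real analysis the masses come
# from halos identified among the 550 billion simulated particles.
rng = np.random.default_rng(42)
halo_masses = rng.lognormal(mean=30.0, sigma=1.5, size=1_000_000)  # solar masses

THRESHOLD = 1e14  # "a hundred thousand billion" solar masses
massive = halo_masses[halo_masses > THRESHOLD]

print(f"clusters above 1e14 solar masses: {massive.size}")
print(f"most massive cluster: {halo_masses.max():.2e} solar masses")
```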

 

Relics of the Early Universe

The simulation also revealed fingerprints of the inflationary era in the form of fluctuations in the Cosmic Microwave Background Radiation. If the Big Bang and inflation are true, then there must be leftover radiation, which is constantly weakening. This radiation permeates all of space, hence the name Cosmic Microwave Background (CMB). It is believed that tiny quantum fluctuations, growing under the effect of gravity, gave rise to the galaxies and clusters we see today. The CMB has been studied thoroughly by the WMAP mission, and its fluctuations also showed up in the simulation.

Where are we? That dot - that single dot - is the entire Milky Way!

The simulation also confirmed the presence of dark matter and gave a hint of how it might be distributed throughout the Universe. Also present in this primordial virtual cosmic soup are the Baryon Acoustic Oscillations, or BAO. These might hold the answer to the long-standing problem of baryon asymmetry – why matter outnumbers anti-matter in the Universe, when the two should have been produced in equal amounts in the early Universe.

Computing power – the sky is the limit

CURIE is one of the largest supercomputing facilities in the world. The whole simulation took a few years to put together. The project is expected to use more than 30 million hours (roughly 3,500 years) of computing time across all of CURIE’s CPUs. The amount of data processed comes out to about 150 PB (petabytes), equivalent to all the data on 30 million DVDs. State-of-the-art compression has allowed the researchers to reduce this entire jungle to 1 PB.
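Those figures are easy to sanity-check. The back-of-the-envelope script below (which assumes a standard 4.7 GB single-layer DVD) reproduces the rough conversions quoted above.

```python
# Back-of-the-envelope checks for the numbers quoted above.
cpu_hours = 30e6
print(f"{cpu_hours / (24 * 365):.0f} years of computing time")  # ~3,400, i.e. roughly 3,500

data_pb = 150
dvd_gb = 4.7                   # capacity of a single-layer DVD
dvds = data_pb * 1e6 / dvd_gb  # 1 PB = 1e6 GB
print(f"about {dvds / 1e6:.0f} million DVDs")                   # ~32 million

print(f"compression ratio: {data_pb / 1:.0f}:1")                # 150 PB down to 1 PB
```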

Future

Two more simulations are to follow! They will test rival cosmological models. The simulations are also expected to reveal structures we have not been familiar with before. This will provide scientists with search parameters for current projects like PLANCK and future ones like EUCLID.

More info at this CNRS press conference: http://www2.cnrs.fr/en/2013.htm

Japanese Supercomputer Demolishes Own Speed Record

A Japanese supercomputer has just smashed its own processing-speed record, becoming (and remaining) the world’s fastest supercomputer. Japan’s ‘K computer’ held the record of 8 quadrillion calculations per second (a quadrillion is a thousand trillion, or a petaflop, if you prefer). It has a brain consisting of 88,000 processor chips and now clocks in at a mind-boggling 10 quadrillion calculations a second, beating its own 8 quadrillion record with a stunning 93% computing efficiency. An ordinary desktop, with two or four processor cores, clocks in at about a gigaflop (a thousandth of a trillion operations per second), a million times less than a quadrillion.

The K-computer. The image was released by Riken on Wednesday. (Courtesy: Riken)

The K-Computer

The supercomputer was designed by Fujitsu, in collaboration with the supercomputer R&D wing of Riken, specifically to achieve this landmark. The name derives from the Japanese word ‘kei’, which means 10 quadrillion. The choice of name looks spot-on now that the computer has achieved the 10-petaflop mark.

The Benchmark

The previous record of 8 petaflops was also held by the K computer, as mentioned. The score and rating are given by the LINPACK benchmark. It measures speed by giving the machine an NxN system of linear equations of the general form Ax = b to solve. The standard procedure is Gaussian elimination with partial pivoting. The system’s floating-point computing power can then be judged and measured in megaflops.
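As a rough illustration of what the benchmark measures, the sketch below times a dense solve in NumPy (whose LAPACK backend uses LU factorisation with partial pivoting, the scheme the benchmark prescribes) and converts the elapsed time into a FLOPS figure using LINPACK’s standard operation count. It is a toy, not the official HPL code, and the problem size is tiny compared to a real supercomputer run.

```python
import time
import numpy as np

n = 2000  # problem size; real LINPACK runs on supercomputers use far larger N
rng = np.random.default_rng(0)
A = rng.standard_normal((n, n))
b = rng.standard_normal(n)

start = time.perf_counter()
x = np.linalg.solve(A, b)  # LU factorisation with partial pivoting under the hood
elapsed = time.perf_counter() - start

# Standard LINPACK operation count for solving an n x n dense system.
flops = (2.0 / 3.0) * n**3 + 2.0 * n**2
print(f"{flops / elapsed / 1e9:.2f} GFLOPS")

assert np.allclose(A @ x, b)  # sanity check: x really solves the system
```

On an ordinary desktop this lands in the gigaflops range, which is what makes the K computer’s 10 petaflops so striking.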

Great Achievement

This is a momentous achievement for the island country, especially given what it has been through over the past year. Ryoji Noyori, the president of Riken, said:

The K Computer is a key national technology that will help lay the foundation for Japan’s further progress

Ten petaflops is mindblowing! It will be interesting to see the giant solving real problems in the sciences in the near future.

MIT And Harvard Students Go Up Against IBM’s Watson

After trouncing people at Jeopardy!, IBM’s supercomputer Watson is all set to take on bright students from MIT and Harvard as it faces off against them in a trivia match. The competition will be held at Harvard Business School’s Burden Auditorium tomorrow, the 31st of October, 2011. The competition is called the IBM Watson Challenge.

Watson: The Genius Giant

The competition will be preceded by a symposium at the MIT Media Lab about Watson’s creation and the future of technology, titled “The Race Against the Machine: The Future of Tech”. The challenge is aimed at showing how technology can change, and is indeed changing, business perspectives, said Professor Erik Brynjolfsson of MIT. The symposium will also feature a keynote address by David Ferrucci, the father of Watson.

The competition and a bit of rivalry

Coming to the actual competition, there will be three teams – one from MIT Sloan, one from Harvard Business School, and the third being Watson – fighting it out in the middle. The MIT Sloan team was chosen via playoffs, and the Harvard students were picked by two Jeopardy! alumni. Each team has three students. As for Watson, it won’t be the full-fledged version but a toned-down one; IBM assures that it will be just as competitive.

Watson aside, the rivalry between MIT and Harvard will definitely be there. The friendly tension is evident from Brynjolfsson’s appeal for a large number of MIT students to attend the event, so as not to be outnumbered by the Harvard hosts. As many as 200 MIT students are expected to attend.

The MIT Center for Digital Business is sponsoring the event. We sign off by saying what Brynjolfsson said:

The technology is changing the world.

How true!

Source: MIT’s The Tech  http://tech.mit.edu/V131/N48/watson.html

‘Jeopardy’ Champion Supercomputer Watson Set To Tackle Medical Problems

After Jeopardy, IBM’s Watson is ready to face challenges from the real world. It is all set to fulfil its intended destiny.

Watson clobbering other participants on Jeopardy

No More Playing Games

Watson is best known for walloping human Jeopardy! champions at their own game. However, this is just the tip of the iceberg as far as Watson is concerned. Originally conceived to serve as a fool-proof database for medicine and healthcare, Watson has been picked up by WellPoint Inc. to serve as an invaluable source of patient information, medical histories, known treatments, operational procedures and much more. Watson will be able to delve into its deep memory bank, fed with a huge library of medical books and journals, and come up with a solution in seconds.

The computing behemoth

Though it is obvious what a great help this can be, Dr. Sam Nussbaum, WellPoint’s chief medical officer, puts it nicely:

Imagine having the ability within three seconds to look through all of that information, to have it be up to date, scientifically presented to you, and based on that patient’s medical needs at the moment you’re caring for that patient.

No, Watson is not here to replace human doctors, but it can support them really well. Watson’s great computing speed will also be of great help in cancer research and oncological procedures.

The amount being paid for Watson’s services has not been disclosed. Watson’s $1 million Jeopardy! prize was given to charity.

This is just the beginning of the road for Watson, says IBM. Watson will eventually be able to take up challenges from other sectors, like finance, public works and security.

Milky Way Galaxy Simulated For The First Time After Nine Months on Supercomputers

It took nine months to reproduce our galaxy, but it was well worth the effort! Researchers at the University of California, Santa Cruz and the Institute for Theoretical Physics in Zurich created this stunning reconstruction of the Milky Way galaxy by running a simulation on a supercomputer for nine long months. The result is not only beautiful, but also significant in scientific terms. And just for the record, this is the first time such a simulation has been achieved!

The Eris Simulation (Courtesy: UC Santa Barbara)

 

There have been many previous attempts. Every one of them failed, usually ending up with a huge central bulge. Javiera Guedes, the first author of the paper on the simulation, puts it best:

Previous efforts to form a massive disk galaxy like the Milky Way had failed, because the simulated galaxies ended up with huge central bulges compared to the size of the disk

The paper has been submitted to the Astrophysical Journal and has been accepted for publication. The simulation is remarkably close to the Milky Way; the authors call it ‘Eris’. Take a look at how close it is to the actual thing in the picture below:

Comparison between the Eris simulation and the Milky Way Galaxy

Cold Dark Matter

The simulation is important for the support it lends to the ‘cold dark matter’ theory of cosmology. Dark matter is a hypothesis used to explain the rotation of galaxies, among many other things like the Cosmic Microwave Background Radiation. The amount of matter we see in a galaxy cannot provide enough gravitation to hold the spinning galaxy together, so scientists postulated the presence of another type of matter, one which does not interact with other matter at all except to provide the necessary gravitational pull. Since it doesn’t interact and cannot be ‘seen’, it is called dark matter. There are many models for dark matter too. One of them involves particles moving at low speeds, i.e. ‘cold’ particles.

The key to the success of this team, where many previous attempts had failed, was the correct simulation of the star-formation process in real galaxies. Star formation happens in clumpy gas clouds scattered across the galaxy. These pockets are supported by dark matter: dark matter halos create gravitational wells, regions where the gravitational potential is low. These are the regions where matter can settle, and they are the hotbeds of gas clouds.

What took nine months?

The team’s remarkable success comes from the resolution they could achieve. Resolution here means tracking a very large number of individual stars and simulating their interactions with one another, both extremely tough jobs. NASA’s Pleiades supercomputer and other supercomputers at UC Santa Barbara and the Swiss National Supercomputing Centre came to the rescue, but even together they took nine months to process the data.

Simulations are always satisfying since they assure us that what we know is not wrong. This one is a strong case in point.

Speculation: NSA Building Exaflop Supercomputer?

The United States Government’s National Security Agency (aka the where-privacy-goes-to-die agency) is apparently building a new supercomputer for its High Performance Computing Centre. The supercomputer will cost about $895.6 million, as revealed by unclassified documents. It is to be built at the agency’s headquarters in Fort Meade, Md. and is slated for completion by 2015.

NSA

The NSA is a surveillance organization (to use a nonspecific and broad generalization) that has been operating since 1952 and is responsible for the decryption of foreign intelligence and the safeguarding and encryption of the USA’s domestic signals. The agency has a history of using supercomputers, starting with the purchase of one of the first Cray supercomputers (the Cray X-MP/24), which is now decommissioned and on display at the National Cryptologic Museum.

While exactly how large the computer the NSA is building will be is unknown, it is very likely that it will be able to perform at 1 exaFLOPS. A FLOP is a FLoating-point OPeration, and FLOPS (FLOPs per second) is a measure of how fast a computer is: the number of floating-point calculations it performs per second. A simple hand-held calculator needs only about 10 FLOPS to show its results near-instantaneously.

An exaFLOPS is 10^18 FLOPS, i.e. a 1 followed by 18 zeroes.

In comparison, the combined computing power of the top 500 supercomputers in the world is about 32.4 petaFLOPS (32.4 x 10^15). That is, the new supercomputer being constructed by the NSA would be about 31 times faster than all of the top 500 supercomputers in the world taken together.

However, all this is still speculation, extrapolated from the power requirement for the new computer: about 60 megawatts. The calculation is based on IBM’s Sequoia BlueGene/Q supercomputer, also under construction, which performs at around 20 petaFLOPS and needs 6 megawatts of power.
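The scaling behind that speculation can be made explicit. The short sketch below works through the numbers under stated assumptions (performance scales roughly with power draw, and energy efficiency improves between BlueGene/Q and 2015); none of these figures come from NSA documents.

```python
# Rough power-scaling estimate behind the exaFLOPS speculation.
sequoia_flops = 20e15  # ~20 petaFLOPS
sequoia_power = 6e6    # 6 MW
nsa_power = 60e6       # ~60 MW reported power budget

flops_per_watt = sequoia_flops / sequoia_power   # ~3.3 GFLOPS per watt
same_efficiency = flops_per_watt * nsa_power     # ~200 petaFLOPS
print(f"at Sequoia's efficiency: {same_efficiency / 1e15:.0f} PFLOPS")

# Reaching 1 exaFLOPS within the same 60 MW would need roughly this much
# improvement in FLOPS per watt over BlueGene/Q:
print(f"efficiency gain needed: {1e18 / same_efficiency:.0f}x")

# For scale, the top 500 machines together manage about 32.4 PFLOPS:
print(f"1 EFLOPS vs top-500 total: {1e18 / 32.4e15:.0f}x faster")
```

In other words, 60 MW at Sequoia-class efficiency buys about 200 petaFLOPS; the exaFLOPS figure assumes roughly a five-fold improvement in efficiency by 2015.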

Of course, the NSA needs more computing power to sift through all the emails, phone calls and messages we send each day, right?

How to Build Your Own IBM Watson

If you have been following the news keenly, you might know that the IBM-built Watson artificial-intelligence supercomputer recently trounced two of game show Jeopardy!‘s most famous contestants. Developed under researcher David Ferrucci, Watson beat Brad Rutter (winner of the biggest all-time money total on the show) and Ken Jennings (record holder for the longest championship streak) in a two-game combined-point match. All this is known to you, reader of Techie Buzz. But have you felt the urge to build your own supercomputer in your basement, provided you have enough money for the enterprise?

Watson's avatar

If you did, then you have come to the right place. Tony Pearson, Master Inventor and Senior Managing Consultant for the IBM System Storage product line at the IBM Executive Briefing Center in Tucson, Arizona, has written an easy-to-follow article on how to build your own version, called “Watson Jr.”. The system incorporates 3 host servers (as opposed to the senior Watson’s 90 servers) and will have approximately 1 terabyte of storage in total. It might not be as fancy as the Jeopardy!-winning artificial intelligence, but it does promise to be a fun summer project if you have enough time and money to invest (and, of course, a large enough space to house the computer in).

The basic requirements for this computer are as follows:

  • Three x86 hosts, with the following:
    • 64-bit quad-core processor, either Intel-VT or AMD-V capable,
    • 8GB of DRAM, or larger
    • 300GB of hard disk, or larger
    • CD or DVD Read/Write drive
    • 1GbE Ethernet
  • Computer Monitor, mouse and keyboard
  • Ethernet 1GbE 4-port hub, and appropriate RJ45 cables
  • Surge protector and Power strip
  • Local Console Monitor (LCM) 4-port switch (formerly known as a KVM switch)

I can see your hands itching to get to the meat of it, so do check out the tutorial on IBM’s website before going shopping!

Watson Supercomputer Pwns Humans in Jeopardy!

It’s all over now. The massive IBM computer named Watson has destroyed its two human opponents in the last round of this epic three-day Jeopardy! battle.

Watson and its two human opponents

This adds more weight to the belief that computers will soon be running the world and humans will be obsolete. Watson is made up of 100 IBM Power 750 server units and 15 terabytes of onboard RAM.

Watson's server racks

Here’s what an IBM blogger said about the final moments of the game.

The contest between man and machine on Jeopardy! was decided when IBM’s Watson computer landed on the second Daily Double on day three. The clue was: “This two-word phrase means the power to take private property for public use as long as there is just compensation.” Watson’s response: “What is eminent domain?”

The answer, eminent domain, also describes the status of Watson’s tally versus his merely human opponents. Watson scored $77,147, compared to $24,000 for Ken Jennings and $21,600 for Brad Rutter. Jennings admitted defeat as he wrote the following under his final answer:

I for one welcome our new computer overlords.

Here’s a video from day 2 of the show.

So what’s next for computers? They already beat us at chess and Jeopardy. Will a computer win the Indy 500 next? It’s possible, since Google has already developed cars that drive themselves.

I’m guessing that the only way to beat these smart machines is to join them. Are you ready to become Human 2.0 ?

Watson Supercomputer Ties in First Jeopardy Game Against Humans

Watson's avatar

Valentine’s Day wasn’t all hearts and flowers, as a massive IBM computer faced off against two human opponents on the popular Jeopardy! TV game show. IBM has been working on a computer that not only understands human language, but can answer riddles based on its ability to search a huge database and determine the most likely answer.

The computer is called Watson and is made up of 100 IBM Power 750 server units and 15 terabytes of onboard RAM. Here’s what IBM said about Watson, just before the first game:

After seven years of research and planning, thousands of hours of testing and over fifty champion-level sparring matches, Watson is finally ready to face the two greatest Jeopardy! champions in history – Ken Jennings and Brad Rutter.

This contest is scheduled to run three days. After the first day, Watson has won $5,000 and is tied for first place with Brad Rutter, with Ken Jennings in third place at $2,000. At the end of the show, it was obvious that Watson is bright and fast, but he has a few flaws that kept him from running away with a big win.

Here’s a video of the first show. It may not be online long, because it was put up by someone without permission from Jeopardy.

There’s no doubt now, that computers are getting smarter. Many people believe that by 2045, computers will be running the world, and humans will be obsolete. Due to recent discoveries in electronics, I’d be tempted to put the date at 2040. Move aside Big Brother, it’s only a matter of time before Skynet wakes up.