High-energy physicists have pushed the limits of network data transfer to mind-boggling speeds. Researchers attending the Supercomputing 2011 (SC11) conference, held at the Seattle Convention Center, transferred data in opposite directions to reach a combined rate of 186Gbps over a Wide Area Network. For those not quite familiar with the terminology, a Wide Area Network is typically defined as a network spanning long geographic distances, such as between a main office and a branch office in another state. Typical Wide Area Network links range from 1.544Mbps T1 or DSL connections to 10Mbps fiber or cable. As you can see, 186Gbps blows those standard network speeds away.
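To put "blows away" in perspective, here is some quick back-of-envelope arithmetic (my own ratios, not figures from the demonstration) comparing the 186Gbps demo rate to the everyday WAN links mentioned above:

```python
# Rough comparison of the SC11 demo rate against typical WAN link speeds.
# These ratios are illustrative arithmetic, not measurements from the demo.
demo_bps = 186e9      # 186 Gb/s combined bidirectional rate
t1_bps = 1.544e6      # T1 line rate
cable_bps = 10e6      # a 10 Mb/s fiber/cable link

print(round(demo_bps / t1_bps))     # ~120,000x faster than a T1
print(round(demo_bps / cable_bps))  # 18,600x faster than a 10 Mb/s link
```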
The small team of researchers consisted of members from Caltech and the University of Victoria. In the first demonstration, the team transferred data from hard disks located at the University of Victoria down to the show floor at more than 60Gbps, which is thought to be a record in its own right. They then transferred data from memory to memory at 98Gbps, and were able to sustain this transfer to reach a combined bidirectional rate of 186Gbps. University of Victoria professor and LHC physicist Randall Sobie said:
The 100Gb/s demo at SC11 is pushing the limits of network technology by showing that it is possible to transfer peta-scale particle physics data samples in a matter of hours to anywhere around the world.
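A quick sanity check shows why "a matter of hours" is plausible. Assuming a sustained 100Gb/s line rate (the nominal rate from the quote, not a measured average), moving a petabyte works out to roughly a day:

```python
# Back-of-envelope: time to move one petabyte at a sustained 100 Gb/s.
# Assumes full line-rate utilization, which real transfers rarely achieve.
petabyte_bits = 1e15 * 8        # 1 PB expressed in bits
link_rate_bps = 100e9           # 100 Gb/s

seconds = petabyte_bits / link_rate_bps
hours = seconds / 3600
print(f"{hours:.1f} hours")     # about 22 hours at full line rate
```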
Canada’s Advanced Research and Innovation Network (CANARIE) and BCNET, a non-profit shared IT services organization, constructed the production-grade network used to transmit the data. The transfer itself was done using FDT (Fast Data Transfer), an open-source application developed by Caltech. One of the remarkable accomplishments was that all of the transmitted data was received on only four pieces of equipment on the show floor; this type of data transfer would have required dozens of servers just a few years ago. The video below describes the technology used to make this landmark data transfer happen.
One of the factors driving the need for this kind of speed is the Large Hadron Collider (LHC) at CERN. The amount of data being collected at the LHC is growing rapidly, so it is becoming increasingly important to find ways to transport that data worldwide at faster speeds. “Enabling scientists anywhere in the world to work on the LHC data is a key objective, bringing the best minds together to work on the mysteries of the universe,” says David Foster, the deputy IT department head at CERN. Hopefully this new technology will lead to innovations that make data sharing in the scientific community a little easier.
According to the Caltech press release, “the key to discovery, the researchers say, is in picking out the rare signals that may indicate new physics discoveries from a sea of potentially overwhelming background noise caused by already understood particle interactions. To do this, individual physicists and small groups located around the world must repeatedly access—and sometimes extract and transport—multiterabyte data sets on demand from petabyte data stores.” In case you’re wondering, that amount of data is equivalent to hundreds of Blu-ray movies.
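The Blu-ray comparison checks out with simple arithmetic. Taking a representative 10TB data set (my own example figure for "multiterabyte") and the 25GB capacity of a single-layer Blu-ray disc:

```python
# Rough check of the "hundreds of Blu-ray movies" comparison.
# The 10 TB data set size is an illustrative assumption, not a quoted figure.
blu_ray_gb = 25                 # single-layer Blu-ray capacity in GB
dataset_tb = 10                 # a representative multiterabyte data set

discs = dataset_tb * 1000 / blu_ray_gb
print(discs)                    # 400.0 -> indeed hundreds of discs
```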
More information can be found at http://supercomputing.caltech.edu. You may also want to read more about CERN’s research at the following: