SDSC to venture capitalists: data-intensive supercomputing is here
By Jan Zverina | 23 Apr 2011
The exponentially increasing amount of digital information, along with new challenges in storing valuable data and massive datasets, is changing the architecture of today's newest supercomputers as well as how researchers will use them to accelerate scientific discovery.
"Digital data is advancing at least as fast as, and probably faster than, Moore's Law," says Michael Norman, director of the San Diego Supercomputer Center (SDSC) at the University of California, San Diego (UCSD), referring to the computing-hardware observation that the number of transistors that can be placed inexpensively on an integrated circuit doubles approximately every 18 months.
According to Norman, the amount of digital data generated just by instruments such as DNA sequencers, cameras, telescopes, and MRIs is now doubling every 18 months.
"But I/O (input/output) transfer rates are not keeping pace – that is what SDSC's supercomputers are designed to solve," he says.
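As a rough, hedged illustration of that mismatch (the starting dataset size, bandwidth, and the 20-percent-a-year bandwidth improvement below are assumptions made for this sketch, not figures from SDSC), a few lines of Python show how the time to read a dataset once keeps growing when the data doubles every 18 months but I/O bandwidth improves more slowly:

```python
# Back-of-the-envelope sketch: data doubling every 18 months vs. slower I/O growth.
# All starting values and growth rates here are illustrative assumptions,
# not SDSC or Gordon figures.

data_tb = 100.0            # assumed dataset size today, in terabytes
bandwidth_tb_per_hr = 1.0  # assumed sustained read bandwidth today, TB/hour

for year in (0, 2, 4, 6):
    months = year * 12
    grown_data = data_tb * 2 ** (months / 18)     # doubles every 18 months
    grown_bw = bandwidth_tb_per_hr * 1.2 ** year  # assumed ~20%/year improvement
    hours = grown_data / grown_bw
    print(f"year {year}: {grown_data:8.1f} TB at {grown_bw:4.2f} TB/hr "
          f"-> {hours:6.1f} hours for one full read")
```

Even with a generous assumption about bandwidth growth, a single pass over the data takes longer every year – that widening gap is the problem Norman describes.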
SDSC, a key resource for researchers at UCSD, across the UC system, and nationwide, will later this year deploy a new data-intensive supercomputer named Gordon, the first high-performance supercomputer to use large amounts of flash-based solid-state drive (SSD) memory.
Flash memory is common in smaller devices such as mobile phones and laptop computers, but rare in supercomputers, which generally use slower spinning-disk technology.
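To make the flash-versus-disk difference concrete, the sketch below compares the time to service a batch of random reads using commonly cited ballpark latencies (roughly 10 ms per seek for a spinning disk versus 0.1 ms for flash – both generic textbook estimates, not Gordon's measured numbers):

```python
# Rough comparison of random-access time: spinning disk vs. flash SSD.
# The latencies are generic ballpark figures, not Gordon's specifications.

DISK_SEEK_SECONDS = 0.010    # ~10 ms per random read on a spinning disk
FLASH_READ_SECONDS = 0.0001  # ~0.1 ms per random read on flash

def random_read_time(num_reads: int, latency_s: float) -> float:
    """Total time to service num_reads independent random reads, in seconds."""
    return num_reads * latency_s

reads = 1_000_000  # e.g., a job touching a million scattered records
print(f"disk:  {random_read_time(reads, DISK_SEEK_SECONDS):8.0f} s")
print(f"flash: {random_read_time(reads, FLASH_READ_SECONDS):8.0f} s")
```

Under those assumptions the flash system finishes in minutes what the disk system needs hours to do – the kind of random-access advantage a data-intensive machine like Gordon is built to exploit.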