Simultaneously, jet stream-influenced currents moved north from Spain to Great Britain, while ocean surface temperatures all along the U.S. West Coast climbed rapidly. Soon, most of Asia was swimming in bathwater; a few seconds later, cold water began to inch away from the poles.
Watanabe, vice president of high performance computing at NEC, was showing a graphical recreation of changes in global ocean surface temperatures modeled by the Earth Simulator, the massive supercomputer created by the company and various agencies in the Japanese government.
His presentation at this week's Hot Chips conference at Stanford University also included a model of worldwide precipitation over a 16-day period. Huge swaths of clouds blanketed the southernmost and northernmost portions of the globe. Those two small clouds near Taiwan? A twin typhoon captured in the data sweep.
No wonder the U.S. federal government is freaked out by this machine.
The value of many technological achievements--Bluetooth, the Better Pasta Pot, Web sites that tout "my" personalization services--remains questionable, but it's tough not to be awed by supercomputers. During World War II, researchers at Bletchley Park in England built Colossus, a huge vacuum-tube computer, to crack supposedly unbreakable German ciphers.
Current supercomputers simulate nuclear explosions or study the airflow over Pringles potato chips so the chips won't crumble or fly off the assembly line.
The scope and complexity of supercomputers remains daunting. The Earth Simulator itself occupies a three-story building. The first floor houses power supplies and air conditioners. A truncated second floor contains the lion's share of the cabling. ("We had a very hard time connecting the wires," Watanabe said.)
The third floor contains the computing elements--5,120 custom-made processors, 10 terabytes of memory, 640 terabytes of disk storage and a 1.5 petabyte (1.5 quadrillion-byte) supplemental storage system.
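Those figures are easier to compare with a quick back-of-the-envelope conversion (decimal units, as the article's "1.5 quadrillion-byte" gloss implies):

```python
# Sanity check on the Earth Simulator storage figures quoted above.
# Decimal (SI) units are assumed, matching "1.5 quadrillion bytes".
TB = 10**12   # terabyte
PB = 10**15   # petabyte

memory = 10 * TB      # main memory
disk = 640 * TB       # disk storage
supplemental = 1.5 * PB

print(int(supplemental))        # 1500000000000000 bytes
print(supplemental / disk)      # supplemental store is ~2.34x the disk array
print(disk / memory)            # disk holds 64x the memory capacity
```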
Because of the massive amount of data involved, it can take two to three days to process a model such as an active map of the ocean floor, Watanabe said.
NEC's competitors, though, aren't standing still. A number of supercomputers coming in the next few years promise to rival the Earth Simulator in performance and to bring new concepts to the forefront.
How? In some of these machines, customized processors will depend on asynchronous logic. In current chips, the transistors all react at the same time to the tick of a clock--sort of like the cast of "Riverdance." That lockstep leads to excessive energy consumption and the computer equivalent of busywork. In asynchronous chips, circuits fire only when required.
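The difference can be sketched in software. The toy simulator below (all names illustrative, not any vendor's design) evaluates the same two-gate network two ways: a clocked loop where every gate recomputes on every tick, and an event-driven loop where a gate recomputes only when one of its inputs actually changes--the rough software analogue of asynchronous logic:

```python
# Toy contrast between clocked and event-driven logic evaluation.
# netlist: gate name -> (function, input wire names, output wire name)
def make_netlist():
    return {
        "g1": (lambda a, b: a & b, ("x", "y"), "p"),
        "g2": (lambda a, b: a | b, ("p", "z"), "q"),
    }

def run_clocked(netlist, stimuli):
    """Every gate evaluates on every clock tick, changed inputs or not."""
    wires = {"x": 0, "y": 0, "z": 0, "p": 0, "q": 0}
    evals = 0
    for tick in stimuli:                  # each tick may change some inputs
        wires.update(tick)
        for fn, ins, out in netlist.values():
            wires[out] = fn(*(wires[w] for w in ins))
            evals += 1                    # busywork even when nothing changed
    return wires["q"], evals

def run_event_driven(netlist, stimuli):
    """A gate evaluates only when one of its inputs actually changed."""
    wires = {"x": 0, "y": 0, "z": 0, "p": 0, "q": 0}
    evals = 0
    for tick in stimuli:
        changed = {w for w, v in tick.items() if wires[w] != v}
        wires.update(tick)
        while changed:                    # propagate only real changes
            wire = changed.pop()
            for fn, ins, out in netlist.values():
                if wire in ins:
                    new = fn(*(wires[w] for w in ins))
                    evals += 1
                    if new != wires[out]:
                        wires[out] = new
                        changed.add(out)
    return wires["q"], evals

stimuli = [{"x": 1}, {}, {}, {"y": 1}, {}, {}]   # inputs are mostly idle
net = make_netlist()
print(run_clocked(net, stimuli))       # (1, 12): same answer, 12 evaluations
print(run_event_driven(net, stimuli))  # (1, 3): same answer, 3 evaluations
```

With mostly idle inputs, both versions compute the same result, but the event-driven run does a fraction of the work; real asynchronous hardware exploits the same idea to cut power.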
Cray, meanwhile, is trying to break performance barriers while reducing overall costs with Red Storm, a U.S. Department of Energy project that's expected to go live in the second half of 2004. Red Storm is designed to contain 10,368 Opteron processors. One of its unusual features is that the computing nodes contain only one processor rather than the two or more processors supercomputers typically have.
"Multiprocessing nodes were more expensive, so we went with uniprocessor nodes," said Robert Alverson, Red Storm's hardware architect. Chips for different functions are also being combined to reduce costs. Cray is studying how to expand Red Storm to 30,000 processors.
Over time, supercomputer functionality migrates toward the desktop. The average Palm handheld has more computing power than most universities had 50 years ago. But that trickle-down takes years, so don't expect to be ruling the waves anytime soon.
Michael Kanellos is editor at large at CNET News.com, where he covers hardware, research and development, start-ups and the tech industry overseas. He has worked as an attorney, travel writer and sidewalk hawker for a time share resort, among other occupations.