September 1, 2000 1:10 PM PDT
Buddy, can you spare some processing time?
Companies are springing up with business models that involve farming out computing tasks to hundreds or thousands of computers on the Internet or private networks. Biotechnology and financial companies already are showing an interest in the technique, called "distributed computing."
Distributed computing isn't a new idea, as shown by the popularity of SETI@home, a screensaver program that processes radio telescope signals to search for extraterrestrial communications.
Now the profit motive is entering the picture:
Parabon Computation, a 45-person company in Fairfax, Va., will begin paying people this fall to let their computers work on problems for biotechnology, financial and pharmacology research, said chief executive Steve Armentrout, who in his last job ran computer simulations for financial companies. The pay should be enough to allow average people to trim their Internet access charges, he said.
United Devices, founded by SETI@home creator David Anderson, has secured $13 million in venture capital funding from Softbank Venture Capital, Oak Investment Partners and others.
Entropia has secured $7 million in funding from Mission Ventures and Silicon Valley Bank's San Diego Technology Group.
Popular Power hasn't started paying people yet but already has embarked upon researching HIV, the virus that leads to AIDS. Brian Behlendorf, a co-founder of the open-source Apache Web server project, is a seed investor in the company.
Applied MetaComputing has government customers such as NASA and the Defense Department as well as clients from the Fortune 500.
Finally, TurboLinux sells a product called EnFuzion that lets corporations farm out computing tasks across their computers using any operating system. Customers using EnFuzion so far include financial firms J.P. Morgan and AMP Assets, Procter & Gamble for computer-aided design, and Motorola for chip design, said TurboLinux spokesman Jacob Webb.
All these companies see an opportunity because of two basic facts: Computer CPUs spend a lot of time idle, and the Internet has provided the means to harness that untapped power. Indeed, Intel last week advocated distributed computing as one justification for buying its upcoming Pentium 4 chips.
Intel said it saved $500 million during the past decade using distributed computing instead of large central computers to simulate and validate chip designs.
"We are consistently accelerating our chip schedule because of these mammoth computing resources," Intel vice president Pat Gelsinger said at an Intel conference last week. "We sped up validation on (our) latest chip by eight weeks."
But there are hurdles to this collective computing technique. For one, companies will have to entrust their data and calculations to outsiders.
"People aren't just going to put their $30 billion system on the Net and say, 'Here, have fun,'" Andrew Grimshaw, president of Applied MetaComputing, said at the Intel conference.
Distributed computing, however, also works within a company's own private network, sidestepping that concern.
Not for everyone
Not all computing tasks are amenable to the distributed technique, however. For example, Conoco evaluated but rejected distributed computing for reconstructing 3-D maps of the Earth's interior, said Alan Huffman, manager of Conoco's seismic imaging technology center.
Distributed computing isn't a good fit in this case because so much data must be digested, and distributed systems often lack the bandwidth to ferry information to and from the computers doing the actual calculations. As a result, Conoco built its own supercomputer instead.
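The tradeoff Conoco ran into can be captured with a back-of-envelope calculation: a task is only worth distributing if each node spends more time computing than downloading its data. The sketch below illustrates the idea; the bandwidth and compute figures are illustrative era-appropriate assumptions, not Conoco's actual numbers.

```python
def worth_distributing(data_mb, flops_per_byte, link_mbps=0.056, node_mflops=500):
    """Return True if a node spends more time computing than downloading.

    data_mb        -- input data shipped to one node, in megabytes
    flops_per_byte -- arithmetic operations performed per byte of input
    link_mbps      -- node's network bandwidth (56 kbps dial-up default)
    node_mflops    -- node's sustained compute rate, in megaflops
    """
    transfer_s = data_mb * 8 / link_mbps                 # seconds to ship the data
    compute_s = data_mb * flops_per_byte / node_mflops   # seconds to crunch it
    return compute_s > transfer_s

# A compute-heavy search (little data, lots of math per byte) distributes well:
print(worth_distributing(data_mb=1, flops_per_byte=1_000_000))   # True
# Seismic imaging ships huge volumes with modest math per byte,
# so the network link, not the CPU, becomes the bottleneck:
print(worth_distributing(data_mb=5_000, flops_per_byte=100))     # False
```

SETI@home-style signal searches sit at one extreme of this ratio; 3-D seismic reconstruction sits at the other, which is why the same technique suits one and not the other.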
Another hitch with distributed computing is installing the calculation software on the myriad machines participating. It's hard enough to get people to install software, but it's even harder to get them to swap it for new software when a company wants to change from, say, designing a chip to analyzing the derivatives market.
Parabon gets around this by installing only one piece of software, a generic computing engine called Frontier that is capable of performing a variety of tasks. The company has a Windows version of the software and will release a Linux version next month and Mac and Unix versions later. People donating CPU time get paid based on how much calculating their computers accomplish.
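The "generic engine" idea described above can be sketched in a few lines: one resident program that accepts whatever task code the server sends, runs it on the accompanying data, and counts finished work units as the basis for payment. Frontier's actual protocol is proprietary and not described in the article, so every name below is hypothetical.

```python
import types

class Engine:
    """Toy stand-in for a generic compute engine like the one Parabon describes."""

    def __init__(self):
        self.completed = 0  # finished work units, the basis for payment

    def run_task(self, source_code, payload):
        """Load task code shipped from the server and execute it on payload."""
        module = types.ModuleType("task")
        exec(source_code, module.__dict__)   # the shipped task must define compute()
        result = module.compute(payload)
        self.completed += 1
        return result

# The server can ship an entirely different computation each time,
# with no reinstall on the donor's machine:
engine = Engine()
prime_task = (
    "def compute(n):\n"
    "    return [i for i in range(2, n) if all(i % d for d in range(2, i))]"
)
print(engine.run_task(prime_task, 20))       # [2, 3, 5, 7, 11, 13, 17, 19]
sum_task = "def compute(xs):\n    return sum(xs)"
print(engine.run_task(sum_task, [1, 2, 3]))  # 6
print(engine.completed)                      # 2
```

This is what lets a single installed client switch from, say, chip simulation to derivatives analysis: only the task definition changes hands, never the engine itself. (A production engine would also need sandboxing and result verification, which this sketch omits.)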
One sponsor so far is pharmaceutical company SmithKline Beecham, which is funding West Virginia University research into connections between chemotherapy drugs and patient lifestyle choices such as rest and exercise. "They shelved the problem until we came around," Armentrout said.
Another project Parabon soon will launch is for the University of Maryland, which plans to use Parabon's network for analyzing protein folding--a key part of the link between genes and biochemistry. IBM has launched a $100 million, multiyear effort called "Blue Gene" to design a single centralized computer for similar research.
"People in organizations need burst power, but computers deliver sustained power. What an intermediary can do is help bridge that gap," Armentrout said. "We're giving clients the ability to buy variable power on demand."