August 27, 2001 5:10 PM PDT
Intel delves into pervasive computing
The Santa Clara, Calif.-based chip giant will increasingly focus its research on "proactive computing," or the creation of embedded mini-computers that obtain sensory data from the physical world and shuttle it across networks, David Tennenhouse, vice president and director of Intel Research, said during a speech Monday at the Intel Developer Forum here.
The heart of these networks will be microelectromechanical systems (MEMS), tiny machines that combine sensing with self-organizing networking and, in many cases, independent storage. MEMS already exist in antilock brakes and air bags. In the future, however, MEMS will be attached to people to monitor skin lesions or sewn into clothing to track people in case they get lost. Sensors dropped on a forest fire will be able to form an ad-hoc network and report where the fire is burning most fiercely.
"We are working toward the point where computers are acting in advance and anticipating our needs," he said.
The project won't be an in-house effort. The company has kicked off a project to create branches of Intel Research, the company's R&D unit, at engineering universities. Earlier this summer, the company opened a research lab in conjunction with the University of Washington to study so-called ubiquitous computing.
A branch for studying "extremely" networked systems, or networks containing numerous nodes that stretch over small and large geographic areas, has just started at the University of California at Berkeley. In September, another branch of the lab will be set up at Carnegie Mellon University in Pittsburgh to study widely distributed storage, according to sources at the company.
Another five to eight satellite labs will be set up next year, sources said. Intel spends about $50 million a year on university research but had not previously set up satellite labs of this sort.
The driving force behind MEMS lies in information overload, said Tennenhouse. Simply put, the amount of data is far outstripping people's ability to manage it.
"Not only are we the input/output devices for (computers and handhelds), we are the chauffeurs," he said, adding that machines "either work for us or we work for them."
Micromachines will rein in the data flood by gathering information directly from the physical world and delivering it in real time, when it is wanted. Humans won't be needed for data input.
Conceivably, the micromachines can be placed in any environment. The University of Washington, for instance, is planting networked sensors along the Juan de Fuca tectonic plate in the northern Pacific Ocean to study earth movements as part of the Neptune project. And NASA's Jet Propulsion Laboratory put four nodes on the surface of Mars.
Another advantage of micromachines is that they do not have to be on constantly; running them nonstop would itself create a flood of data. Although civil engineers might embed sensors in a roadway, researchers don't need to activate them into ad-hoc networks until, for example, an earthquake strikes.
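That event-triggered pattern can be sketched in a few lines. This is a hypothetical illustration, not Intel's design: the `SensorNode` class, its threshold, and the sample readings are all invented for the example. The node stays dormant and silent until a reading crosses its event threshold, and only then begins reporting.

```python
# Hypothetical sketch of an event-triggered sensor: dormant until a
# reading crosses a threshold, then awake and reporting. All names and
# values here are illustrative, not from Intel's research.

class SensorNode:
    def __init__(self, threshold):
        self.threshold = threshold
        self.awake = False
        self.reports = []

    def sample(self, reading):
        # Stay asleep (and silent) while readings are below the threshold.
        if not self.awake and reading >= self.threshold:
            self.awake = True  # event detected: start participating
        if self.awake:
            self.reports.append(reading)

node = SensorNode(threshold=5.0)
for reading in [0.1, 0.3, 0.2, 6.4, 7.1]:  # quiet road... then a quake
    node.sample(reading)

print(node.reports)  # → [6.4, 7.1]; pre-event readings were never sent
```

The point of the design is that the network produces no traffic at all until something worth reporting happens.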
Although these machines perform functions similar to what ordinary network nodes do now, Tennenhouse predicts that how they are used will alter how researchers think of computing. Currently, computing is largely deterministic: Data goes in, and an exact answer comes out.
In the future, statistical probability will become increasingly important. Machines will anticipate the type of data a person will require. Some search engines have already started down this path by pre-caching the links a reader is most likely to follow for a given query, thereby cutting access time.
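A minimal sketch of that pre-caching idea, with invented click probabilities (the article names no specific engine or algorithm): rank a query's result links by estimated likelihood of being followed, and fetch the top few ahead of time.

```python
# Hypothetical sketch of probability-driven pre-caching. The link names,
# probability estimates, and fetch budget are all made up for illustration.

def precache(link_probs, budget):
    """Return the links worth fetching in advance, most likely first."""
    ranked = sorted(link_probs.items(), key=lambda kv: kv[1], reverse=True)
    return [link for link, prob in ranked[:budget]]

# Estimated probability that the reader follows each result link.
estimates = {"results/1": 0.55, "results/2": 0.25,
             "results/3": 0.15, "ads/1": 0.05}

print(precache(estimates, budget=2))  # → ['results/1', 'results/2']
```

The answer is a bet, not a certainty, which is exactly Tennenhouse's point: the system acts in advance on what is probable rather than waiting for an exact request.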
In addition, people will have less control over the data being input because it will come from sensors in the wild. In other words, answers will no longer be exact, but likely.
"We are going to start using statistical methods much more," Tennenhouse said. "Computer scientists may be on the verge of what physicists went through in the '20s."
David Culler, a professor at U.C. Berkeley, said there is still a lot to learn.
"We're only beginning to understand how these low-power networks work," Culler said during a speech at the developer forum. "It is really important that the network assemble itself." On Monday, Culler and a group of students created an 800-seat ad-hoc network by having forum attendees help activate sensors under their chairs.
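The self-assembly Culler describes can be sketched at its simplest: each node discovers whichever neighbors fall within its radio range, and the network is just the union of those local links, with no central coordinator. The node positions and radio range below are invented; real sensor networks must also handle routing, collisions, and power budgets.

```python
# Hypothetical sketch of ad-hoc self-assembly: links form purely from
# local proximity. Seat labels, positions, and range are illustrative.

import math

def assemble(positions, radio_range):
    """Return each node's neighbor set, discovered locally by distance."""
    links = {node: set() for node in positions}
    for a, (ax, ay) in positions.items():
        for b, (bx, by) in positions.items():
            if a != b and math.hypot(ax - bx, ay - by) <= radio_range:
                links[a].add(b)
    return links

seats = {"A1": (0, 0), "A2": (1, 0), "B1": (0, 1), "Z9": (10, 10)}
print(assemble(seats, radio_range=1.5))
```

Adjacent seats link up automatically; the distant node stays isolated until something drifts into range. "The network assembles itself" means exactly this: topology emerges from where the nodes happen to sit.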
Financially, the growth of these devices could be a boon for Intel and the PC industry. The sensor market itself will generate growth, but more important for the industry, the data explosion will spur demand for servers and networking equipment.
The traffic created by these devices, Tennenhouse said, could create a "100-times increase above and beyond the growth of the Internet today."