A movement to build an overlay network called PlanetLab atop the Internet is mushrooming in the same heady collaborative spirit that propelled the open-source Linux operating system. At the same time that PlanetLab is thinking big, another movement is burrowing to ground level, attempting to extend the Internet downward into a fine-grained, ubiquitous network of sensors and actuators.
Since the dot-com bust and the wider economic slowdown, there has been much speculation about a new killer app to reignite the technology sector's fortunes. A conference on emerging technologies held recently at MIT suggested that elusive agent may be hiding in plain sight in the form of the Internet. Intel Corp., which is especially active in this pursuit, described a two-pronged plan to retrofit the Internet by means of PlanetLab and a network composed of miniature hardware called Motes. If they come to fruition, the two ideas could radically reshape both the electronics industry and consumer products.
A year ago, Intel donated 100 machines to set up a network of nodes that researchers could use as a testbed to guide the architecture of the overlay network. The network now comprises 160 machines at 65 sites in 16 countries, supporting 70 PlanetLab research projects in areas such as distributed storage, peer-to-peer systems and network mapping. In June, Hewlett-Packard Co. announced it would contribute to the project.
PlanetLab seeks to re-create the Internet in the form of a distributed, planetwide parallel processor. Users would be allocated a slice of the system on which to perform any computer function now residing on their desktop or portable PC. Everything except the modem and interface would be swept away by this system, which would let users collaborate in much more tightly coupled work groups than is possible now. All software and data would be on the Internet, instantly available for interaction. Hardware, software and data incompatibilities that now plague collaborative research would be eliminated.
Meanwhile, the idea of adapting RFID technology to disperse low-cost wireless sensor and actuator networks throughout natural and urban environments would provide a platform for a range of new applications, Intel said. Environmental monitoring of contaminants, biological research into fauna populations, and agricultural and building-integrity monitoring are a few of the possibilities. Indeed, one project is already monitoring waterfowl in Maine.
The combination of PlanetLab and the new, fine-grained Internet based on distributed sensors would propel technology into a fundamentally new phase, said Intel's director of research, David Tennenhouse, who summarized computing's arc at the MIT conference. "With PlanetLab, you would have fingers of your program strung around the world: millions of users simultaneously using the same distributed system, all with millisecond response times to your program," he said. "Alternatively, you could have fine-grained sensors and actuators strung around the world, all within milliseconds of your program."
Either system would offer unprecedented real-time information processing at a fine-grained scale that would strain the current interactive-computer paradigm to the breaking point. So the new Internet will force a basic rethinking of computers, how they are used and what their ultimate role in society will be. Tennenhouse even sees the CPU taking on such functions as learning and speculation.
In this vision, the next-generation distributed computer would take in vast quantities of real-time information, compare it with past experience and offer the user probabilistic suggestions on its meaning and on where the user should direct investigations. The paradigm would represent a fundamental shift for computer-savvy researchers and casual users alike. Accustomed to a deterministic machine, both experts and nonexperts could find the new, stochastic model a challenge, Tennenhouse said.
The initial push for PlanetLab is to have around 1,000 nodes connected to all of the Internet's regional and long-haul backbones. The PlanetLab consortium is encouraging institutions and individuals to join in the development process. Each participant gets a Linux-based software package that performs such basic functions as bootstrapping nodes and managing user accounts. The node software implements a distributed virtual computing environment, called a "slice," which controls networkwide hardware resources. The virtual environment can run on machines distributed across the network, so that the user's individual node is just a small part of the available computing resource.
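The slice idea can be illustrated with a minimal sketch: a slice reserves a small share of many nodes at once, so the user's program runs networkwide rather than on any one machine. The class names, hostnames and allocation scheme below are invented for illustration; they are not the actual PlanetLab management software.

```python
# Illustrative sketch of the "slice" abstraction (hypothetical API,
# not the real PlanetLab node-manager tools).

class Node:
    """One PlanetLab machine, tracking how much CPU is still unallocated."""
    def __init__(self, hostname, cpu_share=1.0):
        self.hostname = hostname
        self.cpu_share = cpu_share  # fraction of the node still free

class Slice:
    """A networkwide virtual environment: one small allocation per node."""
    def __init__(self, name):
        self.name = name
        self.allocations = {}  # hostname -> CPU fraction reserved

    def instantiate(self, nodes, cpu_fraction):
        # Reserve a small share of every node that still has capacity,
        # spreading the slice across the whole testbed.
        for node in nodes:
            if node.cpu_share >= cpu_fraction:
                node.cpu_share -= cpu_fraction
                self.allocations[node.hostname] = cpu_fraction
        return self.allocations

nodes = [Node("planetlab1.example.edu"), Node("planetlab2.example.org")]
s = Slice("research-project")
print(s.instantiate(nodes, 0.1))
```

The point of the sketch is the resource model: many tiny reservations rather than one big machine, so each participant's node contributes only a sliver of the slice it hosts.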
At the other end of the scale, a major thrust in the area of fine-grained sensor networks has come from Intel's Berkeley Lab, one of a series of academic labs the company sponsors. The concept was to shrink sensor nodes to a tiny, centimeter-scale form factor while giving them wireless ad hoc networking capability.
Called Motes, the hardware has progressed from small printed-circuit-board prototypes two years ago to the recent "Spec" chip, with a total area of 5 square millimeters, developed at JLH Labs (Capistrano Beach, Calif.). Spec incorporates a CPU, data and program memory, and an RF transmitter and will cost around 25 cents, while using very little power, said Jason Hill, chief technical officer at JLH. Some applications, such as environmental monitoring, will require nodes that can operate for years without running out of juice.
To date, 300 companies have designed Motes into products at a level of more than 5,000 pieces, Hill said.
The Mote concept also involves a special operating system, called TinyOS, and a database system to organize the data collected by multiple sensors. The idea is that Motes can be distributed widely and will spontaneously establish a node-hopping network and organize the incoming data for a basestation.
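The node-hopping behavior can be sketched simply: once scattered, each Mote discovers how far it is from the base station and relays readings through whichever neighbor is one hop closer. The topology, node names and flood-based routing below are illustrative assumptions, not the actual TinyOS routing code.

```python
# Hedged sketch of how scattered Motes might self-organize a
# node-hopping network toward a base station.
from collections import deque

def build_hop_counts(links, base):
    """Flood outward from the base station; each Mote learns its hop distance."""
    hops = {base: 0}
    queue = deque([base])
    while queue:
        node = queue.popleft()
        for neighbor in links.get(node, []):
            if neighbor not in hops:
                hops[neighbor] = hops[node] + 1
                queue.append(neighbor)
    return hops

def next_hop(links, hops, node):
    """Forward each reading to the neighbor closest to the base."""
    return min(links[node], key=lambda n: hops.get(n, float("inf")))

# A toy chain of Motes: readings from m3 must relay through m2 and m1.
links = {
    "base": ["m1"],
    "m1": ["base", "m2"],
    "m2": ["m1", "m3"],
    "m3": ["m2"],
}
hops = build_hop_counts(links, "base")
print(hops["m3"])                    # → 3 (three hops from the base station)
print(next_hop(links, hops, "m3"))   # → m2 (readings relay via m2)
```

Because the hop counts are rebuilt by flooding, the same scheme lets the network heal itself when Motes are added or fail, which is the "spontaneous" quality the concept depends on.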
Lakshman Krishnamurthy from Intel's Network Applications Lab gave an account of where Intel itself thinks this technology will be applied. The company is sponsoring a number of research projects into possible uses.
One project that is available to the public is a network of visual and sound sensors on Great Duck Island, off the coast of Maine. The network allows monitoring of the island's bird populations in real time. The Web site's simple interface lets a researcher set up a monitoring scheme using the sensors. The time period for data collection, the type of data and the specific Motes used can be set via the Internet, and the data are then collected and presented to the user.
A power outage caused by Hurricane Isabel took down the system temporarily, a problem that might signal one of the vulnerabilities of the Mote concept when deployed in the environment. And of course, such networks are also vulnerable to malicious tampering by humans.
In terms of industrial uses, Krishnamurthy cited a semiconductor fab-monitoring application that could save a chip maker big bucks. "If you can predict when equipment is going to fail, you can save millions, maybe hundreds of millions, of dollars," he said. Vibration sensors are placed around a fab line to look for any variations that might signal a problem. The drawback of the current system is that a huge amount of data comes into a central computer and must be analyzed; it is easy to miss a critical signal showing where trouble is developing.
Intel is working with a number of customers to set up a system in which Motes equipped with vibration sensors could be distributed throughout the fab line. With the TinyOS and TinyDB database protocols, the wireless network could sift through data, sending the significant variations to a central monitor. By identifying and heading off problems at an earlier point in a possible breakdown, the system could make fabs more reliable, and therefore less costly to operate, the company believes.
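The in-network filtering idea can be sketched as each Mote reporting a vibration reading only when it deviates significantly from its own running baseline, so the central monitor sees anomalies rather than raw streams. The threshold, the smoothing factor and the moving-average scheme below are illustrative assumptions, not the actual TinyDB query machinery.

```python
# Sketch of in-network filtering: report only significant variations,
# as a Mote on the fab line might, instead of streaming raw samples.

def filter_readings(samples, alpha=0.2, threshold=0.5):
    """Return (index, value) pairs that deviate from a running baseline
    by more than the threshold; only these would be radioed onward."""
    baseline = samples[0]
    anomalies = []
    for t, value in enumerate(samples):
        if abs(value - baseline) > threshold:
            anomalies.append((t, value))
        # Exponentially weighted moving average tracks slow drift
        # without masking sudden spikes.
        baseline = (1 - alpha) * baseline + alpha * value
    return anomalies

readings = [1.0, 1.02, 0.98, 1.01, 2.5, 1.0, 0.99]
print(filter_readings(readings))  # → [(4, 2.5)]
```

Seven raw samples collapse to a single reported event, which is exactly the data reduction that lets a central monitor watch thousands of sensors without drowning.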
Even if successful, these new extensions to the Internet may not foment a revolution but simply take their place alongside many other developments that technologists would like to network. Linux provides a model for that scenario: it demonstrates both the definite success a new approach can achieve and the considerable staying power of entrenched operating systems.
Perhaps it is significant that the PlanetLab initiative uses Linux as its operating system. Potentially, the two might prove an unstoppable combination, sweeping away Microsoft's domination in operating systems and perhaps one day even rendering the PC, Macintosh and various workstations obsolete.
And there will be legal and social battles as well. The Internet has already amplified the abilities of information technology to invade personal privacy, and as Motes start showing up everywhere-from traffic intersections to grocery stores and hospitals-a new level of invasive information gathering will be available to interest groups ranging from corporate marketers to the Justice Department.