BERKELEY, Calif. A veteran computer science researcher described plans for defining a new model for parallel programming at an annual gathering at the University of California at Berkeley here Thursday (Feb. 21). In separate panel discussions other researchers sketched out issues they are pursuing that could lead to the era of ubiquitous computing.
David A. Patterson, professor of computer science at Berkeley, said the opportunity to define a parallel programming model for mainstream computing was "an opportunity that comes only once in your career."
Every startup that has tackled the problem over the last 40 years has failed, Patterson noted. Nevertheless, "The whole IT industry has bet its future on figuring out the problem of parallel programming, and I am still astonished about that," he said.
The challenge comes as microprocessor designers hit a "power wall" that makes it impossible to continue building faster, more complex CPUs. Instead, they have opted for multi-core chips that force software developers to harness parallel resources if their code is to keep running faster on future generations of hardware.
"No one knows how to design a 15 GHz processor, so the other option is to re-train all the software developers," Patterson said.
In pursuit of that ambitious goal, Patterson sketched out the working plan for a new Parallel Computing Lab he will direct at the university.
Sources said Intel and Microsoft chose Berkeley for a $10 million, five-year grant for work on the problem. Patterson confirmed some 25 universities competed for a grant from the companies, but said details about who won the grant will not be revealed until March 19.
The new lab will take a five-step approach to tackling the problem.
First, it has selected researchers in areas such as image recognition, voice recognition and personal health to define compelling parallel applications. Researchers will then study those apps to identify common patterns. They have already identified 13 basic computational patterns at the heart of most parallel programs.
A separate group will develop frameworks and a composition language to help programmers create and coordinate parallel programming modules. Other groups will define new operating system and hardware architectures to best fit a parallel model, as well as tools to help programmers optimize their applications.
One of those tools, a so-called auto-tuner, aims to use complex heuristic search techniques to find the best data structures for a given application on a specific processor. It would replace a traditional compiler.
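In spirit, the approach resembles existing auto-tuned numerical libraries such as ATLAS and FFTW: generate several candidate implementations of a kernel, measure each on the target machine, and keep the winner rather than trusting a compiler's static cost model. A minimal sketch of that idea in Python (the blocked-sum kernel and the block sizes are illustrative choices, not details of the Berkeley lab's actual tool):

```python
import time

def make_blocked_sum(block):
    # Build a summation kernel that walks the data in chunks of `block`
    # elements; the best chunk size depends on the machine's cache behavior.
    def kernel(data):
        total = 0
        for i in range(0, len(data), block):
            total += sum(data[i:i + block])
        return total
    return kernel

def timed_run(kernel, sample):
    # One wall-clock measurement of the kernel on representative input.
    start = time.perf_counter()
    kernel(sample)
    return time.perf_counter() - start

def autotune(candidates, sample, trials=3):
    # Search step: time every candidate on this machine and keep the one
    # with the best observed run, taking the minimum over a few trials
    # to damp out timing noise.
    best_kernel, best_time = None, float("inf")
    for kernel in candidates:
        elapsed = min(timed_run(kernel, sample) for _ in range(trials))
        if elapsed < best_time:
            best_kernel, best_time = kernel, elapsed
    return best_kernel

data = list(range(100_000))
candidates = [make_blocked_sum(b) for b in (64, 256, 1024, 4096)]
fastest = autotune(candidates, data)  # which block size wins varies by machine
```

A real auto-tuner searches a far larger space, including data layouts and loop orderings, but the shape is the same: empirical measurement on the target processor replaces the compiler's guess.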
"The concept of an auto-tuner is one of our big bets," said Patterson, who helped develop one of the first RISC processors earlier in his career.
Patterson said the Parallel Computing Lab will involve about a dozen faculty members and 40 graduate students. He is expected to lay out initial milestones for the work in about a month.
The stakes are high. If researchers cannot define efficient parallel techniques "programming will be so difficult that people will not get any benefit from new hardware," and if that is the case "we will go from being a growth industry to being a replacement industry," Patterson said.
In separate panel discussions at the Berkeley event, several researchers said they are focused on the transition to an era of ubiquitous computing.
"We need more automated systems to track things and tell us when something goes wrong," said Ken Goldberg, a professor of industrial engineering at Berkeley, describing work with Bayer on a system that tracks pharmaceuticals from production through use.
"The IT industry we know is focused on one-sixth of the population, but there is a tremendous untapped opportunity in the sheer volume of the people who could be involved," said Eric Brewer, a Berkeley computer science professor and founder of Inktomi who described several projects he is working on in developing countries.
"The cellphone is the appropriate platform for those countries but it is not yet being used as a computer," said Brewer who talked about health and sanitation projects using sensor networks and long-distance Wi-Fi links in remote areas.
Energy and environmental issues are also driving work in ubiquitous computing, said S. Shankar Sastry, dean of engineering at Berkeley.
"We need to think about a building OS that handles all the heating and cooling systems and controls elevators," he said, describing work that could make these large energy consumers into generators. "We need to create buildings that not only consume zero net energy but have zero net cost," he added.
Brewer said an overhaul of electric utilities is needed to support a shift from alternating current to more efficient direct current distribution.
"One reason we are not doing this is there are not enough power electronics engineers in the U.S. anymore, so something like this will probably come out of China or India," he said.