SAN JOSE, Calif. The vision of cloud computing is simple. Just as people reach out to the Internet today to search for data, in the future they will tap into the Net's servers to run their apps.
Some skewer cloud computing as marketing hype. Others go so far as to predict that in the next century most companies will ditch their in-house computers and tech staff, just as they stopped paying for systems and specialists to generate their own electricity in the last century.
Making the vision a reality will require years of work, much of it in creating new approaches to software, said panelists at an annual research event at the headquarters of Internet auctioneer eBay Inc. here. The new tools needed range from parallel programming primitives to authentication standards and application programming interfaces.
"The software side of this effort is nothing like the metaphor of the electrical business," said David P. Young, chief executive and founder of Joyent (Sausalito, Calif.), a startup offering cloud computing services. "Software scaling is poor, and this is a problem that needs to be solved," he added.
Young cited progress in some areas. He said Joyent will support the Eucalyptus project at the University of California at Santa Barbara, an open source version of the EC2 API used by Amazon.com for its cloud computing services.
"I think Amazon has won, and its EC2 will become the x86 chipset of cloud computing," Young said.
The next big step will be to create similar standards for security and metadata services that will let computers authenticate and describe jobs to systems used by competing vendors. "I am sure everyone is working on it, but not in an open way," he said.
Panelists agreed that finding ways to tap into the increasingly parallel processing resources inside Internet data centers will be one of the keys to cloud computing. That's still a long way off, according to one audience member.
"We're still writing serial programs because parallel primitives are lousy. We need new programming models and primitives," he told the panel.
"There are projects at HP, Yahoo and elsewhere addressing that," said Prith Banerjee, vice president of research at Hewlett-Packard Labs and a member of the panel.
Scatter/gather computing techniques such as Google's MapReduce and the open source Hadoop "are steps in that direction but they can be applied to a limited set of applications. We are working on other approaches," Banerjee said.
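Banerjee's point about limited applicability is easier to see with the pattern itself spelled out. The following is a toy, single-process sketch of the scatter/gather (map-shuffle-reduce) model in plain Python, not the actual API of Google's MapReduce or Hadoop; it works only for jobs, like word counting, that decompose cleanly into independent key/value operations.

```python
from collections import defaultdict

def map_phase(documents):
    """Scatter: emit a (word, 1) pair for every word in every document."""
    for doc in documents:
        for word in doc.split():
            yield word, 1

def shuffle(pairs):
    """Group intermediate pairs by key, as the framework does between phases."""
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups

def reduce_phase(groups):
    """Gather: combine each key's values into a single result (here, a sum)."""
    return {key: sum(values) for key, values in groups.items()}

docs = ["the cloud", "the cloud scales", "scatter gather"]
counts = reduce_phase(shuffle(map_phase(docs)))
print(counts["cloud"])  # 2
```

Because each map and reduce call touches only its own keys, a real framework can run thousands of them in parallel across a data center; applications whose steps depend on one another's intermediate state do not fit this mold, which is the limitation Banerjee alludes to.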
The HP executive outlined a dozen cloud computing projects now in progress at HP Labs. "Once you assume people will move away from building their own infrastructure, there are a whole set of game-changing technologies you can work on," he said.
"The HPs and IBMs of the world would love to maintain the status quo and keep selling systems, but we see the disruption coming, and we know if we don't embrace it a bunch of startups will eat our lunch," Banerjee added.
Luke Hughes, director of research at Accenture, said the greater availability of parallel hardware and the rise of many data-parallel applications will spark next-generation efforts in parallel tools. The consulting firm is focused on creating tools to help migrate existing systems to cloud computing, he added.
"There is still a lot of work to be done," said Anne Hardy, vice president of platforms at SAP AG. "The winners will understand how to leverage many-core architectures," she added.
Neel Sundaresan, director of the 30-person eBay Research Labs and host of the event, talked about his R&D priorities and his thoughts on cloud computing in an interview before the panel session.