Graphics are crucial for both next-generation supercomputers and smartphones, said Dally, taking a swipe at his newest competitor, Intel’s Xeon Phi. “The real challenges for the next five to ten years [for supercomputers] are equally divided between energy efficiency and programmability,” Dally said.
To get to tomorrow’s exascale systems, chips need to slim down from about 100 picojoules per flop today to about 20, and programming must scale from millions of nodes to billions, he said. Nvidia’s graphics processors are now used in about 50 of the world's top supers, thanks in part to the maturity of Cuda.
Intel is quickly starting to get design wins of its own in supercomputers for its Xeon Phi, a co-processor made up of an array of x86 cores. Dally said the chip lacks “a day job” as a viable graphics processor to provide the volumes needed to support its road map. He also criticized Phi as lacking the energy efficiency of Nvidia’s graphics cores and being based on a Pentium-era x86 core.
“I would worry about the long-term viability of Phi if I was a supercomputer designer,” he said.
In terms of investments, “China has a road map to get to exascale before anyone else and it’s putting piles of money on it,” Dally said. “Europe’s exascale program has not been scaled back” despite its economic woes, but funding for the U.S. initiative “is getting pushed back,” he said.
In handsets, graphics are now being used for computational photography, a laundry list of techniques for making consumer snapshots look good. Nvidia and its competitors are rolling out chips that support high dynamic range, compensating for the poor lighting conditions and blur that dog amateur shutterbugs.
“The ultimate goal is to make the average person an expert photographer,” he said. “We have a pile of stuff we are working on. Computer vision in general is a major area of focus, with a tremendous number of apps for cameras in and outside the car, for example,” he added.