SAN FRANCISCO – The next version of Android will enable using graphics cores for computational photography and infrared links for TV remote controls. Meanwhile, mobile developers need better tools to harness the graphics cores coming to market in 2014 and beyond, an Android developer said.
“Only this year have mobile GPUs gotten powerful enough to do something beyond render a screen, so you can do computation with them using [Google’s] Renderscript,” said Dave Burke, engineering director of the Android team, in a talk at Google I/O. “The [smartphone] camera can evolve—there’s so much more you can do in hardware and software, too,” he said.
Enabling computational photography on graphics cores is one of the priorities for Google’s Android team. Demonstrations of automated photo enhancements on the Google+ social networking site were based on filters running on GPUs, said a Google developer.
Chip makers such as Nvidia and Qualcomm are hard at work delivering computational photography features on their graphics cores. Much of the work is initially focused on using GPUs to enhance the often poor quality of mobile images, such as balancing over- or under-exposed pictures.
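Exposure balancing of the kind described can be as simple as scaling pixel values toward a target mean luminance. A minimal sketch of the idea, in plain Java; the class and method names are hypothetical and not part of any Android or vendor API:

```java
// Minimal exposure-balancing sketch: scale an 8-bit grayscale image
// so its mean luminance approaches a target value. Hypothetical
// illustration only -- real pipelines use tone curves, not flat gain.
public class ExposureBalance {
    // Returns a gain-corrected copy of the input pixels (0..255).
    static int[] balance(int[] pixels, double targetMean) {
        double mean = 0;
        for (int p : pixels) mean += p;
        mean /= pixels.length;
        double gain = targetMean / Math.max(mean, 1.0); // avoid divide-by-zero
        int[] out = new int[pixels.length];
        for (int i = 0; i < pixels.length; i++) {
            out[i] = (int) Math.min(255, Math.round(pixels[i] * gain)); // clamp
        }
        return out;
    }

    public static void main(String[] args) {
        int[] dark = {10, 20, 30, 40};        // under-exposed frame, mean 25
        int[] fixed = balance(dark, 100.0);   // gain = 4.0
        System.out.println(java.util.Arrays.toString(fixed)); // [40, 80, 120, 160]
    }
}
```

Because the per-pixel work is independent, a loop like this is exactly the sort of operation a GPU can parallelize.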
Google will pave a software path for such efforts in the next iteration of its mobile camera API. Camera 3.0 will expect smartphones and tablets to use a burst mode, shooting several images in quick succession. Graphics cores will then apply computational photography techniques to deliver the best images.
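One way to picture the burst-then-select approach is scoring each frame and keeping the best one. The sketch below approximates sharpness with the variance of adjacent-pixel differences, a crude stand-in for the richer quality metrics a real pipeline would use; all names here are hypothetical:

```java
// Burst-mode sketch: shoot N frames, keep the "best" one.
// Sharpness is approximated by the mean squared difference of
// neighboring pixels; real pipelines use far richer metrics.
public class BurstSelect {
    // Crude sharpness score: mean squared difference of neighbors.
    static double sharpness(int[] frame) {
        double sum = 0;
        for (int i = 1; i < frame.length; i++) {
            double d = frame[i] - frame[i - 1];
            sum += d * d;
        }
        return sum / (frame.length - 1);
    }

    // Returns the index of the sharpest frame in the burst.
    static int bestFrame(int[][] burst) {
        int best = 0;
        for (int i = 1; i < burst.length; i++) {
            if (sharpness(burst[i]) > sharpness(burst[best])) best = i;
        }
        return best;
    }

    public static void main(String[] args) {
        int[][] burst = {
            {100, 100, 100, 100},  // flat (blurred) frame
            {0, 255, 0, 255},      // high-contrast (sharp) frame
        };
        System.out.println(bestFrame(burst)); // 1
    }
}
```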
Google’s Camera 3.0 spec will be released later this year. It will also provide support for 3-D depth data, said a Google Android developer.
A number of impressive mobile graphics cores are coming to market over the next year, the Google Android developer said, though he expressed concern about the immaturity of the tools for harnessing them. He described current tools for handling threads and asynchronous tasks as “a glorified thread pool.”
Renderscript remains Google’s tool of choice for general-purpose parallel processing on graphics cores. The Android team used it to target the ARM Mali T604 GPU in the Nexus 10. By contrast, OpenCL lacks platform independence, with performance varying greatly across chip sets, the developer said.
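The workloads Renderscript targets are per-pixel mapping kernels: a pure function applied independently to every element of an allocation, which the runtime can then schedule across whatever CPU or GPU cores the device provides. The sketch below shows such a kernel body in plain Java for illustration; in Renderscript it would be written as a foreach kernel in its C99-derived language, and the helper names here are hypothetical:

```java
// The kind of per-pixel mapping kernel Renderscript parallelizes:
// a pure function applied independently to every pixel. Shown in
// plain Java for illustration; Renderscript would compile the body
// for whichever CPU/GPU cores the device provides.
public class GrayKernel {
    // Integer-weighted luma from a packed 0xRRGGBB pixel.
    static int toGray(int rgb) {
        int r = (rgb >> 16) & 0xFF, g = (rgb >> 8) & 0xFF, b = rgb & 0xFF;
        return (r * 77 + g * 150 + b * 29) >> 8; // weights sum to 256
    }

    // Applies the kernel to every pixel; each iteration is independent,
    // so the runtime is free to execute them in parallel.
    static int[] map(int[] pixels) {
        int[] out = new int[pixels.length];
        for (int i = 0; i < pixels.length; i++) out[i] = toGray(pixels[i]);
        return out;
    }

    public static void main(String[] args) {
        System.out.println(toGray(0xFFFFFF)); // white -> 255
        System.out.println(toGray(0x000000)); // black -> 0
    }
}
```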