A Qualcomm graphics expert gave insight into work on mobile virtual reality in Android that is about to emerge from the lab.
Hitting the 20 millisecond latency target
SAN JOSE, Calif.—Google’s Daydream VR is very much a reality today for semiconductor managers such as Tim Leland. The head of visual processing at Qualcomm is one of many who have been working with the search giant for some time to bring to all next-generation Android phones an upgraded version of the mobile virtual reality Samsung pioneered with its GearVR.
Leland’s team helped develop an Android framework for optimizing single-buffer rendering. The graphics cores in its latest Snapdragon 820 SoC were tuned to deliver fine-grained pre-emption to reduce motion-to-photon latency, a key metric for making sure displays change as fast as a user’s head moves.
“It took a lot of effort” from deep in the SoC through Android to the application to hit the 20 millisecond target, Leland said.
Snapdragon chips needed “to change the way they handshake with sensors” to reduce latency. The sensors themselves need to support fast sampling at rates from 100 Hz to a kilohertz.
Qualcomm developed an algorithm it calls visual inertial odometry to track head motion across six degrees of freedom. Running on Snapdragon’s embedded Hexagon DSPs, it correlates data from a handset’s accelerometer, gyroscope, magnetometer and cameras.
Developers will be able to access the Qualcomm technique in an SDK the company will release soon. Google also plans to handle sensor fusion tasks in Android N, presumably for handset makers using SoCs that don’t sport their own sensor fusion capabilities.
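Qualcomm has not published the internals of its visual inertial odometry pipeline. As a rough illustration of the kind of sensor fusion involved, the sketch below uses a minimal complementary filter—a much simpler stand-in, not Qualcomm’s algorithm—to blend fast-but-drifting gyroscope integration with noisy-but-drift-free accelerometer tilt estimates. The function names, rates and blend factor are all illustrative assumptions.

```python
import math

def complementary_filter(pitch, gyro_rate, accel_x, accel_z, dt, alpha=0.98):
    """Fuse one gyro + accelerometer sample into a pitch estimate (radians).

    pitch     : previous pitch estimate
    gyro_rate : angular velocity about the pitch axis (rad/s)
    accel_*   : accelerometer components along two body axes
    alpha     : trust placed in the gyro; (1 - alpha) corrects its drift
    """
    gyro_pitch = pitch + gyro_rate * dt          # fast, but drifts over time
    accel_pitch = math.atan2(accel_x, accel_z)   # noisy, but drift-free
    return alpha * gyro_pitch + (1 - alpha) * accel_pitch

# Simulate one second of samples at 1 kHz with the head held at a 0.1 rad tilt:
# the estimate converges to the accelerometer's gravity-derived angle.
pitch = 0.0
for _ in range(1000):
    pitch = complementary_filter(pitch, gyro_rate=0.0,
                                 accel_x=math.sin(0.1), accel_z=math.cos(0.1),
                                 dt=0.001)
print(round(pitch, 3))  # -> 0.1
```

A production tracker fuses far more state (position, velocity, camera features), but the same trade—fast sensors for responsiveness, slow sensors for stability—is what motivates the fast sampling rates Leland describes.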
In a white paper, Qualcomm claims its Snapdragon 820 has less than 18 ms motion-to-photon latency. "To put this challenge in perspective," it says, "a display running at 60 Hz is updated every 17 ms, and a display running at 90 Hz is updated every 11 ms."
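The arithmetic behind those figures is simple—the refresh period is the inverse of the refresh rate—but it shows how much of a 20 ms motion-to-photon budget the display scan-out alone consumes:

```python
def refresh_period_ms(hz):
    """Display refresh period in milliseconds for a given refresh rate."""
    return 1000.0 / hz

for hz in (60, 90):
    print(f"{hz} Hz -> {refresh_period_ms(hz):.1f} ms per frame")
# 60 Hz -> 16.7 ms per frame
# 90 Hz -> 11.1 ms per frame
```

Whatever remains of the budget after scan-out must cover sensor sampling, fusion, rendering and any compositor overhead.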
Most Daydream headsets will be passive smartphone containers like Samsung's GearVR. (Images: Google)
Handsets typically will need AMOLED displays. They support faster switching times than conventional LCDs, which can show ghosted images.
Graphics cores will use a host of tricks to render images to give users a fluid sense of motion while minimizing battery drain. For example, devices will reuse macroblocks as often as possible to reduce the need to re-render images.
A simple technique is to render images in the center of a display first, assuming this is where the user is focused. A more advanced approach will use the smartphone’s camera to track eye movement to determine what portions of an image to render first.
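Neither Google nor Qualcomm has detailed its scheduling, but the center-first idea can be sketched as a priority ordering of screen tiles by distance from an assumed gaze point—with eye tracking simply supplying a measured gaze point instead. Everything below is illustrative.

```python
def render_order(cols, rows, gaze):
    """Return tile coordinates sorted nearest-first to the gaze point.

    cols, rows : dimensions of the tile grid
    gaze       : (col, row) the user is assumed to be looking at
    """
    tiles = [(c, r) for r in range(rows) for c in range(cols)]
    # Squared Euclidean distance avoids a sqrt and preserves the ordering.
    return sorted(tiles,
                  key=lambda t: (t[0] - gaze[0]) ** 2 + (t[1] - gaze[1]) ** 2)

# Without eye tracking, assume the gaze sits at the center of a 4x4 grid:
print(render_order(4, 4, gaze=(1.5, 1.5))[:4])  # the four central tiles first
# With eye tracking, feed in the measured point instead:
print(render_order(4, 4, gaze=(0, 0))[:1])  # top-left tile rendered first
```

A real renderer would also drop detail in the low-priority tiles rather than merely deferring them, which is where the power savings come from.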
Compression of graphics data has become a focus for reducing power while increasing processing speed. Another trick involves changing the tones in an image to make it appear brighter without needing to crank up the power-hungry backlight.
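The tone trick amounts to remapping pixel values so mid-tones come out brighter at the same backlight level. A gamma curve with an exponent below one is a common way to do this; the sketch below is illustrative, not the vendors’ actual curve.

```python
def brighten(value, gamma=0.7):
    """Lift mid-tones with a gamma curve (an exponent < 1 brightens).

    value : pixel intensity in [0, 255]
    """
    return round(255 * (value / 255) ** gamma)

# Mid-gray gets noticeably brighter, while black and white are unchanged,
# so the extremes of the tonal range are preserved.
print(brighten(0), brighten(128), brighten(255))  # -> 0 157 255
```

Because the remap runs in the display pipeline, the perceived brightness rises without touching the backlight, which is one of the largest power consumers in a phone.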
It’s a challenge given today’s phones get hot to the touch just tracking driving directions in a car. Phones this fall will be asked to do more while riding next to the user’s face. The hope is VR will inject excitement in a premium handset market that has slowed.
Expectations for first and future products
Most vendors are expected to ship Daydream products similar to Samsung’s GearVR, passive headsets powered by a user’s smartphone.
“That’s the typical head device for Daydream, but there also will be purpose-built mobile VR headsets from a variety of partners, and we also have partners working on augmented reality products, but AR comes later,” said Leland.
Google’s VR chief Clay Bavor noted his team shares office space with Google’s Project Tango, its longstanding AR effort.
The dedicated mobile VR headsets may span a range including premium products with more expensive displays, more storage and advanced spatial audio. The headsets are expected to have a wider range of prices, too.
The last piece of the hardware is a two-button controller with an embedded trackpad, similar to the controller on the original Nintendo Wii. An app button is for developers, a home button is reserved for use by Android, and overall accuracy is as sharp as a laser pointer, said Google’s Bavor, a former game developer familiar with the power of interactivity.
Daydream's home screen is sparse on graphics, focusing attention on titles.
The products shipping this fall are just a beginning.
“There will be continual drive for lower motion-to-photon latency—that’s something you will hear for the next several years because 20 milliseconds latency is good, but 15 is better and less than 10 is even better—it makes VR look more real, you don’t see the scene trying to catch up with you,” said Leland.
The requirements “will affect how sensor processing is done, how camera subsystems are developed and how graphics rendering works with display interfaces—it ripples through all the processors,” he said.
Along the way, Qualcomm is looking for new SoC blocks that make enough of a difference that OEMs will pay extra for them. But for now, Daydream is well on the way to the retail shelf in time for the holidays.
For its part, Google has primed all the content pumps for Daydream. It is making Street View in Google Maps and YouTube ready for viewing 360-degree pictures and videos, including support in YouTube for spatial audio and motion-intensity ratings.
The search giant is also reaching out to Hollywood and media. It is developing 360-degree 16-camera devices with Imax and others, and has VR partnerships with Paramount, Discovery and The New York Times.
A Daydream developers’ kit is available that gives C++ programmers access to underlying APIs and hardware. Google’s own Daydream Labs at its headquarters has been trying to rapidly prototype two experimental VR applications a day and will start sharing what it’s learning by posting blogs.
“VR on Android will be a surprisingly good experience for those who think you need a PC tethered to a headset,” Leland said.
Users will have to wait a few months to see if Apple agrees and has its own plans for the iPhone 7. Or perhaps it sees an alternate reality.
— Rick Merritt, Silicon Valley Bureau Chief, EE Times