WASHINGTON – Chip makers need to advance image recognition, display and MEMS technologies to deliver the next generation of immersive gaming applications built around interactivity and augmented reality, the CTO of PlayStation developer Sony Computer Entertainment told engineers here on Tuesday (Dec. 6).
“User interface experiences beyond the five senses are needed,” Masaaki Tsuruta, who also serves as the Sony unit’s executive vice president for technology platforms, told the International Electron Devices Meeting.
Delivering output from "super high-resolution displays" rendering 3-D graphics and other immersive features will also require wider-bandwidth interconnects. "The next bottleneck [for gaming] is bandwidth," Tsuruta predicted. Data rates in the range of 10 Gbits/second will be needed soon, he added.
The Sony executive said he expects chip makers to make greater use of design approaches like through-silicon vias and through-chip interfaces to speed connections among image and motion sensors that are becoming increasingly standard features of gaming consoles and augmented reality devices.
As consumer electronics companies like Sony seek more realistic gaming, they are also moving to next-generation motion sensors with 10 or more axes. The latest addition to what has become the standard 9-axis sensor is a pressure sensor. The next step will be tighter integration of these motion sensors, Tsuruta said.
Interactive game makers are also stressing concepts like "sensor fusion," once associated with advanced simulators and military applications. The point of this development work is to "get closer to human motion and perception," Tsuruta explained. Augmented reality applications are a "very effective example of sensor fusion" in gaming, he added.
Emerging multifunction devices will require sensor fusion to merge inputs from motion sensors, which could eventually incorporate additional capabilities such as magnetometers, Tsuruta said.
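Tsuruta did not detail Sony's fusion algorithms, but the basic idea behind sensor fusion can be illustrated with a minimal sketch: a complementary filter that blends a gyroscope (fast but drift-prone) with an accelerometer (drift-free but noisy) into a single tilt estimate. The function name, weights and sample values below are illustrative assumptions, not anything from the keynote.

```python
import math

def complementary_filter(pitch, gyro_rate, accel_x, accel_z, dt, alpha=0.98):
    """Fuse gyro and accelerometer readings into one pitch estimate (radians).

    The gyro term integrates angular rate, tracking fast motion but drifting
    over time; the accelerometer term derives a drift-free angle from gravity
    but is noisy. Blending the two (weight alpha is an assumed tuning value)
    yields an estimate that is both responsive and stable.
    """
    gyro_pitch = pitch + gyro_rate * dt          # integrate angular rate
    accel_pitch = math.atan2(accel_x, accel_z)   # gravity-derived angle
    return alpha * gyro_pitch + (1 - alpha) * accel_pitch

# Example: a stationary device with gravity along z and a small gyro bias.
# The accelerometer correction keeps the drift bounded.
pitch = 0.0
for _ in range(100):
    pitch = complementary_filter(pitch, gyro_rate=0.01, accel_x=0.0,
                                 accel_z=9.81, dt=0.01)
```

With pure gyro integration the bias would accumulate without bound; the accelerometer term pulls the estimate back toward the gravity-derived angle each step, which is the essential payoff of fusing the two sensors.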
With the explosion of touch panels and mobile devices, Sony developers foresee future user interfaces that leverage image sensor technologies that can deliver higher frame rates (above 300 frames per second) and higher dynamic range. The result will be improved graphics that “capture the real world and render the virtual world,” Tsuruta said during an IEDM keynote address.
The consumer electronics giant is also aiming for the "ultimate immersive system," one that integrates features such as eye-tracking sensors and haptic interfaces that go beyond the standard vibration devices on current game controllers to provide tactile feedback. Sony also wants to use flexible displays to enhance immersion. Eventually, these systems could be linked via a mobile cloud client, Tsuruta forecast.