Long, thin boards combining a 3D sensing device, a dedicated processor, and multiple cameras were embedded in tablets and laptops to map and monitor touch. The RealSense sensor produces a depth map and can track 78 points on the face and 22 points on the hand.
RealSense's facial and hand tracking.
The points could be used for more accurate body modeling.
3D body mapping using RealSense embedded in a tablet. A scan could be 3D printed for your very own action figures.
They also could be used for improved motion control in gaming.
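To make the idea concrete, here is a minimal sketch (plain Python, not the RealSense SDK) of how a depth map and tracked landmark points combine: the depth map is a 2D grid of per-pixel distances, and each tracked point is a pixel coordinate whose depth lookup places it in 3D. The landmark names and values below are hypothetical.

```python
# Illustrative sketch only -- not the RealSense API. A depth map is a 2D
# grid of per-pixel distances; tracked landmarks are (x, y) pixel
# coordinates whose depth lookup gives their distance from the sensor.

def landmark_depths(depth_map, landmarks):
    """Return the sensed distance (in metres) at each tracked landmark."""
    return {name: depth_map[y][x] for name, (x, y) in landmarks.items()}

# Toy 4x4 depth map, distances in metres (hypothetical values).
depth_map = [
    [1.20, 1.21, 1.22, 1.23],
    [1.10, 0.95, 0.96, 1.11],
    [1.12, 0.97, 0.98, 1.13],
    [1.25, 1.26, 1.27, 1.28],
]

# Two hypothetical landmarks, standing in for the 78 facial points.
landmarks = {"nose_tip": (1, 1), "chin": (2, 2)}
print(landmark_depths(depth_map, landmarks))
```

A real pipeline would refresh the depth map every frame and re-run the tracker, which is what enables the body modeling and motion-control uses described above.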
Using the same technology, Intel was able to project images from a computer into a 3D touch-capable display. Officials said such technology could be used in kiosks and immersive games.
I like several of the devices and think that they will be very important, but I don't really see Intel as being the prime driver on them. It is like Microsoft claiming credit for documents written using MS Word. I can certainly acknowledge Intel for pushing overall computing capability as an enabler, but this seems like trying to hog the limelight with a list of sexy applications.
@Jessica: Yes, I would say Intel is on the right track. Of the list you posted, I am a fan of self-driving cars and of connected vehicles. Intel may be on to something here and I would imagine it is actively working with automobile companies to get its next "Intel Inside" logo on cars!
David Patterson, known for his pioneering research that led to RAID, clusters and more, is part of a team at UC Berkeley that recently made its RISC-V processor architecture an open source hardware offering. We talk with Patterson and one of his colleagues behind the effort about the opportunities they see, what new kinds of designs they hope to enable and what it means for today’s commercial processor giants such as Intel, ARM and Imagination Technologies.