LONDON--Augmented Reality (AR) is the process of layering computer-generated graphics on top of real-world environments, viewed either directly or indirectly through a tablet or cell phone. The technology augments a live view of a physical environment with virtual imagery; in other words, it's what the world would look like with the Internet layered on top of it.
AR provides an extra layer of viewable, clickable and searchable data on top of video or still images, delivered through the camera lens and screen of a consumer handheld device, effectively blurring the line between the real and the imagined.
Meant to create an immersive experience, the technology can layer on elements such as graphics, sound, haptic feedback and, in some cases, even smell. As a result, it has found applications across a wide range of industries, from the military to tourism, gaming and education.
Being able to superimpose graphics, audio and other sensory data over real-life scenes has enabled developers to come up with all kinds of applications, from layered maps to point-and-read restaurant reviews, historical trivia and much more besides.
At MWC 2012, EE Times tracked down some of the leaders in the AR space to ask them what they saw in Augmented Reality and how they expected the industry to progress.
Check out the video below for more:
And here's another video we shot at MWC on Aurasma's use of the technology: