Are you aware of the work being performed by Jack Gallant, a neuroscientist at U.C. Berkeley? Basically he gets a subject to watch a video, and he reconstructs that video by scanning the brain of the person watching the video (while the person is watching it, of course). As you can see from this video, this technology is still in its early days, but -- even so -- it's significantly more advanced than I think most of us would have suspected.
Do you remember the holodeck on Star Trek? The idea was that users could walk into a special room in which they could have a totally immersive 3D virtual reality experience.
Surely we are going to have to wait a long, long time before we can experience anything like this. Well, maybe not. As you may recall, I recently pledged to the Kickstarter campaign for the forthcoming Obduction immersive reality game. (Click here to see my blog.)
Now, imagine taking a stroll around the virtual Obduction world shown above while wearing an Oculus Rift virtual reality headset. This little beauty is set to change the gaming industry by providing a truly immersive 3D experience. As you move your head, the scene changes to reflect this movement without any discernible latency or artifacts.
When I say "taking a stroll," I'm not simply talking about pressing forward, back, left, and right buttons on a joystick. No! I'm talking about something like the Omni Treadmill that allows you to actually walk or run around the virtual world of your choice.
How about combining the Omni with the Oculus Rift? Now, I'm not really one for shoot 'em up-type games, but take a look at this video and tell me what you think.
I can really see myself using this technology to wander around the Obduction world. But wait, there's more. First we have gesture recognition and control, which is starting to come online. Existing systems are primarily targeted at controlling your desktop computer. Future incarnations will be able to do so much more, including recognizing posture, gait, emotions (from facial expressions and other body-language cues), and so forth.
There's yet more. A startup company called Ultrahaptics aims to bring the sense of touch to touchless interfaces and -- by extension -- virtual worlds. Based on research performed at the University of Bristol in the UK, this technology uses ultrasound to project sensations through the air and directly onto the user's hands. The end result, as seen in this video, is that the user can feel touchless buttons, obtain feedback from mid-air gestures, and interact with virtual objects in virtual worlds.
Some of these technologies are still in their formative stages. Others are more advanced. What we've discussed here may well be just the tip of the iceberg. I honestly believe that most of us have no idea just how much the world will change in the coming years. Hold onto your hats, because it's going to be an exciting ride!
— Max Maxfield, Editor of All Things Fun & Interesting