As I understand it, Stephen Hawking uses the limited movement of his cheek, detected by an infrared sensor or similar device, which is then translated by a voice synthesizer. This is not a long-term solution, as Hawking's ability to control his facial movement continues to decrease. It would be great if he could use this eye-tracking technology.
I feel that, more than gaming, this technology is going to help many people suffering from similar kinds of disabilities.
I just watched the video of him playing a Mario Brothers game. How good will this be? Will it add strain to the neck and eyes? Will we get more pleasure from using this device? Many questions like these are storming my mind, and time will tell.
Here the electrical polarity of the eye, which acts as a dipole, is measured to locate the position of the eyeball. One probably needs to keep the head position constant for this application to work correctly.
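The dipole measurement described above is the basis of electrooculography (EOG). As a rough illustration only, the mapping from electrode voltage to gaze angle might be sketched like this; the sensitivity and offset constants here are hypothetical calibration values, not taken from any specific device:

```python
# Illustrative sketch of EOG-style gaze estimation. The eye acts as a
# dipole (cornea positive, retina negative), so electrodes placed beside
# the eyes pick up a voltage roughly proportional to horizontal gaze angle.
# All constants below are assumed calibration values for illustration.

def gaze_angle_degrees(eog_microvolts, sensitivity_uv_per_deg=20.0, offset_uv=0.0):
    """Map a horizontal EOG reading (microvolts) to an approximate gaze angle.

    sensitivity_uv_per_deg is an assumed electrode sensitivity; real
    systems require per-user calibration, and head movement shifts the
    baseline, which is why holding the head steady matters.
    """
    return (eog_microvolts - offset_uv) / sensitivity_uv_per_deg

# Example: with the assumed 20 uV/degree sensitivity, a +200 uV reading
# corresponds to the eye rotated about 10 degrees toward one electrode.
angle = gaze_angle_degrees(200.0)
print(f"Estimated gaze angle: {angle:.1f} degrees")
```

This also shows why a drifting offset (from head motion or electrode contact) would corrupt the position estimate unless it is recalibrated.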
It would be good to know how this can help the disabled improve their lives. The future is telepathy, and very soon our minds will be read digitally. Homo sapiens will evolve into bionics.
Possibly it was a "Liberator". A friend of mine has locked-in syndrome (see or read "The Diving Bell and the Butterfly") and has one of these devices. His mind is all there; he just could not talk, and now he can with this device. It's a slow process, but it's heaps better than having to use a "translator" who reels through the alphabet and picks out letters when he winks.
I think Stephen Hawking uses something similar as well.
You're right, a very noble application of technology, but these things are not cheap....
It has always amazed me how many of the advances we see today in electronics have been driven by gaming. Think of video cards and graphics as one good example. True spatialized 3D audio was showing up in games in the mid-'90s, well before surround-sound techniques showed up in consumer goods. As I remember, it was a Star Wars game.
And the list can get quite long before you can say gaming had nothing to do with it.
Some years ago, I became aware of a project in Palo Alto using eye and muscle control for the seriously physically disabled. Using equipment that would be laughed at today, patients were able to "type" on a projected computer keyboard and do other computer related tasks, using eye movements only.
They were also able to control MIDI music synthesizers with very slight muscle movements, which was used primarily as a biofeedback mechanism to help retrain muscle nerve paths and forge new ones. Some patients actually created simple compositions and were able to play them repeatedly, expanding their abilities as their muscle movements grew stronger.
I don't know how the project has progressed since, but it always struck me as a noble application of technology. I hope this "new" technology can be quickly applied to those whose lives would be improved if they had it...not just gamers. No disrespect to the gamers.
Thanks for this article. This is a very interesting topic and concept as it applies to gaming. I am very interested in how this will impact the gaming curriculum at institutions of higher education around the country. I hope there will be a follow-up to this article, R. Colin Johnson.
Game control just gets more interesting every day. Recently, Microsoft revealed its next-generation game control. Now we can control an avatar with our eyes. Will eyes, hands, and legs all be used to control avatars in the future?
The application area of eye control is definitely huge. Not only will it improve the gaming experience, but it can also serve as an alternative input method. I can't wait to see the first game that uses this technology.
It seems that, ideally, there should be a way to decouple control from eye movement, so you can respond to a "hey you" without causing Mario to jump into a hole... Perhaps the movement should only register when it occurs during a wink?
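The wink-gating idea above can be sketched in a few lines. This is purely hypothetical: the event fields and the `move_character` callback are assumptions for illustration, not any real eye-tracker API:

```python
# Hypothetical sketch of wink-gated eye control: gaze movements only drive
# the game character while exactly one eye is closed (a wink), so glancing
# away in response to "hey you" does nothing. Both eyes open means the
# player is just looking around; both closed is a blink and is ignored too.

def handle_eye_event(gaze_dx, left_eye_open, right_eye_open, move_character):
    """Forward horizontal gaze movement to the game only during a wink.

    Returns True if the movement was forwarded, False if it was ignored.
    """
    winking = left_eye_open != right_eye_open  # exactly one eye closed
    if winking:
        move_character(gaze_dx)
        return True
    return False

moves = []
handle_eye_event(5, True, True, moves.append)    # glance away: ignored
handle_eye_event(5, True, False, moves.append)   # wink: forwarded
handle_eye_event(5, False, False, moves.append)  # blink: ignored
print(moves)  # only the winked movement reaches the game
```

The same gating could use any deliberate secondary signal (a dwell time, a button, a facial gesture) in place of the wink.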