There's a lot of very interesting stuff going on with regard to direct human-machine interfaces -- I really don't think it will be long (in the scheme of things) before people will be able to control things (including capturing and displaying words) just by thinking about what they want to say...
There was a project at a company in Palo Alto a few years ago that dealt with disabled patients, using technology to improve their communication as well as their mental attitude. Sorry, I don't remember which company.
But one project was also an eye-tracker which worked with a large projected QWERTY keyboard image, and the eye tracker could note the character position to a very high degree of accuracy.
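The core of such an eye-tracking keyboard is just mapping a gaze fixation point to a key region on the projected layout. Here's a minimal sketch, assuming a fixed key size and per-row stagger offsets (all the dimensions and the helper name `key_at` are hypothetical, not from the actual project):

```python
# Hypothetical sketch of gaze-point-to-key mapping on a projected
# QWERTY layout. Key dimensions and row offsets are assumptions.
KEY_W, KEY_H = 100, 100          # projected key size in pixels (assumed)
ROWS = ["qwertyuiop", "asdfghjkl", "zxcvbnm"]
ROW_OFFSETS = [0, 50, 150]       # horizontal stagger of each row (assumed)

def key_at(x, y):
    """Return the character under gaze point (x, y), or None if off-keyboard."""
    row = int(y // KEY_H)
    if not 0 <= row < len(ROWS):
        return None
    col = int((x - ROW_OFFSETS[row]) // KEY_W)
    if not 0 <= col < len(ROWS[row]):
        return None
    return ROWS[row][col]
```

A real system would also require the gaze to dwell on a key for some hundreds of milliseconds before registering a keypress, to avoid typing every key the eye merely passes over.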
As I recall, a similar project allowed people to play music synthesizers using both eye movement and whatever physical movements they still had. An early precursor to the Xbox Kinect :-).
In all cases, the patient's mental attitude greatly improved as the positive reinforcement of the activities (and resultant achievements) kicked in.
A Kinect review at http://www.boxreview.org/ also notes the voice recognition feature, which is not well implemented in most games. At under $80 on resale sites, it's a cool device worth researching further.
With the Xbox plus Kinect add-on in the less-than-$300 price range, it seems like there is room for a design idea begging to be explored.
I wonder if there is a similar low-cost interface device that could track eye movement rather than limb motion.
Close to my heart as well Max. A friend of mine has "Locked-in Syndrome" due to a stroke - see
This happened to him on a remote island. When I first visited him after getting back to South Africa, his nurses would communicate with him using a technique very similar to the one above. Communication was extremely laborious and slow.
For some time now he's had a machine called a Liberator, which works on eye movements much like the Eyewriter you refer to. They are horrendously expensive (around $30K, as I recall), so any plans to reduce this cost and make such equipment more accessible to people with conditions like his would be good.
Thinking about Steve brings tears to my eyes. When he was first diagnosed with ALS, he built an electronic device that measured the speed of signals passing through the nerves in his arm.
Later he designed a hoist system on a track that his caregivers could use to lift him out of bed and transport him from the bedroom into the bathroom.
He also loved science fiction -- this was not one of his wife's interests, so I would borrow their special wheelchair-friendly van and he and I would go to the movies together once a week ... he was constantly coming up with new improvements for his wheelchair (grin)
NASA's Orion Flight Software Production Systems Manager Darrel G. Raines joins Planet Analog Editor Steve Taranovich and Embedded.com Editor Max Maxfield to talk about embedded flight software used in Orion Spacecraft, part of NASA's Mars mission. Live radio show and live chat. Get your questions ready.