It seems that fiction will be turned into reality very soon if this research proves fruitful. Cognitive computing will be a great advance, but it will be very difficult for debuggers to find the problems in such systems once they are designed.
As long as it is only measuring and monitoring brain activity, it is OK. But the next step could be to modify the behavior of the neurons, and that could lead to some disastrous implications and hideous tools in the hands of criminal (or crime-investigating) minds.
The researchers do, in fact, plan to stimulate neurons with their probe in order to fully characterize their electronic "transfer function," which will be valuable in understanding how diseases like Parkinson's cause the brain to malfunction. Of course, any tool can be misused, but these experiments will be conducted on animals. The only mention the researchers made regarding use on humans was using the probe during brain surgery to determine exactly which neurons are diseased, instead of guessing or using trial and error as surgeons are often forced to do today.
The probing and monitoring are no doubt useful for neurosurgery; just as the article said, they eliminate the trial and error of surgery. In the near future, an artificial limb directly connected to neurons will be achieved. Will there be a chance that doping can be done by firing signals at certain neurons to boost endurance and strength? Will hybrid humans, half robot and half human, be achieved in the foreseeable future? When humans totally understand the human brain, AI will take a big leap. Watch out! The Terminator is coming. ;)
One use you allude to, but don't mention directly, is rewiring neurons to control prosthetics. Many neural interfaces are being tried today, but there is too much trial and error involved. However, by using the probes to determine exactly which neurons are still functioning normally, it should be possible to implant electrodes that give amputees control over electronic prosthetics that completely restore their lost functionality. Yes, these humans would be "half robot," but much to the patient's benefit.
What these robotic tools will enable is the precise characterization of how each type of neuron works while it is performing specific tasks. The most important of these tasks is learning. There are thousands of different types of neurons and hundreds of neurotransmitter chemicals handing off messages among the neurons of the brain, yet until now it has been impossible to determine just which are doing what, and when. Because the patient is still alive while the robotic probe is in place, it will now be possible to see which neurons are involved in learning, in motor control, and in processing all the various sensory data streams coming in from the eyes, ears, nose, tongue and skin. The second important enabler here is the ability to monitor multiple neurons simultaneously, once and for all answering debates about which parts of the brain do what, when, essentially elucidating the architecture and wiring topology of the brain. Today many anatomical studies have identified the different parts of the brain and speculated on which are involved in various mental tasks, but with robotic probes it will be possible to finally unravel their complexity. Once achieved, genuine machine cognition will be possible via what I called "cognizers" in my 1988 book, which was 24 years ahead of its time: "Cognizers--Neural Networks and Machines that Think" (John Wiley & Sons).
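To give a flavor of what monitoring multiple neurons simultaneously could buy, here is a minimal, purely illustrative sketch (synthetic spike trains and invented numbers, not anything from the actual research) of guessing a directed connection between two recorded neurons by cross-correlating their spike trains at different lags:

```python
import numpy as np

# Illustrative only: three synthetic spike trains recorded "simultaneously".
# Neuron B is wired to fire 2 time steps after neuron A; neuron C is independent.
rng = np.random.default_rng(1)
T = 2000
a = (rng.random(T) < 0.1).astype(float)  # neuron A: random spikes
b = np.roll(a, 2)                        # neuron B: trails A by 2 steps
c = (rng.random(T) < 0.1).astype(float)  # neuron C: unrelated activity

def peak_lag_correlation(x, y, max_lag=5):
    """Find the lag (in time steps) at which y best correlates with x."""
    best = max(range(-max_lag, max_lag + 1),
               key=lambda lag: np.corrcoef(x, np.roll(y, -lag))[0, 1])
    return best, np.corrcoef(x, np.roll(y, -best))[0, 1]

lag_ab, r_ab = peak_lag_correlation(a, b)  # strong correlation at lag 2
lag_ac, r_ac = peak_lag_correlation(a, c)  # only weak chance correlation
print(f"A->B: lag={lag_ab}, r={r_ab:.2f};  A->C: r={r_ac:.2f}")
```

Real connectivity inference is far subtler than this, of course, but it shows the kind of question simultaneous multi-point recording makes answerable at all.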
I interviewed the researchers directly; they hope that the parts list and wiring diagrams they discover will be used by such projects in the future. Their first priority is curing brain diseases by characterizing the signature of abnormal neurons, but they will also characterize normal brain functions, which will be invaluable to projects like DARPA's SyNAPSE.
We've all witnessed the first complete genome decoding and now the published genomes of countless organisms (and even colleagues). It is going to be exciting to see the milestones in brain emulation. Someday the brain of the flea will be fully emulated and gradually the reports will unfold of a complete cockroach, cat, and colleague brain emulation. It will be a scary day when our brains can be emulated on affordable devices and we have a conversation with "ourselves" and see how two "identical" brains diverge with slightly different inputs and experiences. Will we really want to argue with ourselves or compete against ourselves in a game of skill? Will we be able to send our spare brain to work on our behalf when we're feeling a little under the weather?
Figuring out the parts list and interconnects does not equal understanding how it works, just as plotting out the circuit diagram of a computer does not equal understanding how a CPU works. The brain of Einstein is very similar, if not identical, in design to all human brains. The reason it worked better must be hidden beyond the mere parts list and wiring diagram. It is a necessary first step, though.
You are right, except that they are also going to be able to determine the transfer functions by stimulating neurons and then measuring the responses, as well as monitor multiple points simultaneously to hopefully uncover algorithms. Of course, this is a long-range project, and you are right that there is much more to it than just a parts list and wiring diagram :)
A very interesting topic... I appreciate the dedication of the research team, which is going to turn fiction into reality. I used to think about doing something similar (though perhaps different) in my childhood. I used to wonder whether it is possible to decode a thought process happening in the brain into electrical signals/codes. Could this then be applied to "thought reading," especially for animals? I used to think of applying the technology to my pet cat to enable her to talk :)
I laughed when I read your fantasy of enabling your cat to talk, but the more I think about it, the more feasible it seems. By using the robotic probes to determine just which of her neurons are firing when she is hungry, for instance, it should be possible to rig up a vocal response when that feeling is present. Of course, the voice would not really be "hers," but the feeling of hunger would be, and that seems good enough to me (after all, a talking robot has to borrow a human voice to speak too). Good idea!
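For fun, here is a toy sketch of how such a "hunger detector" might be wired up in software. The neuron, the firing-rate threshold, and the phrase are entirely invented for illustration; real decoding of a feeling from neural activity would be vastly harder:

```python
# Invented number: assume hunger shows up as a firing rate above 30 Hz
# on some (hypothetical) hunger-related neuron identified by the probe.
HUNGER_THRESHOLD_HZ = 30.0

def vocalize(firing_rate_hz):
    """Map a measured firing rate to a spoken phrase, or None for silence."""
    if firing_rate_hz >= HUNGER_THRESHOLD_HZ:
        return "I'm hungry!"  # would be handed to a speech synthesizer
    return None

# Simulated stream of firing-rate measurements from the probe:
for rate in [5.0, 12.0, 31.5, 8.0]:
    phrase = vocalize(rate)
    if phrase:
        print(phrase)
```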