We've all witnessed the first complete genome decoding and now the published genomes of countless organisms (and even colleagues). It is going to be exciting to see the milestones in brain emulation. Someday the brain of the flea will be fully emulated and gradually the reports will unfold of a complete cockroach, cat, and colleague brain emulation. It will be a scary day when our brains can be emulated on affordable devices and we have a conversation with "ourselves" and see how two "identical" brains diverge with slightly different inputs and experiences. Will we really want to argue with ourselves or compete against ourselves in a game of skill? Will we be able to send our spare brain to work on our behalf when we're feeling a little under the weather?
What these robotic tools will enable is the precise characterization of how each type of neuron works while it is performing specific tasks. The most important of these is learning. There are thousands of different types of neurons and hundreds of neurotransmitter chemicals handing off messages among the neurons of the brain, yet until now it has been impossible to determine just which are doing what, when. Because the patient is still alive while the robotic probe is in place, it will now be possible to see which neurons are involved in learning, in motor control, and in processing all the various sensory data streams coming in from the eyes, ears, nose, tongue, and skin. The second important enabler here is the ability to monitor multiple neurons simultaneously, once and for all answering debates about which parts of the brain do what, when — essentially elucidating the architecture and wiring topology of the brain. Today many anatomical studies have identified the different parts of the brain and speculated on which are involved in various mental tasks, but with robotic probes it will be possible to finally unravel their complexity. Once that is achieved, genuine machine cognition will be possible by what I called cognizers in my 1988 book, which was 24 years ahead of its time: "Cognizers--Neural Networks and Machines that Think" (John Wiley & Sons).
The probing and monitoring are no doubt useful for neurosurgery — just as the article said, to eliminate the trial and error of surgery. In the near future, an artificial limb directly connected to neurons will be achieved. Will there be a chance that doping can be done by firing signals to certain neurons to boost endurance and strength? Will hybrid humans — half robot, half human — be achieved in the foreseeable future? When humans totally understand the human brain, AI will take a big leap. Watch out! The Terminator is coming. ;)
As long as it is only measuring and monitoring brain activity, it is OK. But the next step could be to modify the behavior of the neurons, and that could lead to some disastrous implications and hideous tools in the hands of criminal — or crime-investigating — minds.
It seems that fiction is going to turn into reality very soon, if this research proves fruitful. Cognitive computing will be very good, but it will be really difficult for debuggers to find problems in such systems once they are designed.
What are the engineering and design challenges in creating successful IoT devices? These devices are usually small, resource-constrained electronics designed to sense, collect, send, and/or interpret data. Some of the devices need to be smart enough to act upon data in real time, 24/7. Are the design challenges the same as with embedded systems, but with a few developer and IT skills added in? What do engineers need to know? Rick Merritt talks with two experts about the tools and best options for designing IoT devices in 2016. Specifically, the guests discuss sensors, security, and lessons from IoT deployments.