IBM's new device mimics the neurons and synapses in the human brain. A new programming paradigm facilitates the development of sensory-based, cognitive computing applications.
IBM is releasing to early adopters a neurosynaptic computing chip that mimics the neurons and synapses of the brain. The chip is based on a new neuron model developed by IBM researchers.
Artificial neural networks are not new -- what is new is the innovative neuron model developed by the IBM team, and its implementation in a high-density, low-power ASIC. Each artificial neuron requires about 1,200 ASIC gates, coupled with synapses implemented as a crossbar-style RAM array. Many of these artificial neurons and synapses fit on a single device, where they operate in a massively parallel fashion.
Until now, most computer chips have employed a von Neumann architecture built around an ALU and RAM. These devices execute instructions serially, with multiple CPUs and/or ALU pipelining used to improve performance. By comparison, IBM's device mimics the neurons in the human brain: each artificial neuron "integrates up the inputs" and then fires an output pulse onto the output network. With the addition of feedback, and other constants and variables, a variety of sophisticated transfer functions can be realized.
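To make the "integrate up the inputs, then fire" behavior concrete, here is a minimal sketch of a leaky integrate-and-fire neuron in Python. The class name, threshold, and leak constant are illustrative assumptions, not IBM's actual neuron model.

```python
class IntegrateAndFireNeuron:
    """Toy leaky integrate-and-fire neuron (illustrative only)."""

    def __init__(self, threshold=1.0, leak=0.1):
        self.threshold = threshold  # potential at which the neuron fires
        self.leak = leak            # charge drained each time step
        self.potential = 0.0        # accumulated membrane potential

    def step(self, inputs):
        """Integrate weighted inputs for one time step; return True on a spike."""
        self.potential += sum(inputs)
        self.potential = max(0.0, self.potential - self.leak)
        if self.potential >= self.threshold:
            self.potential = 0.0    # reset after emitting the output pulse
            return True
        return False

neuron = IntegrateAndFireNeuron(threshold=1.0, leak=0.1)
spikes = [neuron.step([0.4, 0.3]) for _ in range(3)]
# The neuron stays silent until enough input accumulates, then fires and resets.
```

Note how the neuron carries state between steps -- unlike a von Neumann instruction stream, the "program" lives in the accumulated potential and the connection weights.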
The chip started out as a software model running on one of IBM's Blue Gene supercomputers. It was then moved into FPGAs a few years back, and has now been realized as its own ASIC fabricated in a 45nm process.
New chips, new programming language
With these new neurosynaptic chips comes the need for a new way to write code for sensory and other applications. This led to a language that uses the neurosynaptic core as its basic building block. The language also supports modules called "corelets" that perform higher-level functions; these pre-defined functions can then be combined to perform more complex tasks. To this end, the IBM team has created corelets for a variety of functions that are typically performed by the brain.
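IBM's corelet language itself is not public here, but the composition idea can be sketched in ordinary Python. Everything below -- the names `make_corelet`, `compose`, and the example blocks -- is hypothetical, meant only to show how small pre-defined functions combine into a larger one.

```python
def make_corelet(name, fn):
    """Wrap a function as a named, reusable building block (hypothetical)."""
    fn.corelet_name = name
    return fn

def compose(*corelets):
    """Chain corelets so the output of one feeds the input of the next."""
    def pipeline(signal):
        for corelet in corelets:
            signal = corelet(signal)
        return signal
    return pipeline

# Two simple pre-defined blocks, then a composite built from them.
edge_detect = make_corelet(
    "edge_detect", lambda xs: [abs(b - a) for a, b in zip(xs, xs[1:])])
threshold = make_corelet(
    "threshold", lambda xs: [1 if x > 0.5 else 0 for x in xs])

detector = compose(edge_detect, threshold)
result = detector([0.0, 0.9, 0.9, 0.1])  # marks where the signal jumps
```

The point is the programming model: developers wire together higher-level blocks rather than writing instruction streams for individual neurons.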
Some of the proposed sensory applications include things like artificial noses, ears, and eyes. IBM has now released the tools for evaluation and is looking for interested developers.
What are the pluses and minuses you see with this new technology? Will this be "shades of HAL" from 2001: A Space Odyssey, or does it herald a grand, new phase for the people of Earth? Would you consider using a device of this type for one of your own applications?