At IJCNN, IBM added the concept of "corelets" to its architecture and programming model for neurosynaptic cores. Corelets allow a high-level description language -- the Corelet Language, similar in spirit to VHDL -- to program massively interconnected neurosynaptic cores using nested, reusable building blocks. An individual corelet is a complete blueprint for a particular neurosynaptic function -- such as motion detection -- wrapped in an object-oriented package that hides its internal complexity from programmers.
IBM's Corelet Laboratory supports the complete development cycle for cognitive computers, from choosing an algorithm from the Corelet Library, to running it on the Compass simulator, to connecting sensory inputs and processing them into outputs for pattern classification, visualization, and driving actuators. SOURCE: IBM
Corelets can be used at any level of a design -- using combinations of them to create larger, more complex functions or drilling down in a single one to add new functionality. IBM likens the Corelet Language to the seminal "formula translating" or Fortran language that IBM created on the same San Jose, California campus to popularize its first computers circa 1954. Now, these San Jose researchers are creating the Corelet Language to popularize IBM's next generation of cognitive computers.
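The nesting idea described above -- building larger functions by composing smaller corelets while hiding their internals -- can be illustrated with a minimal sketch. All class names and interfaces here are hypothetical stand-ins; the real Corelet Language is a MATLAB-based framework whose corelets configure actual neurosynaptic cores, not Python lists:

```python
# Hypothetical sketch of corelet-style composition (illustrative only).
# A corelet exposes only its inputs and outputs; its internal neuron and
# synapse wiring stays hidden from the programmer who uses it.

class Corelet:
    """A reusable building block with a hidden interior."""
    def __init__(self, name):
        self.name = name

    def process(self, spikes):
        raise NotImplementedError

class EdgeDetector(Corelet):
    # Placeholder internal logic standing in for real spiking computation.
    def process(self, spikes):
        return [s for s in spikes if s % 2 == 0]

class MotionDetector(Corelet):
    def process(self, spikes):
        return [s * 2 for s in spikes]

class CompositeCorelet(Corelet):
    """A larger corelet assembled from sub-corelets; users see only its
    inputs and outputs, never the internal stages."""
    def __init__(self, name, stages):
        super().__init__(name)
        self._stages = stages  # hidden internal structure

    def process(self, spikes):
        for stage in self._stages:
            spikes = stage.process(spikes)
        return spikes

# Drill down or compose freely: the composite is itself just a corelet.
vision = CompositeCorelet("vision", [EdgeDetector("edges"), MotionDetector("motion")])
print(vision.process([1, 2, 3, 4]))  # -> [4, 8]
```

The point of the sketch is the packaging, not the arithmetic: a composite corelet presents the same interface as a primitive one, which is what lets corelets be reused at any level of a design.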
"Our cognitive computer architecture necessitated a new programming model," said Modha. "What we have created is the Fortran of cognitive computers."
The Corelet Language -- implemented using MATLAB's object-oriented programming model -- is part of an ecosystem of technology IBM is creating to support software development for its future cognitive computers. An accompanying simulator, called Compass, allows these massively parallel algorithms to be simulated on a traditional computer. A detailed neuron model allows this most basic computational building block of cognitive computers to support a wide range of spatio-temporal, multi-modal functions, from sensing to remembering to actuating.
A Corelet Library -- already containing 150 corelets -- is being amassed by IBM researchers to include all the parameterized, cognitive algorithms needed to link massively parallel, multi-modal, spatio-temporal sensors to the actuators that affect the real world. And the Corelet Laboratory knits together the lifecycle of algorithms on cognitive computers, including a teaching curriculum that allows programmers to quickly learn the necessary concepts.
So far, IBM has successfully proven out its Corelet Laboratory on seven applications: speaker recognition, music composer recognition, digit recognition, sequence prediction, collision avoidance, optical flow detection, and eye detection.
The corelet architecture and programming model -- available immediately to IBM employees, and later to its partners -- covers the entire end-to-end development cycle for cognitive algorithms, from choosing corelets from the application library, to simulating them, to connecting them to sensors (for input) and actuators (for output), to building the system, visualizing the results, and finally debugging and deploying on the cognitive computer.
For more information, visit IBM's Neurosynaptic Chips site where the detailed papers presented at IJCNN can be downloaded.