In Hasler and Marr's roadmap they describe the specific computational milestones that have already been achieved -- such as the single-transistor synapse and the FPAA -- as well as the algorithms and computational models that have been proven out so far. The roadmap then proceeds to detail the areal density of artificial neurons and synapses that will be necessary to realize a low-power neuromorphic system whose size rivals that of a human brain.
An FPAA developed at Georgia Tech by Professor Jennifer Hasler as a power-efficient, mixed-signal SoC for neuromorphic computing.
(Source: Rob Felt courtesy of Georgia Tech)
Finally, the roadmap addresses the software tools that will be needed to design these neuromorphic chips, along with the learning techniques, network topologies, and novel interconnection devices -- such as memristors -- and how these components could be developed into usable chip arrays. Memristors, for instance, could serve as slow-timescale modulatory parameters in neuromorphic systems.
Throughout, the roadmap emphasizes a modular approach, such as successfully emulating one layer of a human brain cortex before attempting multilayer devices, as well as the major engineering hurdles that need to be surmounted, such as using local-interconnection techniques to reduce the complexity of communications traffic among billions of neurons and trillions of synapses. The paper's overall conclusion is that "useful neural computation machines based on biological principles at the size of the human brain seems technically within our grasp."
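The scale of that interconnect problem is easy to see with a back-of-the-envelope comparison. The sketch below is purely illustrative and not taken from the roadmap: the function names, grid size, and per-neuron fan-out (roughly 10,000 synapses, on the order of a cortical neuron's) are assumptions chosen to show why local connectivity matters.

```python
# Illustrative comparison (not from the roadmap itself): synapse counts for
# all-to-all connectivity vs. a fixed local neighborhood on a 2-D neuron grid.

def all_to_all_synapses(n_neurons: int) -> int:
    """Every neuron connects to every other neuron."""
    return n_neurons * (n_neurons - 1)

def local_synapses(side: int, fan_out: int) -> int:
    """Each neuron on a side x side grid connects only to a local
    neighborhood of `fan_out` neighbors."""
    return side * side * fan_out

n = 1_000_000                          # 1 M neurons, i.e. a 1000 x 1000 grid
dense = all_to_all_synapses(n)
local = local_synapses(1000, 10_000)   # ~10 k local synapses per neuron
                                       # (assumed, cortex-like fan-out)

print(f"all-to-all: {dense:.2e} synapses")
print(f"local:      {local:.2e} synapses")
print(f"reduction:  {dense / local:.0f}x")
```

Even at this modest scale, restricting each neuron to a local neighborhood cuts the synapse (and hence wiring and traffic) count by about two orders of magnitude, and the gap widens as the neuron count grows toward brain scale.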
Funding was provided, in part, by DARPA's SyNAPSE program.
— R. Colin Johnson, Advanced Technology Editor, EE Times