LAKE WALES, Fla. — How can human brains learn new things each day without dumping old memories to make room for the new? Writers of machine-learning algorithms that mimic the human brain are pursuing a number of answers. A relatively new algorithmic approach is based on biologists' most recent findings on neurogenesis: the growth of new brain cells (neurons) and the destruction of old ones, together with the strengthening or weakening of the synapses that connect them, primarily in the hippocampus.
A team at IBM Research (Yorktown Heights, N.Y.) hypothesizes that neurogenesis is the key to sparse dictionary learning in the hippocampus, whose function, they believe, is to keep the most efficient brain codes up to date. To test their hypothesis about the function of neuronal birth and death in the hippocampus, IBM researchers wrote a neurogenetic online dictionary learning (NODL) algorithm that outperformed standard online dictionary learning (ODL), improving reconstruction accuracy while creating more-compact representations.
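To make the idea of sparse dictionary learning concrete, the following is a minimal, hypothetical sketch, not IBM's implementation: a signal is encoded as a sparse combination of dictionary "atoms," here chosen greedily by matching pursuit. Function names and the toy data are illustrative only.

```python
# Minimal sparse-coding sketch (illustrative, pure Python).
# A "dictionary" is a list of unit-norm atoms; the sparse code says
# how much of each atom is used to reconstruct the input signal.

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def matching_pursuit(signal, dictionary, n_nonzero=2):
    """Greedily pick the atoms that best explain the remaining residual."""
    residual = list(signal)
    code = [0.0] * len(dictionary)
    for _ in range(n_nonzero):
        # Choose the atom most correlated with what is still unexplained.
        best = max(range(len(dictionary)),
                   key=lambda j: abs(dot(residual, dictionary[j])))
        coeff = dot(residual, dictionary[best])
        code[best] += coeff
        residual = [r - coeff * d
                    for r, d in zip(residual, dictionary[best])]
    return code, residual

# Orthonormal toy dictionary; the signal is exactly atom0 + 0.5 * atom1,
# so two nonzero coefficients reconstruct it with zero residual.
atoms = [[1.0, 0.0, 0.0], [0.0, 1.0, 0.0], [0.0, 0.0, 1.0]]
signal = [1.0, 0.5, 0.0]
code, residual = matching_pursuit(signal, atoms)
```

In full dictionary learning the atoms themselves are also updated online from streaming data; the reconstruction-accuracy comparison in the IBM charts measures how well such sparse codes rebuild held-out images.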
IBM presented the work at the International Joint Conference on Artificial Intelligence (IJCAI 2017), held this month in Melbourne, Australia.
“People used to think that you were born with all your brain cells — that they could die but were never born anew. But it turns out there are many stem cells in the brain that can be turned on, for instance, to replace damaged cells. And in the hippocampus they are turned on all the time, to make new neurons for better brain codes. Likewise, poor encodings allow the death of the neurons making the poor codes,” Irina Rish, a research scientist with IBM Research’s new AI Foundations effort, told EE Times in an exclusive interview.
Standardized image data sets used for the evaluation of the IBM team's neurogenetic online dictionary learning algorithm. NODL (blue) outperformed standard ODL (purple), improving reconstruction accuracy by creating more-compact representations.
(Source: IBM Research)
Rish is a computer scientist who specializes in artificial intelligence. She started her career at IBM in its machine-learning group, working on research that aimed to use AI to diagnose disease. She moved to IBM’s computational neuroscience group before joining the AI Foundations Lab, which specializes in writing algorithms to simulate the subtle aspects of brain function beyond deep learning.
“We are the first group to make a detailed analysis of the functions of birth and death of neurons from stem cells in the hippocampus,” Rish said.
In “Neurogenesis-Inspired Dictionary Learning: Online Model Adaption in a Changing World,” presented at IJCAI, IBM reported that its neurogenetic online dictionary learning model had an enhanced ability to adapt to changing environments and exhibited performance-accuracy improvements over standard ODL (see charts).
Rish’s group does not try to simulate neurogenesis in the hippocampus directly, as the Blue Brain project is working to do; rather, it seeks to uncover the functions of neurogenesis and then cast those functions into software algorithms that perform them. Specifically, Rish’s group hypothesizes that neurogenesis in the dentate gyrus of the hippocampus improves cognitive functions, including pattern separation and recognition.
Deep-learning algorithms commonly mimic the brain’s neuroplasticity — the strengthening of synapses with increased use and the atrophy of synapses that are seldom used. Neurogenesis adds a new dimension to machine learning by allowing newly grown sparse neural networks to accomplish completely new tasks, and by enabling the complete erasure of obsolete tasks through the death of old, seldom-used neurons. This enables completely new “dictionaries” of perceived feature sets to be defined, and obsolete ones to be scrapped, in response to changing environments.
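The birth-and-death mechanism described above can be sketched on top of a dictionary of atoms. This is a hedged illustration of the general idea, with made-up names and thresholds that are not taken from the IBM paper: rarely used atoms are pruned ("neuronal death"), and new atoms are added for data the current dictionary reconstructs poorly ("neurogenesis").

```python
# Illustrative sketch of neurogenesis applied to a dictionary of atoms.
# Thresholds and helper names are hypothetical, not from the NODL paper.

def prune_unused(dictionary, usage_counts, min_uses=1):
    """Neuronal death: drop atoms whose codes were (almost) never active."""
    return [atom for atom, uses in zip(dictionary, usage_counts)
            if uses >= min_uses]

def grow_atoms(dictionary, residuals, error_threshold=0.1):
    """Neurogenesis: add normalized atoms for poorly reconstructed inputs."""
    new_dict = list(dictionary)
    for r in residuals:
        norm = sum(x * x for x in r) ** 0.5
        if norm > error_threshold:
            new_dict.append([x / norm for x in r])  # new unit-norm atom
    return new_dict

atoms = [[1.0, 0.0], [0.0, 1.0], [0.7, 0.7]]
# The third atom was never used, so it "dies."
atoms = prune_unused(atoms, usage_counts=[5, 3, 0])
# One input left a large residual, so a new atom is "born" from it.
atoms = grow_atoms(atoms, residuals=[[0.0, 0.0], [3.0, 4.0]])
```

The net effect matches the article's description: the dictionary grows new representations for novel inputs while obsolete ones are erased, rather than every atom being frozen at training time.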
“Neurogenesis explains how lifelong, continual learning is accomplished — how new, compressed data encodings can distill down previously very large architectures, such as replacing giant deep neural networks with streamlined ones that perform faster and better than the original,” Rish said. IBM concludes that its NODL model is particularly beneficial in today’s rapidly changing environments and that neuronal death is just as important as neuronal birth and synaptic-based deep learning.
— R. Colin Johnson, Advanced Technology Editor, EE Times