Network: Computation in Neural Systems 20(3):137-161 (2009-09-01) (bibtex, paper.pdf)

Additive neurogenesis as a strategy for avoiding interference in a sparsely-coding dentate gyrus.

Peter A. Appleby and Laurenz Wiskott


Abstract: Recently we presented a model of additive neurogenesis in a linear, feedforward neural network that performed an encoding-decoding memory task in a changing input environment. Growing the neural network over time allowed the network to adapt to changes in input statistics without disrupting retrieval properties, and we proposed that adult neurogenesis might fulfil a similar computational role in the dentate gyrus of the hippocampus. Here we explicitly evaluate this hypothesis by examining additive neurogenesis in a simplified hippocampal memory model. The model incorporates a divergence in unit number from the entorhinal cortex to the dentate gyrus and sparse coding in the dentate gyrus, both notable features of hippocampal processing. We evaluate two distinct adaptation strategies: neuronal turnover, in which the network is of fixed size but units may be deleted and new ones added, and additive neurogenesis, in which the network grows over time. We quantify the performance of the network across the full range of adaptation levels, from zero in a fixed network to one in a fully adapting network. We find that additive neurogenesis is always superior to neuronal turnover, as it permits the network to be responsive to changes in input statistics while at the same time preserving representations of earlier environments.
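The abstract contrasts the two adaptation strategies only at a high level. The following is a minimal, hypothetical NumPy sketch of that distinction, fixed-size turnover versus a growing layer; the unit counts, the SVD-based "fitting" of units to the current input statistics, and all function names are illustrative assumptions and do not reproduce the model described in the paper (in particular, it omits sparse coding and the entorhinal-cortex-to-dentate-gyrus divergence).

```python
import numpy as np

# Illustrative sketch only: a linear feedforward encoding layer whose units
# are adapted to a new input environment either by turnover (replace a
# fraction of existing units) or by additive neurogenesis (append new units).
# All sizes, the fitting rule, and names are hypothetical assumptions.

rng = np.random.default_rng(0)

def fit_units(inputs, n_units):
    """Return n_units weight vectors adapted to the current input statistics
    (here the leading principal directions; a stand-in for learning)."""
    _, _, vt = np.linalg.svd(inputs - inputs.mean(axis=0), full_matrices=False)
    return vt[:n_units]

def turnover(weights, inputs, fraction=0.3):
    """Fixed-size network: delete a fraction of units and refit replacements."""
    n_replace = int(round(fraction * len(weights)))
    keep = weights[n_replace:]                # old units that survive
    new = fit_units(inputs, n_replace)        # replacements for deleted units
    return np.vstack([keep, new])             # network size stays constant

def additive_neurogenesis(weights, inputs, n_new=3):
    """Growing network: keep all old units and append newly fitted ones."""
    new = fit_units(inputs, n_new)
    return np.vstack([weights, new])          # network size grows over time

# Two successive input "environments" with different statistics.
env1 = rng.normal(size=(500, 10)) @ rng.normal(size=(10, 10))
env2 = rng.normal(size=(500, 10)) @ rng.normal(size=(10, 10))

w = fit_units(env1, 10)                       # initial network fitted to env1
w_turn = turnover(w, env2)                    # adapts, but old units are lost
w_grow = additive_neurogenesis(w, env2)       # adapts while keeping old units

print(w_turn.shape, w_grow.shape)             # (10, 10) vs. (13, 10)
```

The sketch captures only the bookkeeping behind the paper's comparison: turnover adapts by overwriting part of the existing representation, whereas additive growth leaves the units fitted to earlier environments untouched.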




October 28, 2010, Laurenz Wiskott, http://www.neuroinformatik.ruhr-uni-bochum.de/PEOPLE/wiskott/