Neural Computation, 10(3):671-716 (April 1998)

Reprinted in: Self-organizing map formation: Foundations of neural computation, eds. K. Obermayer and T. J. Sejnowski, MIT Press, Cambridge MA, pp. 83-128 (2001)

Constrained optimization for neural map formation: A unifying framework for weight growth and normalization.

Laurenz Wiskott and Terrence J. Sejnowski

Abstract: Computational models of neural map formation can be considered on at least three different levels of abstraction: detailed models including neural activity dynamics, weight dynamics that abstract from the neural activity dynamics by an adiabatic approximation, and constrained optimization from which equations governing weight dynamics can be derived. Constrained optimization uses an objective function, from which a weight growth rule can be derived as a gradient flow, and some constraints, from which normalization rules are derived. In this paper we present an example of how an optimization problem can be derived from detailed non-linear neural dynamics. A systematic investigation reveals how different weight dynamics introduced previously can be derived from two types of objective function terms and two types of constraints. This includes dynamic link matching as a special case of neural map formation. We focus in particular on the role of coordinate transformations to derive different weight dynamics from the same optimization problem. Several examples illustrate how the constrained optimization framework can help in understanding, generating, and comparing different models of neural map formation. The techniques used in this analysis may also be useful in investigating other types of neural dynamics.
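The abstract describes deriving a weight growth rule as a gradient flow on an objective function and enforcing constraints through normalization rules. As a minimal illustrative sketch (not the paper's actual model), the following assumes a simple linear Hebbian objective H(W) = Σ_ij C_ij W_ij with a hypothetical correlation matrix C, and enforces the constraint that each output unit's weights sum to one via multiplicative normalization after every gradient step:

```python
import numpy as np

# Illustrative sketch only: weight growth as gradient ascent on a
# simple Hebbian objective H(W) = sum_ij C_ij W_ij (so dH/dW_ij = C_ij),
# with a multiplicative normalization step enforcing the constraint
# sum_j W_ij = 1 for each output unit i. C and the learning rate eta
# are hypothetical, chosen for demonstration.

rng = np.random.default_rng(0)
n_out, n_in = 4, 6

C = rng.random((n_out, n_in))           # hypothetical input-output correlations
W = rng.random((n_out, n_in))
W /= W.sum(axis=1, keepdims=True)       # start on the constraint surface

eta = 0.1
for _ in range(100):
    W += eta * C                        # gradient flow step: dW = eta * dH/dW
    W = np.clip(W, 0.0, None)           # keep weights non-negative
    W /= W.sum(axis=1, keepdims=True)   # multiplicative normalization (constraint)

# each row still sums to 1, so growth and normalization together
# keep the weights on the constraint surface
assert np.allclose(W.sum(axis=1), 1.0)
```

The fixed points of this growth-then-normalize iteration concentrate each unit's weight on its most strongly correlated inputs, which is the qualitative behavior the constrained-optimization framework is meant to capture.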

Keywords: neural map formation, objective functions, constraints, coordinate transformations


February 2, 2001, Laurenz Wiskott