
Neural Computation

February 2008, Vol. 20, No. 2, Pages 452-485
(doi: 10.1162/neco.2007.07-06-297)
© 2008 Massachusetts Institute of Technology
Design of Continuous Attractor Networks with Monotonic Tuning Using a Symmetry Principle

Neurons that sustain elevated firing in the absence of stimuli have been found in many neural systems. In graded persistent activity, neurons can sustain firing at many levels, suggesting a widely found type of network dynamics in which networks can relax to any one of a continuum of stationary states. The reproduction of these findings in model networks of nonlinear neurons has turned out to be nontrivial. A particularly insightful model has been the “bump attractor,” in which a continuous attractor emerges through an underlying symmetry in the network connectivity matrix. This model, however, cannot account for data in which the persistent firing of neurons is a monotonic—rather than a bell-shaped—function of a stored variable. Here, we show that the symmetry used in the bump attractor network can be employed to create a whole family of continuous attractor networks, including those with monotonic tuning. Our design is based on tuning the external inputs to networks that have a connectivity matrix with Toeplitz symmetry. In particular, we provide a complete analytical solution of a line attractor network with monotonic tuning and show that for many other networks, the numerical tuning of synaptic weights reduces to the computation of a single parameter.
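The core construction described above — a connectivity matrix with Toeplitz symmetry whose leading eigenvector spans a continuum of stationary states — can be illustrated with a minimal linear sketch. This is not the paper's full nonlinear model; it only shows, under assumed parameters (network size, Gaussian kernel width), how a Toeplitz weight matrix scaled to have top eigenvalue 1 yields a line attractor: every multiple of the leading eigenvector is a fixed point of the dynamics dr/dt = -r + W r.

```python
import numpy as np

N = 50
# Toeplitz symmetry: W[i, j] depends only on the index difference i - j,
# so the matrix is constant along each diagonal. A Gaussian kernel of the
# index difference is used here purely as an illustrative choice.
idx = np.arange(N)
W = np.exp(-((idx[:, None] - idx[None, :]) / 5.0) ** 2)

# Rescale W so its largest eigenvalue is exactly 1. The corresponding
# eigenvector v then spans a line of fixed points of dr/dt = -r + W r,
# since W (c v) = c v for any scalar c.
eigvals, eigvecs = np.linalg.eigh(W)
W = W / eigvals[-1]
v = eigvecs[:, -1]

# Every point on the line {c v} is a stationary state of the network.
for c in (0.5, 1.0, 2.0):
    r = c * v
    assert np.allclose(W @ r, r)
```

Scaling the whole matrix leaves the Toeplitz structure intact, which is why the tuning problem can reduce to a single parameter (the overall gain); the nonlinear networks treated in the paper require the additional input tuning it describes.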