Neural Computation

May 2017, Vol. 29, No. 5, Pages 1375-1405
(doi: 10.1162/NECO_a_00954)
© 2017 Massachusetts Institute of Technology
Multiassociative Memory: Recurrent Synapses Increase Storage Capacity
The connection density between nearby neurons in the cortex has been observed to be around 0.1, whereas longer-range connections occur at much lower density (Kalisman, Silberberg, & Markram, 2005). We propose a memory association model that qualitatively explains these empirical observations. The model is a multiassociative, sparse, Willshaw-like network of binary threshold neurons and binary synapses that uses recurrent synapses for iterative retrieval of stored memories. We quantify the usefulness of the recurrent synapses by simulating the model for small network sizes and by a precise mathematical analysis for large network sizes. Given the network parameters, we can determine the values of recurrent and afferent synapse density that optimize the storage capacity of the network. If the network size matches that of a cortical column, the predicted optimal recurrent density lies in a range compatible with biological measurements. Furthermore, we show that in the multiassociative case our model surpasses the standard Willshaw model if information capacity is normalized per strong synapse or per bit required to store the model, as considered in Knoblauch, Palm, and Sommer (2010).
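A minimal sketch of the kind of Willshaw-style binary associative memory described above, with afferent synapses driving the first retrieval step and recurrent synapses used for iterative clean-up. All sizes, pattern sparseness, and the thresholding rule here are illustrative assumptions for a toy instance, not the article's exact model or its optimized densities:

```python
import numpy as np

rng = np.random.default_rng(0)

n, m = 200, 200   # afferent (cue) and target population sizes (toy values)
k = 10            # active units per sparse binary pattern
M = 30            # number of stored associations

def sparse_pattern(size, active):
    """Binary vector with a fixed number of randomly placed active units."""
    p = np.zeros(size, dtype=int)
    p[rng.choice(size, active, replace=False)] = 1
    return p

X = np.array([sparse_pattern(n, k) for _ in range(M)])  # cue patterns
Y = np.array([sparse_pattern(m, k) for _ in range(M)])  # target patterns

# Willshaw learning: a binary synapse is set to 1 if any stored pair
# co-activates its pre- and postsynaptic units (clipped Hebbian rule).
W_aff = (Y.T @ X > 0).astype(int)   # afferent weights,  shape (m, n)
W_rec = (Y.T @ Y > 0).astype(int)   # recurrent weights, shape (m, m)

def retrieve(cue, steps=3):
    """One afferent step, then iterative clean-up via recurrent synapses."""
    # Willshaw threshold: a unit fires only if it receives input from
    # every active presynaptic unit (the max guards against empty cues).
    y = (W_aff @ cue >= max(cue.sum(), 1)).astype(int)
    for _ in range(steps):
        y = (W_rec @ y >= max(y.sum(), 1)).astype(int)
    return y

# Degrade a stored cue by deleting 3 of its 10 active bits; the recurrent
# iterations clean up the pattern retrieved from the partial cue.
cue = X[0].copy()
cue[np.flatnonzero(cue)[:3]] = 0
recalled = retrieve(cue)
```

At these small, sparse parameters the clipped weight matrices stay far from saturation, so even the degraded cue recovers the stored target with high probability; the article's capacity analysis concerns how dense the afferent and recurrent matrices may become before such retrieval breaks down.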