Neural Computation

March 2018, Vol. 30, No. 3, Pages 708-722
(doi: 10.1162/neco_a_01052)
© 2018 Massachusetts Institute of Technology
Indistinguishable Synapses Lead to Sparse Networks
Abstract
Neurons integrate inputs from many neighbors when they process information, so the individual inputs to a given neuron are indistinguishable from one another. Under the assumption that neurons maximize their information storage, this indistinguishability is shown to place a strong constraint on the distribution of connection strengths between neurons. The distribution of individual synapse strengths is found to follow a modified Boltzmann distribution, with strength proportional to . The model is shown to be consistent with experimental data on Caenorhabditis elegans connectivity and with in vivo synaptic strength measurements. This dependence helps account for the observation of many zero or weak connections between neurons, that is, for the sparsity of the neural network.
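As a point of reference for the "modified Boltzmann distribution" mentioned in the abstract, the sketch below gives the standard maximum-entropy argument that yields the unmodified Boltzmann (exponential) form for strengths under a fixed-mean constraint. The symbols s (synaptic strength), \bar{s} (mean strength), \beta, and \lambda are notational assumptions for this illustration; the paper's specific modification is not reproduced here.

% Minimal sketch: maximum-entropy derivation of the unmodified Boltzmann form.
% Assumed notation: s = synaptic strength, \bar{s} = mean strength,
% \lambda and \beta = Lagrange multipliers (not taken from the paper).
\documentclass{article}
\usepackage{amsmath}
\begin{document}
Maximize the entropy of the strength distribution $p(s)$, $s \ge 0$,
\[
  H[p] = -\int_0^\infty p(s)\,\ln p(s)\,ds,
\]
subject to normalization and a fixed mean strength $\bar{s}$:
\[
  \int_0^\infty p(s)\,ds = 1,
  \qquad
  \int_0^\infty s\,p(s)\,ds = \bar{s}.
\]
Introducing Lagrange multipliers $\lambda$ and $\beta$ and requiring
\[
  \frac{\delta}{\delta p}\!\left[\, H[p]
    - \lambda\!\int_0^\infty p(s)\,ds
    - \beta\!\int_0^\infty s\,p(s)\,ds \right] = 0
\]
gives $-\ln p(s) - 1 - \lambda - \beta s = 0$, i.e. the Boltzmann (exponential) form
\[
  p(s) = \frac{1}{\bar{s}}\, e^{-s/\bar{s}},
  \qquad \beta = \frac{1}{\bar{s}}.
\]
\end{document}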