
Neural Computation

December 2016, Vol. 28, No. 12, Pages 2656-2686
(doi: 10.1162/NECO_a_00900)
© 2016 Massachusetts Institute of Technology
Efficient Neural Codes That Minimize Lp Reconstruction Error
Abstract

The efficient coding hypothesis assumes that biological sensory systems use neural codes that are optimized to represent the stimuli occurring in their environment as well as possible. Most common models use information-theoretic measures, whereas alternative formulations propose incorporating downstream decoding performance. Here we provide a systematic evaluation of different optimality criteria using a parametric formulation of the efficient coding problem based on the Lp reconstruction error of the maximum likelihood decoder. This parametric family includes both the information-maximization criterion and the squared decoding error as special cases. We analytically derive the optimal tuning curve of a single neuron encoding a one-dimensional stimulus with an arbitrary input distribution. We show how the result can be generalized to a class of neural populations by introducing the concept of a meta-tuning curve. The predictions of our framework are tested against previously measured characteristics of some early visual systems found in biology. We find solutions that correspond to low values of p, suggesting that across different animal models, neural representations in the early visual pathways optimize similar criteria about natural stimuli that are relatively close to the information-maximization criterion.
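The central quantity of the abstract, the Lp reconstruction error, can be illustrated with a minimal sketch. The setup below is a hypothetical example, not the paper's actual model: a single neuron with an invertible tuning curve (here tanh, an arbitrary choice), additive Gaussian response noise, and decoding by inverting the tuning curve. The Lp error E[|ŝ − s|^p] is then estimated empirically for a few values of p; p = 2 recovers the familiar mean squared error.

```python
import numpy as np

rng = np.random.default_rng(0)

def lp_error(s, s_hat, p):
    """Empirical Lp reconstruction error: mean of |s_hat - s|^p (p=2 gives MSE)."""
    return np.mean(np.abs(s_hat - s) ** p)

# Hypothetical setup (not the paper's model): stimuli from a standard
# normal prior, a tanh tuning curve, and Gaussian response noise.
s = rng.normal(0.0, 1.0, size=100_000)            # stimulus prior
h = np.tanh                                       # example tuning curve
r = h(s) + rng.normal(0.0, 0.05, size=s.shape)    # noisy neural responses

# Decode by inverting the tuning curve (clip to keep arctanh finite);
# for this additive-Gaussian model this coincides with the ML estimate.
s_hat = np.arctanh(np.clip(r, -0.999, 0.999))

for p in (0.5, 1.0, 2.0):
    print(f"p = {p}: empirical Lp error = {lp_error(s, s_hat, p):.4f}")
```

Sweeping p in such a toy model makes the abstract's point concrete: different values of p penalize large and small decoding errors differently, so the tuning curve that minimizes the Lp error depends on p, with the paper's special cases (information maximization, squared error) sitting at particular points of this family.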