Neural Computation

December 2008, Vol. 20, No. 12, Pages 3111-3130
(doi: 10.1162/neco.2008.04-07-502)
© 2008 Massachusetts Institute of Technology
Sleeping Our Way to Weight Normalization and Stable Learning
The functions of sleep have been an enduring mystery. Tononi and Cirelli (2003) hypothesized that one function of slow-wave sleep is to scale down cortical synapses that were strengthened during waking learning. We build a computational model to test the functionality of this idea and to examine some of its implications. We show that synaptic scaling during slow-wave sleep is capable of keeping Hebbian learning in check and that it enables stable development. We also show theoretically how it implements classical weight normalization, which has been in common use in neural models for decades. Finally, computer simulations reveal a significant computational limitation of this form of synaptic scaling.
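The mechanism described above can be sketched in a few lines. The following is a minimal illustration, not the paper's actual model: a single linear neuron undergoes Hebbian updates during a "wake" phase (under which the weight vector would grow without bound), and during each "sleep" phase all synapses are scaled down by a common multiplicative factor, which restores the weight vector to a fixed norm exactly as classical weight normalization does. All parameter values here (learning rate, input dimension, target norm) are arbitrary choices for the sketch.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical sketch (not the published model): one linear neuron,
# Hebbian learning while "awake", synaptic scaling while "asleep".
n_inputs = 50
w = rng.normal(scale=0.1, size=n_inputs)
eta = 0.01          # Hebbian learning rate (arbitrary)
target_norm = 1.0   # norm restored during each sleep phase (arbitrary)

for epoch in range(100):
    # Wake phase: pure Hebbian updates; on their own these only
    # strengthen synapses, so the weight norm grows without bound.
    for _ in range(20):
        x = rng.normal(size=n_inputs)
        y = w @ x
        w += eta * y * x            # Hebb rule: dw = eta * y * x

    # Sleep phase: multiplicative downscaling of every synapse by the
    # same factor -- equivalent to classical weight normalization.
    w *= target_norm / np.linalg.norm(w)

# After the final sleep phase the weight norm sits at target_norm,
# even though wake-phase Hebbian growth was unconstrained.
print(np.linalg.norm(w))
```

Because the scaling is multiplicative and uniform across synapses, the *relative* pattern of weights learned during the wake phase is preserved; only the overall magnitude is brought back under control, which is why this keeps Hebbian learning stable without erasing what was learned.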