Neural Computation (monthly; 288 pp. per issue; 6 x 9, illustrated)
ISSN 0899-7667; E-ISSN 1530-888X; 2014 impact factor: 2.21

June 2015, Vol. 27, No. 6, Pages 1321-1344
doi: 10.1162/NECO_a_00740
© 2015 Massachusetts Institute of Technology

Timescale Separation in Recurrent Neural Networks
Abstract

Supervised learning in recurrent neural networks couples two processes: the fast dynamics of neuron activity, from which gradient estimates are obtained, and the slower dynamics induced on the connection parameters by those estimates. Any such algorithm must balance the relative rates of these two processes so that accurate sensitivity estimates are obtained while synaptic modification still proceeds at a rate sufficient for learning. We show how to calculate a sufficient timescale separation between these two processes for a class of contracting neural networks.
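The two-process structure described above can be illustrated with a minimal numerical sketch. This is not the paper's algorithm: the leaky-tanh network, the target activity, and the crude gradient-like weight update are all illustrative assumptions. The point is only the separation factor epsilon, which makes the weight dynamics evolve much more slowly than the activity dynamics, so the activity stays close to its (slowly drifting) fixed point.

```python
import numpy as np

# Illustrative two-timescale sketch (assumptions, not the paper's method):
# fast neuron activity x relaxes under nearly fixed weights W, while W
# drifts slowly, scaled by a separation factor epsilon << 1.
rng = np.random.default_rng(0)
n = 5
W = 0.1 * rng.standard_normal((n, n))   # small weights -> contracting dynamics
x = rng.standard_normal(n)
target = np.zeros(n)                     # hypothetical fixed target activity

dt = 0.01        # step size of the fast activity process
epsilon = 1e-3   # timescale separation: weights move ~1000x slower

for _ in range(5000):
    # fast process: leaky recurrent dynamics  x' = -x + tanh(W x)
    x = x + dt * (-x + np.tanh(W @ x))
    # slow process: gradient-like weight update toward the target activity
    err = x - target
    W = W - epsilon * dt * np.outer(err, np.tanh(x))

# After many fast steps the activity sits near the slowly moving fixed point,
# so the residual of the fast dynamics is small.
residual = np.linalg.norm(-x + np.tanh(W @ x))
print(residual)
```

Increasing epsilon toward 1 destroys this separation: the weights then change on the same timescale as the activity, and the quasi-static assumption behind the sensitivity estimates no longer holds.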