Neural Computation

Winter 1990, Vol. 2, No. 4, Pages 490-501
(doi: 10.1162/neco.1990.2.4.490)
© 1990 Massachusetts Institute of Technology
An Efficient Gradient-Based Algorithm for On-Line Training of Recurrent Network Trajectories
Abstract

A novel variant of the familiar backpropagation-through-time approach to training recurrent networks is described. The algorithm is intended for arbitrary recurrent networks that run continually without ever being reset to an initial state, and it is specifically designed for computationally efficient implementation. It can be viewed as a cross between epochwise backpropagation through time, which is not appropriate for continually running networks, and the widely used on-line gradient approximation technique of truncated backpropagation through time.
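
The sketch below illustrates the general scheme the abstract describes: a recurrent network runs continually, errors from the most recent steps are accumulated, and every h' steps a backpropagation pass is carried back through a bounded window of the last h stored steps before a weight update is applied. It is a minimal NumPy illustration under stated assumptions (a single tanh recurrent layer with a linear readout, squared-error loss, placeholder input and target streams, and an illustrative learning rate), not the paper's exact formulation.

```python
# Hedged sketch of periodic truncated backpropagation through time over a
# bounded history: run continually, and every h_prime steps backpropagate the
# errors from the h_prime most recent steps back through the last h stored
# steps, then update the weights.  Architecture, loss, data streams, and
# hyperparameters here are illustrative assumptions, not taken from the paper.
import numpy as np

rng = np.random.default_rng(0)
n_in, n_hid, n_out = 3, 8, 2
W_xh = rng.normal(0, 0.1, (n_hid, n_in))
W_hh = rng.normal(0, 0.1, (n_hid, n_hid))
W_hy = rng.normal(0, 0.1, (n_out, n_hid))
lr = 0.01
h, h_prime = 16, 4          # backprop window length and update interval

def step(x, s_prev):
    """One forward step of the continually running network."""
    s = np.tanh(W_xh @ x + W_hh @ s_prev)
    y = W_hy @ s
    return s, y

s = np.zeros(n_hid)
history = []                # rolling buffer of (x_t, s_{t-1}, s_t, error_t)

for t in range(1000):
    x = rng.normal(size=n_in)              # placeholder input stream
    d = np.zeros(n_out)                    # placeholder target stream
    s_prev = s
    s, y = step(x, s_prev)
    err = y - d                            # dL/dy for squared-error loss
    history.append((x, s_prev, s, err))
    history = history[-h:]                 # keep only the last h steps

    if (t + 1) % h_prime == 0:             # every h_prime steps, backpropagate
        dW_xh = np.zeros_like(W_xh)
        dW_hh = np.zeros_like(W_hh)
        dW_hy = np.zeros_like(W_hy)
        delta = np.zeros(n_hid)            # backpropagated hidden-state error
        for k, (x_k, s_prev_k, s_k, err_k) in enumerate(reversed(history)):
            if k < h_prime:                # inject error at recent steps only
                dW_hy += np.outer(err_k, s_k)
                delta += W_hy.T @ err_k
            delta = delta * (1.0 - s_k ** 2)   # through the tanh nonlinearity
            dW_xh += np.outer(delta, x_k)
            dW_hh += np.outer(delta, s_prev_k)
            delta = W_hh.T @ delta         # carry the error one step further back
        W_xh -= lr * dW_xh
        W_hh -= lr * dW_hh
        W_hy -= lr * dW_hy
```

Because the history buffer is bounded at h steps and an update costs one backward pass of length h every h' steps, storage and average per-step computation stay fixed no matter how long the network runs, which is what makes the approach suitable for continually running networks.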