Neural Computation

July 1, 1999, Vol. 11, No. 5, Pages 1069-1077 (doi: 10.1162/089976699300016340)
© 1999 Massachusetts Institute of Technology
Relating the Slope of the Activation Function and the Learning Rate Within a Recurrent Neural Network
Abstract

A relationship between the learning rate η in the learning algorithm and the slope β in the nonlinear activation function is provided for a class of recurrent neural networks (RNNs) trained by the real-time recurrent learning (RTRL) algorithm. It is shown that an arbitrary RNN can be obtained from a referent RNN by imposing deterministic rules on its weights and learning rate. Such a relationship reduces the number of degrees of freedom when solving the nonlinear optimization task of finding the optimal RNN parameters.
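To illustrate the kind of equivalence the abstract refers to, consider a single recurrent neuron y(k) = tanh(β v(k)) trained by RTRL. One form such a relationship can take (a minimal sketch; the abstract does not state the exact rules, which are derived in the paper) is that a network with slope β, weights w, and learning rate η traces the same output trajectory as a referent network with slope 1, weights βw, and learning rate β²η. The Python sketch below checks this numerically; the teacher signal, initialization, and one-neuron architecture are arbitrary choices for illustration.

import numpy as np

def rtrl_run(beta, eta, w0, x, d, steps):
    """Single recurrent neuron y(k) = tanh(beta * u(k)^T w) trained by RTRL.

    u(k) = [x(k), y(k-1), 1] (input, feedback, bias). Returns the output
    trajectory and the final weight vector.
    """
    w = w0.copy()
    y_prev = 0.0
    pi = np.zeros_like(w)                    # sensitivities dy/dw
    ys = []
    for k in range(steps):
        u = np.array([x[k], y_prev, 1.0])
        v = u @ w
        y = np.tanh(beta * v)
        e = d[k] - y
        phi_prime = beta * (1.0 - y**2)      # d/dv tanh(beta * v)
        # RTRL sensitivity recursion: feedback enters only via u[1] = y(k-1)
        pi = phi_prime * (w[1] * pi + u)
        w = w + eta * e * pi
        y_prev = y
        ys.append(y)
    return np.array(ys), w

rng = np.random.default_rng(0)
x = rng.standard_normal(50)
d = np.sin(np.arange(50) / 5.0)              # arbitrary teacher signal
w0 = 0.1 * rng.standard_normal(3)

beta, eta = 2.5, 0.05
y1, w1 = rtrl_run(beta, eta, w0, x, d, 50)
# Referent network: slope 1, weights scaled by beta, learning rate by beta^2
y2, w2 = rtrl_run(1.0, beta**2 * eta, beta * w0, x, d, 50)

print(np.max(np.abs(y1 - y2)))               # near machine precision
print(np.max(np.abs(beta * w1 - w2)))        # weights stay related by factor beta

The two runs agree to within floating-point rounding: the sensitivities of the referent network are exactly 1/β times those of the original, so scaling the learning rate by β² makes the weight trajectories coincide up to the factor β, which is the sense in which the slope and the learning rate trade off against each other.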