
Neural Computation

January 1, 1998, Vol. 10, No. 1, Pages 165-188
(doi: 10.1162/089976698300017935)
© 1997 Massachusetts Institute of Technology
A Low-Sensitivity Recurrent Neural Network

The problem of high sensitivity in modeling is well known: small perturbations in the model parameters may produce large, undesired changes in the model behavior. A number of authors have considered the sensitivity of feedforward neural networks from a probabilistic perspective, but recurrent neural networks have received less attention. In this article, we present a new recurrent neural network architecture with significantly improved parameter sensitivity properties compared to existing recurrent neural networks. The new architecture generalizes previous ones by employing alternative discrete-time operators in place of the shift operator normally used. An analysis of the model demonstrates how parameter sensitivity arises in recurrent neural networks and supports the proposed architecture. A series of simple numerical experiments shows that the new architecture performs significantly better than previous recurrent neural networks.
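The intuition behind replacing the shift operator can be sketched with a first-order linear recurrence. In shift form, x[k+1] = a·x[k]; the delta-operator form x[k+1] = x[k] + Δ·(α·x[k]) is algebraically equivalent with a = 1 + Δ·α, but the same relative perturbation of the parameter moves the pole by ρ·a in shift form and only by ρ·(a − 1) in delta form, which is much smaller when the pole is near 1 (fast sampling). The following is a minimal illustrative sketch, not the paper's network; the sampling interval Δ, pole value, and perturbation size are assumptions chosen for illustration.

```python
import numpy as np

def simulate_shift(a, x0, n):
    # Shift-operator form: x[k+1] = a * x[k]
    x = np.empty(n)
    x[0] = x0
    for k in range(n - 1):
        x[k + 1] = a * x[k]
    return x

def simulate_delta(alpha, delta, x0, n):
    # Delta-operator form: x[k+1] = x[k] + delta * (alpha * x[k]),
    # equivalent to the shift form with a = 1 + delta * alpha.
    x = np.empty(n)
    x[0] = x0
    for k in range(n - 1):
        x[k + 1] = x[k] + delta * (alpha * x[k])
    return x

delta = 0.01                 # sampling interval (assumed for illustration)
a = 0.995                    # pole close to 1, as under fast sampling
alpha = (a - 1) / delta      # equivalent delta-form parameter

rho = 1e-3                   # 0.1% relative parameter perturbation
n, x0 = 500, 1.0

ref = simulate_shift(a, x0, n)
# Perturb each parameterization by the same relative amount and compare
# the worst-case trajectory deviation from the unperturbed system.
err_shift = np.max(np.abs(simulate_shift(a * (1 + rho), x0, n) - ref))
err_delta = np.max(np.abs(simulate_delta(alpha * (1 + rho), delta, x0, n) - ref))

print(err_shift, err_delta)
```

Because the perturbation enters the pole scaled by (a − 1) instead of a, `err_delta` comes out orders of magnitude smaller than `err_shift` here, mirroring the sensitivity advantage the article claims for its operator-based architecture.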