
Neural Computation

November 15, 1998, Vol. 10, No. 8, Pages 2137-2157
(doi: 10.1162/089976698300017007)
© 1998 Massachusetts Institute of Technology
Complexity Issues in Natural Gradient Descent Method for Training Multilayer Perceptrons
Abstract

The natural gradient descent method is applied to train an n-m-1 multilayer perceptron. Based on an efficient scheme to represent the Fisher information matrix for an n-m-1 stochastic multilayer perceptron, a new algorithm is proposed to calculate the natural gradient without inverting the Fisher information matrix explicitly. When the input dimension n is much larger than the number of hidden neurons m, the time complexity of computing the natural gradient is O(n).
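To make the baseline cost concrete, the sketch below performs one plain natural-gradient step on a toy linear-Gaussian model, forming the empirical Fisher matrix and solving against it directly. This is *not* the paper's algorithm: the point of the paper is precisely to avoid this explicit Fisher solve, whose cost grows polynomially in the parameter dimension, and to reach O(n) time when n ≫ m. All names and the toy model here are illustrative assumptions.

```python
import numpy as np

# Hedged sketch: one NAIVE natural-gradient step on a toy model
# y = w.x + Gaussian noise (unit variance). For this model the
# Fisher information matrix is E[x x^T]. The explicit solve below
# is the expensive step the paper's scheme is designed to avoid.

rng = np.random.default_rng(0)

p = 5                                   # parameter dimension (toy size)
w = rng.normal(size=p)                  # current weights
X = rng.normal(size=(200, p))           # inputs
y = X @ np.ones(p) + 0.1 * rng.normal(size=200)  # targets, true w = 1

residual = y - X @ w
grad = -(X.T @ residual) / len(y)       # ordinary gradient of the NLL
fisher = (X.T @ X) / len(y)             # empirical Fisher, E[x x^T]
nat_grad = np.linalg.solve(fisher, grad)  # F^{-1} grad via a linear solve

eta = 0.5                               # step size (arbitrary choice)
w_new = w - eta * nat_grad              # natural-gradient update
```

For a multilayer perceptron the Fisher matrix is dense in all weights, so the naive solve above scales badly with the input dimension; the paper's contribution is a representation of the n-m-1 network's Fisher matrix that lets the product F⁻¹∇L be computed without ever forming or inverting F.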