
Neural Computation

March 2012, Vol. 24, No. 3, Pages 607-610
(doi: 10.1162/NECO_a_00248)
© 2011 Massachusetts Institute of Technology
Efficient Calculation of the Gauss-Newton Approximation of the Hessian Matrix in Neural Networks

The Levenberg-Marquardt (LM) algorithm is popular for training neural networks; for large networks, however, it becomes prohibitively expensive in both running time and memory. The most time-critical step of the algorithm is the calculation of the Gauss-Newton matrix, which is formed by multiplying two large Jacobian matrices together. We propose a method that uses backpropagation to reduce the time of this matrix-matrix multiplication, lowering the overall asymptotic running time of the LM algorithm by a factor on the order of the number of output nodes in the neural network.
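To make the object under discussion concrete, the sketch below forms the Gauss-Newton approximation G = JᵀJ for a tiny one-hidden-layer network, where J is the Jacobian of the network outputs with respect to all parameters. This is a minimal NumPy illustration of what the Gauss-Newton matrix is; the network architecture and the finite-difference Jacobian are assumptions for illustration, not the paper's backpropagation-based construction.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical tiny network: n_in -> tanh hidden layer -> linear outputs.
n_in, n_hid, n_out = 3, 4, 2
P = n_hid * n_in + n_out * n_hid  # total number of parameters

def forward(theta, x):
    """Evaluate the network for a flat parameter vector theta."""
    W1 = theta[:n_hid * n_in].reshape(n_hid, n_in)
    W2 = theta[n_hid * n_in:].reshape(n_out, n_hid)
    return W2 @ np.tanh(W1 @ x)

theta = rng.standard_normal(P)
x = rng.standard_normal(n_in)

# Jacobian of the n_out outputs w.r.t. all P parameters, one column per
# parameter, by central differences (illustrative only; the paper uses
# backpropagation to avoid this cost).
eps = 1e-6
J = np.zeros((n_out, P))
for p in range(P):
    tp, tm = theta.copy(), theta.copy()
    tp[p] += eps
    tm[p] -= eps
    J[:, p] = (forward(tp, x) - forward(tm, x)) / (2 * eps)

# Gauss-Newton approximation of the Hessian for a squared-error loss.
G = J.T @ J

# G = J^T J is symmetric positive semidefinite by construction.
print(np.allclose(G, G.T))                            # True
print(np.min(np.linalg.eigvalsh(G)) >= -1e-8)         # True
```

The naive product JᵀJ here costs O(n_out · P²); the paper's contribution is to restructure this step with backpropagation so that the factor of n_out (the number of output nodes) is removed from the asymptotic cost.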