
Neural Computation

March 1992, Vol. 4, No. 2, Pages 141-166
(doi: 10.1162/neco.1992.4.2.141)
© 1992 Massachusetts Institute of Technology
First- and Second-Order Methods for Learning: Between Steepest Descent and Newton's Method

On-line first-order backpropagation is sufficiently fast and effective for many large-scale classification problems, but for very high-precision mappings batch processing may be the method of choice. This paper reviews first- and second-order optimization methods for learning in feedforward neural networks. The viewpoint is that of optimization: many learning methods can be cast in the language of optimization techniques, allowing the transfer to neural networks of detailed results about computational complexity and of safety procedures that ensure convergence and avoid numerical problems. The review is not intended to deliver detailed prescriptions for the most appropriate methods in specific applications, but to illustrate the main characteristics of the different methods and their mutual relations.
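The two endpoints named in the title can be sketched in a few lines. The following is a minimal illustrative example (not code from the paper): a steepest-descent step and a Newton step on a simple quadratic, where the matrix `A`, vector `b`, and learning rate `eta` are assumptions chosen for the demonstration.

```python
import numpy as np

# Quadratic objective f(w) = 0.5 * w^T A w - b^T w, whose gradient is
# A w - b and whose Hessian is A. A, b, and eta are illustrative choices.
A = np.array([[3.0, 0.0],
              [0.0, 1.0]])
b = np.array([1.0, 1.0])

def grad(w):
    return A @ w - b

w0 = np.zeros(2)

# First-order (steepest descent): a small step along the negative
# gradient, scaled by a fixed learning rate eta.
eta = 0.1
w_sd = w0 - eta * grad(w0)

# Second-order (Newton's method): the step is the gradient premultiplied
# by the inverse Hessian; on a quadratic this reaches the minimizer
# A^{-1} b in a single step.
w_newton = w0 - np.linalg.solve(A, grad(w0))

print(w_sd)      # [0.1, 0.1] -- one small gradient step
print(w_newton)  # [0.333..., 1.0] -- the exact minimizer of the quadratic
```

The methods the paper surveys between these two extremes (conjugate gradient, quasi-Newton, one-step secant schemes) replace the exact inverse Hessian with cheaper approximations.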