Neural Computation
Monthly, 288 pp. per issue, 6 x 9, illustrated
ISSN: 0899-7667
E-ISSN: 1530-888X
2014 Impact Factor: 2.21

October 2009, Vol. 21, No. 10, Pages 2970-2989
(doi: 10.1162/neco.2009.04-08-745)
© 2009 Massachusetts Institute of Technology
An Integral Upper Bound for Neural Network Approximation
Abstract

The complexity of one-hidden-layer networks is studied using tools from nonlinear approximation and integration theory. For functions with suitable integral representations in the form of networks with infinitely many hidden units, upper bounds are derived on the rate at which the approximation error decreases as the number of network units increases. These bounds are obtained for various norms using the framework of Bochner integration. The results are applied to perceptron networks.
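The phenomenon the abstract describes — approximation error shrinking as hidden units are added to a one-hidden-layer network — can be illustrated numerically. The sketch below is an assumption-laden toy, not the paper's construction: it fits a smooth target with least squares over nested subsets of random sigmoidal units (the random inner weights, the target function, and all parameter values are illustrative choices), so the training error is provably nonincreasing in the number of units.

```python
import numpy as np

# Toy illustration (not the paper's method): approximate a smooth target
# by a one-hidden-layer sigmoidal network and watch the least-squares
# error shrink as hidden units are added.
rng = np.random.default_rng(0)
x = np.linspace(-1.0, 1.0, 200)
f = np.sin(2.0 * np.pi * x)  # illustrative target function

def sigmoid(t):
    return 1.0 / (1.0 + np.exp(-t))

# One large dictionary of random sigmoidal units; taking the first n
# columns gives nested approximation spaces, so the least-squares
# residual cannot increase as n grows.
W = rng.normal(scale=4.0, size=64)  # random inner weights (assumption)
B = rng.normal(scale=2.0, size=64)  # random biases (assumption)
Phi = sigmoid(np.outer(x, W) + B)   # (200, 64) hidden-unit outputs

def approx_error(n):
    """RMS error of the best linear combination of the first n units."""
    coef, *_ = np.linalg.lstsq(Phi[:, :n], f, rcond=None)
    return np.linalg.norm(Phi[:, :n] @ coef - f) / np.sqrt(len(x))

errors = {n: approx_error(n) for n in (4, 16, 64)}
for n, e in errors.items():
    print(f"n = {n:2d} hidden units: RMS error = {e:.4f}")
```

Because the subspaces are nested, the printed errors are monotonically nonincreasing; the integral upper bounds in the article quantify how fast such errors must decay for functions admitting the stated integral representations.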