Neural Computation

September 1, 2003, Vol. 15, No. 9, Pages 2199-2226
(doi: 10.1162/089976603322297359)
© 2003 Massachusetts Institute of Technology
Activation Functions Defined on Higher-Dimensional Spaces for Approximation on Compact Sets With and Without Scaling
Let g be a slowly increasing function of locally bounded variation defined on R^c, 1 ≤ c ≤ d. We investigate when g can serve as the activation function of the hidden-layer units of a three-layer neural network that approximates continuous functions on compact sets. If the support of the Fourier transform of g includes a converging sequence of points with distinct distances from the origin, then g can be an activation function without scaling. It can be an activation function with scaling if and only if the support of its Fourier transform includes a point other than the origin. We also seek a condition under which an activation function can be used for approximation without rotation. Any nonpolynomial function can be an activation function with scaling, and many familiar functions, such as sigmoid functions and radial basis functions, can be activation functions without scaling. With or without scaling, some of these functions defined on R^d can be used without rotation even if they are not spherically symmetric.
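As a numerical illustration of the approximation-with-scaling claim, the following sketch fits a three-layer network (one hidden layer of sigmoid units with random scales and shifts, followed by a linear output layer solved by least squares) to a continuous target on the compact set [0, 1]. The target function, the number of hidden units, and the random-feature construction are all illustrative choices, not part of the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Target: a continuous function on the compact set [0, 1] (illustrative choice).
f = lambda x: np.sin(2 * np.pi * x)

# Hidden layer: sigmoid units g(w*x + b) with random scales w and shifts b,
# i.e., the activation function is used "with scaling".
n_hidden = 50
w = rng.uniform(-10, 10, n_hidden)
b = rng.uniform(-10, 10, n_hidden)
sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))

x = np.linspace(0.0, 1.0, 200)
H = sigmoid(np.outer(x, w) + b)          # hidden-unit activations, shape (200, 50)

# Output layer: linear weights chosen by least squares.
c, *_ = np.linalg.lstsq(H, f(x), rcond=None)

# Sup-norm error of the network on the sampled grid.
err = np.max(np.abs(H @ c - f(x)))
print(err)
```

With a few dozen scaled sigmoid units the grid error is already small, consistent with the sigmoid being a valid activation function (here used with scaling) for approximation on compact sets.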