Neural Computation

August 1, 2001, Vol. 13, No. 8, Pages 1891-1920
(doi: 10.1162/08997660152469396)
© 2001 Massachusetts Institute of Technology
Convergent Decomposition Techniques for Training RBF Neural Networks
Abstract

In this article we define globally convergent decomposition algorithms for supervised training of generalized radial basis function neural networks. First, we consider training algorithms based on the two-block decomposition of the network parameters into the vector of weights and the vector of centers. Then we define a decomposition algorithm in which the selection of the center locations is split into sequential minimizations with respect to each center, and we give a suitable criterion for choosing the centers that must be updated at each step. We prove the global convergence of the proposed algorithms and report the computational results obtained for a set of test problems.
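To make the two-block idea concrete, here is a minimal sketch (not the paper's algorithm, and with hypothetical names such as `two_block_train`) of alternating minimization for a Gaussian RBF network: with the centers held fixed, the output weights solve a linear least-squares problem; with the weights held fixed, the centers are updated by a gradient step on the squared error. The fixed width `sigma`, the learning rate, and the iteration counts are illustrative assumptions only.

```python
import numpy as np

def rbf_design(X, centers, sigma=1.0):
    # Gaussian RBF design matrix: Phi[i, j] = exp(-||x_i - c_j||^2 / sigma^2)
    d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
    return np.exp(-d2 / sigma**2)

def two_block_train(X, y, n_centers=5, n_outer=20, lr=0.05, seed=0):
    # Hypothetical two-block decomposition: alternate exact weight updates
    # with gradient steps on the centers (a sketch, not the paper's method).
    rng = np.random.default_rng(seed)
    centers = X[rng.choice(len(X), n_centers, replace=False)].copy()
    w = np.zeros(n_centers)
    for _ in range(n_outer):
        # Block 1: centers fixed -> weights are a linear least-squares solution.
        Phi = rbf_design(X, centers)
        w, *_ = np.linalg.lstsq(Phi, y, rcond=None)
        # Block 2: weights fixed -> one gradient step on each center.
        r = Phi @ w - y  # residuals
        for j in range(n_centers):
            # Gradient of 0.5*||r||^2 w.r.t. c_j, using
            # dPhi[i, j]/dc_j = Phi[i, j] * 2 * (x_i - c_j) / sigma^2 (sigma = 1 here).
            g = (r * w[j] * Phi[:, j])[:, None] * 2.0 * (X - centers[j])
            centers[j] -= lr * g.sum(axis=0)
    # Final weight solve for the updated centers.
    Phi = rbf_design(X, centers)
    w, *_ = np.linalg.lstsq(Phi, y, rcond=None)
    return w, centers

# Toy 1-D regression target.
X = np.linspace(-2, 2, 40)[:, None]
y = np.sin(2 * X[:, 0])
w, centers = two_block_train(X, y)
err = np.abs(rbf_design(X, centers) @ w - y).max()
```

The paper's second algorithm refines the center block further, minimizing with respect to one center at a time and selecting which centers to update at each step; the sketch above only illustrates the coarser two-block split.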