Neural Computation

July 1, 1999, Vol. 11, No. 5, Pages 1183-1198
(doi: 10.1162/089976699300016403)
© 1999 Massachusetts Institute of Technology
On the Approximation Rate of Hierarchical Mixtures-of-Experts for Generalized Linear Models
Abstract

We investigate a class of hierarchical mixtures-of-experts (HME) models in which generalized linear models with nonlinear mean functions of the form ψ(α + xᵀβ) are mixed. Here ψ(·) is the inverse link function. It is shown that mixtures of such mean functions can approximate a class of smooth functions of the form ψ(h(x)), where h(·) ∈ W₂^∞ (a Sobolev class over [0, 1]^s), as the number of experts m in the network increases. An upper bound on the approximation rate is given as O(m^(−2/s)) in the L_p norm. This rate can be achieved within the family of HME structures with no more than s layers, where s is the dimension of the predictor x.
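To make the object being analyzed concrete, here is a minimal sketch (not the paper's construction) of a one-layer mixture-of-experts mean function with a logistic inverse link: softmax gating weights mix m GLM expert means ψ(αⱼ + xᵀβⱼ). All parameter names (gate_V, gate_c, alphas, betas) are illustrative assumptions.

```python
import numpy as np

def sigmoid(z):
    """Inverse link psi for a logistic GLM."""
    return 1.0 / (1.0 + np.exp(-z))

def moe_mean(x, gate_V, gate_c, alphas, betas, psi=sigmoid):
    """One-layer mixture-of-experts mean: sum_j g_j(x) * psi(alpha_j + x^T beta_j).

    x: predictor, shape (s,)
    gate_V, gate_c: softmax gating parameters, shapes (m, s) and (m,)
    alphas, betas: GLM expert parameters, shapes (m,) and (m, s)
    """
    logits = gate_V @ x + gate_c          # gating scores, shape (m,)
    g = np.exp(logits - logits.max())     # stabilized softmax
    g /= g.sum()                          # gating weights sum to 1
    experts = psi(alphas + betas @ x)     # m GLM expert means
    return float(g @ experts)             # mixed mean in (0, 1)

rng = np.random.default_rng(0)
s, m = 3, 4
x = rng.normal(size=s)
y = moe_mean(x, rng.normal(size=(m, s)), rng.normal(size=m),
             rng.normal(size=m), rng.normal(size=(m, s)))
```

The hierarchical case nests such gated mixtures up to s layers deep; the abstract's result bounds how fast the best m-expert mean can approach ψ(h(x)) as m grows.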