
Neural Computation

January 1, 2006, Vol. 18, No. 1, Pages 1-9
(doi: 10.1162/089976606774841576)
© 2005 Massachusetts Institute of Technology
Mixture Models Based on Neural Network Averaging

A modified version of the single hidden-layer perceptron architecture is proposed for modeling mixtures. A particularly flexible mixture model is obtained by using the Box-Cox transformation as the transfer function. In this case, the network response can be expressed in closed form as a weighted power mean. The quadratic Scheffé K-polynomial and the exponential Wilson equation turn out to be special forms of this general mixture model. Advantages of the proposed network architecture are that binary data sets suffice for “training” and that it is readily extended to incorporate additional mixture components while retaining all previously determined weights.
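The closed-form response described above can be sketched numerically. The sketch below is an illustrative assumption, not the paper's implementation: it composes the Box-Cox transformation with its inverse around a weighted sum, which, for weights summing to one, reduces to the weighted power mean (Σ wᵢ xᵢ^λ)^(1/λ), with the geometric mean recovered in the λ → 0 limit. The function names and the parameter `lam` are hypothetical.

```python
import math

def box_cox(x, lam):
    # Box-Cox transformation; lam = 0 is the natural-log limit.
    return math.log(x) if lam == 0 else (x ** lam - 1.0) / lam

def inv_box_cox(y, lam):
    # Inverse Box-Cox transformation.
    return math.exp(y) if lam == 0 else (lam * y + 1.0) ** (1.0 / lam)

def weighted_power_mean(x, w, lam):
    # Hypothetical network response: inverse Box-Cox of a weighted
    # sum of Box-Cox-transformed inputs. With sum(w) == 1 this
    # equals the weighted power mean (sum_i w_i * x_i**lam)**(1/lam).
    s = sum(wi * box_cox(xi, lam) for wi, xi in zip(w, x))
    return inv_box_cox(s, lam)

# Familiar special cases for a 50/50 binary mixture of 2 and 8:
print(weighted_power_mean([2.0, 8.0], [0.5, 0.5], 1.0))   # arithmetic mean, 5.0
print(weighted_power_mean([2.0, 8.0], [0.5, 0.5], 0.0))   # geometric mean, 4.0
print(weighted_power_mean([2.0, 8.0], [0.5, 0.5], -1.0))  # harmonic mean, 3.2
```

Varying λ thus interpolates between familiar mixing rules, which is one way to read the claim that linear (Scheffé-type) and exponential (Wilson-type) mixture forms arise as special cases of a single architecture.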