
Neural Computation

June 1, 2000, Vol. 12, No. 6, Pages 1411-1427
(doi: 10.1162/089976600300015439)
© 2000 Massachusetts Institute of Technology
Nonmonotonic Generalization Bias of Gaussian Mixture Models
Abstract

Theories of learning and generalization hold that the generalization bias, defined as the difference between the generalization error and the training error, increases on average with the number of adaptive parameters. This article, however, shows that this general tendency is violated for a Gaussian mixture model. For temperatures just below the first symmetry-breaking point, the effective number of adaptive parameters increases while the generalization bias decreases. We compute the dependence of the Network Information Criterion on temperature around the symmetry breaking. Our results are confirmed by numerical cross-validation experiments.
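The cross-validation estimate of the generalization bias mentioned in the abstract can be illustrated with a short sketch. The following Python snippet is only an assumed, minimal setup (synthetic 1-D data, scikit-learn's standard EM fit, and K-fold splits); it does not reproduce the paper's temperature-controlled (annealed) formulation, and all variable and function names are hypothetical. It simply measures the gap between held-out and training negative log-likelihood as the number of mixture components varies.

```python
# Minimal sketch (not the authors' code): estimate the generalization bias of a
# Gaussian mixture model as the gap between held-out and training negative
# log-likelihood, averaged over K folds. The paper's temperature parameter
# (deterministic-annealing EM) is not modeled here.
import numpy as np
from sklearn.mixture import GaussianMixture
from sklearn.model_selection import KFold

rng = np.random.default_rng(0)
# Toy data: two well-separated 1-D Gaussian clusters (assumed, for illustration).
X = np.concatenate([rng.normal(-2.0, 1.0, 500), rng.normal(2.0, 1.0, 500)])[:, None]

def generalization_bias(X, n_components, n_splits=5):
    """Average (held-out NLL - training NLL) per sample over K folds."""
    gaps = []
    for train_idx, test_idx in KFold(n_splits, shuffle=True, random_state=0).split(X):
        gmm = GaussianMixture(n_components=n_components, random_state=0).fit(X[train_idx])
        train_nll = -gmm.score(X[train_idx])   # mean negative log-likelihood on training data
        test_nll = -gmm.score(X[test_idx])     # mean negative log-likelihood on held-out data
        gaps.append(test_nll - train_nll)
    return float(np.mean(gaps))

for k in (1, 2, 3, 4):
    print(f"components={k}  estimated bias={generalization_bias(X, k):+.4f}")
```

Under the conventional picture, the printed bias estimate would tend to grow with the number of components; the article's point is that near the first symmetry-breaking temperature this monotonic behavior breaks down for the annealed mixture model.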