
Evolutionary Computation

Summer 2007, Vol. 15, No. 2, Pages 133-168
(doi: 10.1162/evco.2007.15.2.133)
© 2007 by the Massachusetts Institute of Technology
Generalization in the XCSF Classifier System: Analysis, Improvement, and Extension

We analyze generalization in XCSF and introduce three improvements. We begin by showing that the types of generalizations evolved by XCSF can be influenced by the input range. To explain these results, we present a theoretical analysis of the convergence of classifier weights in XCSF which highlights a broader issue. In XCSF, because of the mathematical properties of the Widrow-Hoff update, the convergence of classifier weights in a given subspace can be slow when the eigenvalues of the autocorrelation matrix associated with each classifier are widely spread. As a major consequence, the system's accuracy pressure may act before classifier weights are adequately updated, so that XCSF may evolve piecewise constant approximations instead of the intended, and more efficient, piecewise linear ones. We propose three different ways to update classifier weights in XCSF so as to increase its generalization capabilities: one based on a condition-based normalization of the inputs, one based on linear least squares, and one based on the recursive version of linear least squares. Through a series of experiments we show that, while all three approaches significantly improve XCSF, the least squares approaches appear to be the best performing and most robust. Finally, we show how XCSF can be extended to include polynomial approximations.
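A minimal sketch (not the paper's implementation) of the contrast the abstract describes: the Widrow-Hoff (LMS) rule that standard XCSF uses for classifier weights, versus a recursive least squares (RLS) update, one of the proposed alternatives. The toy target function, input ranges, and learning parameters below are illustrative assumptions. With inputs shifted to [900, 1000], the eigenvalues of the input autocorrelation matrix are widely spread; an LMS learning rate tuned for inputs in [0, 1] would diverge on that range, whereas the RLS gain is rescaled by the inverse correlation matrix and converges regardless of input scale.

```python
import random

random.seed(0)
W_TRUE = (3.0, 0.5)  # assumed toy target: y = 3 + 0.5*x

def sample(lo, hi):
    x = random.uniform(lo, hi)
    return x, W_TRUE[0] + W_TRUE[1] * x

# --- Widrow-Hoff (LMS) on inputs in [0, 1], with the usual constant
# --- input term x0 = 1, so phi = (1, x). The rate eta must be tuned to
# --- the input scale; convergence slows as the eigenvalue spread of
# --- E[phi phi^T] grows.
eta = 0.2
w_lms = [0.0, 0.0]
for _ in range(5000):
    x, y = sample(0.0, 1.0)
    err = y - (w_lms[0] + w_lms[1] * x)
    w_lms[0] += eta * err          # update along x0 = 1
    w_lms[1] += eta * err * x      # update along x

# --- Recursive least squares on inputs shifted to [900, 1000].
# --- P approximates the inverse input correlation matrix, so the
# --- per-sample gain k is automatically rescaled per direction.
delta = 1e6                        # large prior: weak regularization 1/delta
P = [[delta, 0.0], [0.0, delta]]
w_rls = [0.0, 0.0]
for _ in range(200):
    x, y = sample(900.0, 1000.0)
    phi = (1.0, x)
    Pphi = [P[0][0] * phi[0] + P[0][1] * phi[1],
            P[1][0] * phi[0] + P[1][1] * phi[1]]
    denom = 1.0 + phi[0] * Pphi[0] + phi[1] * Pphi[1]
    k = [Pphi[0] / denom, Pphi[1] / denom]      # Kalman-style gain
    err = y - (w_rls[0] * phi[0] + w_rls[1] * phi[1])
    w_rls[0] += k[0] * err
    w_rls[1] += k[1] * err
    # P <- P - k (phi^T P); P stays symmetric, so phi^T P equals Pphi
    for i in range(2):
        for j in range(2):
            P[i][j] -= k[i] * Pphi[j]
```

Despite the thousand-fold difference in input scale, the RLS weights recover the target coefficients closely after a few hundred samples, while LMS would need its rate (and, in the paper's first proposal, the inputs themselves) renormalized to cope with the shifted range.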