Evolutionary Computation

Summer 1997, Vol. 5, No. 2, Pages 213-236
(doi: 10.1162/evco.1997.5.2.213)
© 1997 by the Massachusetts Institute of Technology
Evolutionary Induction of Sparse Neural Trees
This paper is concerned with the automatic induction of parsimonious neural networks. In contrast to other program induction situations, network induction entails parametric learning as well as structural adaptation. We present a novel representation scheme called neural trees that allows efficient learning of both network architectures and parameters by genetic search. A hybrid evolutionary method is developed for neural tree induction that combines genetic programming and the breeder genetic algorithm under the unified framework of the minimum description length principle. The method is successfully applied to the induction of higher order neural trees while still keeping the resulting structures sparse to ensure good generalization performance. Empirical results are provided on two chaotic time series prediction problems of practical interest.
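To make the neural tree idea concrete, here is a minimal sketch, not the paper's exact formulation: internal nodes are neurons whose inputs are weighted subtrees, leaves are input variables, and an MDL-style fitness adds a complexity penalty (here, a crude parameter count, with an assumed weighting `alpha`) to the prediction error so that sparser trees win at equal accuracy.

```python
import math

class NeuralTreeNode:
    """Illustrative neural tree node: internal nodes are sigmoid neurons,
    leaves index an input variable. Field names are assumptions."""

    def __init__(self, weights=None, children=None, var_index=None):
        self.weights = weights or []     # one weight per child (internal node)
        self.children = children or []   # subtrees feeding this neuron
        self.var_index = var_index       # input index (leaf node)

    def evaluate(self, x):
        if self.var_index is not None:   # leaf: return the raw input value
            return x[self.var_index]
        # internal node: weighted sum of child outputs through a sigmoid
        s = sum(w * c.evaluate(x) for w, c in zip(self.weights, self.children))
        return 1.0 / (1.0 + math.exp(-s))

    def size(self):
        # number of weights in the subtree: a crude description-length proxy
        return len(self.weights) + sum(c.size() for c in self.children)

def mdl_fitness(tree, data, alpha=0.1):
    """MDL-style fitness: mean squared error plus a complexity penalty,
    so genetic search is pushed toward parsimonious structures."""
    err = sum((tree.evaluate(x) - y) ** 2 for x, y in data) / len(data)
    return err + alpha * tree.size()

# Usage: one neuron over two input leaves
root = NeuralTreeNode(
    weights=[1.0, -1.0],
    children=[NeuralTreeNode(var_index=0), NeuralTreeNode(var_index=1)],
)
print(root.evaluate([0.0, 0.0]))                    # sigmoid(0) = 0.5
print(mdl_fitness(root, [([0.0, 0.0], 0.5)]))       # 0 error + 0.1 * 2 params
```

In a genetic-programming setting, crossover and mutation would then act on these tree structures while a parametric method (the breeder genetic algorithm in the paper) tunes the weights.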