
Neural Computation

May 15, 1998, Vol. 10, No. 4, Pages 1007-1030
(doi: 10.1162/089976698300017601)
© 1998 Massachusetts Institute of Technology
Efficient Adaptive Learning for Classification Tasks with Binary Units
This article presents a new incremental learning algorithm for classification tasks, called NetLines, which is well suited to both binary and real-valued input patterns. It generates small, compact feedforward neural networks with a single hidden layer of binary units and binary output units. A convergence theorem guarantees that solutions with a finite number of hidden units exist for both binary and real-valued input patterns. An implementation for problems with more than two classes, valid for any binary classifier, is proposed. The generalization error and the size of the resulting networks are compared with the best published results on well-known classification benchmarks. Early stopping is shown to decrease overfitting without improving generalization performance.
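The abstract does not detail the multi-class construction. One standard way to extend any binary classifier to more than two classes is a one-versus-rest wrapper, sketched below; the use of a plain perceptron with a ±1 output as the binary learner, and all class and variable names, are illustrative assumptions, not the paper's actual scheme.

```python
import numpy as np

class BinaryPerceptron:
    """Minimal binary classifier with a +/-1 (sign) output unit."""
    def __init__(self, n_features, epochs=50, lr=1.0):
        self.w = np.zeros(n_features)
        self.b = 0.0
        self.epochs = epochs
        self.lr = lr

    def fit(self, X, y):
        # y must be in {-1, +1}; classic perceptron update rule
        for _ in range(self.epochs):
            for xi, yi in zip(X, y):
                if yi * (xi @ self.w + self.b) <= 0:
                    self.w += self.lr * yi * xi
                    self.b += self.lr * yi
        return self

    def score(self, X):
        # Signed margin; sign(score) is the binary +/-1 output
        return X @ self.w + self.b

class OneVsRest:
    """Multi-class wrapper usable with any binary classifier
    exposing fit(X, y) and score(X)."""
    def __init__(self, make_clf):
        self.make_clf = make_clf
        self.classes_ = None
        self.clfs_ = []

    def fit(self, X, y):
        # Train one binary classifier per class: class c vs. the rest
        self.classes_ = np.unique(y)
        for c in self.classes_:
            targets = np.where(y == c, 1.0, -1.0)
            self.clfs_.append(self.make_clf().fit(X, targets))
        return self

    def predict(self, X):
        # Pick the class whose classifier reports the largest margin
        margins = np.stack([c.score(X) for c in self.clfs_], axis=1)
        return self.classes_[np.argmax(margins, axis=1)]
```

For example, on three linearly separable point clusters, `OneVsRest(lambda: BinaryPerceptron(2))` trains three perceptrons, one per class, and predicts by the largest signed margin.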