Neural Computation

September 1994, Vol. 6, No. 5, Pages 877-884
(doi: 10.1162/neco.1994.6.5.877)
© 1994 Massachusetts Institute of Technology
Neural Nets with Superlinear VC-Dimension
Abstract

It has long been known that the Vapnik-Chervonenkis dimension (VC-dimension) of a feedforward neural net with linear threshold gates is at most O(w · log w), where w is the total number of weights in the net. In this paper we show that this bound is asymptotically optimal: for every depth d ≥ 3 we exhibit a large class of feedforward neural nets of depth d with w weights whose VC-dimension is Ω(w · log w). This lower bound holds even when the inputs are restricted to Boolean values. The proof relies on a new method that allows us to encode more “program-bits” in the weights of a neural net than was previously thought possible.
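For the result at a glance, here is a minimal LaTeX restatement of the two bounds under the standard definition of VC-dimension; the symbol \mathcal{N}_w (a threshold net with w weights) and the \operatorname{VCdim} notation are our shorthand for this summary, not notation taken from the paper itself.

% Known upper bound: any feedforward net of linear threshold
% gates with w weights in total satisfies
\[
  \operatorname{VCdim}(\mathcal{N}_w) = O(w \log w).
\]
% Main result of the paper (lower bound): for every fixed depth
% d \ge 3 there is a family of depth-d threshold nets
% \mathcal{N}_w with w weights, with Boolean inputs, such that
\[
  \operatorname{VCdim}(\mathcal{N}_w) = \Omega(w \log w),
\]
% so for these families the upper bound is asymptotically tight:
% \operatorname{VCdim}(\mathcal{N}_w) = \Theta(w \log w).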