
Neural Computation

September 1995, Vol. 7, No. 5, Pages 1040-1053
(doi: 10.1162/neco.1995.7.5.1040)
© 1995 Massachusetts Institute of Technology
Lower Bounds on the VC Dimension of Smoothly Parameterized Function Classes
Abstract

We examine the relationship between the VC dimension and the number of parameters of a thresholded, smoothly parameterized function class. We show that the VC dimension of such a class is at least k if there exists a k-dimensional differentiable manifold in the parameter space such that each member of the manifold corresponds to a different decision boundary. Using this result, we obtain lower bounds on the VC dimension proportional to the number of parameters for several thresholded function classes, including two-layer neural networks with certain smooth activation functions and radial basis functions with a Gaussian basis. These lower bounds hold even if the magnitudes of the parameters are restricted to be arbitrarily small. In Valiant's probably approximately correct (PAC) learning framework, this implies that the number of examples needed to learn these function classes is at least linear in the number of parameters.
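
As a reading aid, the main claim of the abstract can be restated informally in the sketch below. The symbols F, Theta, W, M, and k are notation chosen here for illustration and are not taken from the paper itself; the precise hypotheses and constants are given in the article.

% Informal restatement of the abstract's main claim (notation is ours, not the paper's).
\documentclass{article}
\usepackage{amsmath,amssymb}
\begin{document}

Let $F = \{\, \operatorname{sgn}(f_\theta) : \theta \in \Theta \subseteq \mathbb{R}^W \,\}$
be a thresholded, smoothly parameterized function class.
If there exists a $k$-dimensional differentiable manifold $M \subseteq \Theta$
such that distinct points of $M$ induce distinct decision boundaries, then
\[
  \operatorname{VCdim}(F) \;\ge\; k .
\]
Combined with standard PAC sample-complexity lower bounds, learning $F$ to a
fixed accuracy then requires $\Omega\!\left(\operatorname{VCdim}(F)\right) = \Omega(k)$
examples, i.e., a number of examples at least linear in the number of parameters.

\end{document}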