
Neural Computation

August 1, 2001, Vol. 13, No. 8, Pages 1863-1889
(doi: 10.1162/08997660152469387)
© 2001 Massachusetts Institute of Technology
Subspace Information Criterion for Model Selection

The problem of model selection is of central importance for achieving good generalization in supervised learning. In this article, we propose a new criterion for model selection, the subspace information criterion (SIC), which is a generalization of Mallows's CL. It is assumed that the learning target function belongs to a specified functional Hilbert space, and the generalization error is defined as the squared Hilbert-space norm of the difference between the learning result function and the target function. SIC gives an unbiased estimate of the generalization error so defined. SIC assumes the availability of an unbiased estimate of the target function and of the noise covariance matrix, both of which are generally unknown. A practical method for calculating SIC in least-mean-squares learning is provided under the assumption that the dimension of the Hilbert space is less than the number of training examples. Finally, computer simulations on two examples show that SIC works well even when the number of training examples is small.
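To make the idea concrete, here is a minimal numerical sketch of an SIC-style criterion for linear estimators of the form f̂ = Xy. It assumes (as the abstract states) that an unbiased estimator f̂u = Xu·y of the target and the noise covariance matrix Q are available; all specific names, the ridge candidate models, and the toy dimensions below are illustrative assumptions, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy setup (illustrative, not from the paper): the target is a vector
# f_true, observed through a design matrix A with additive Gaussian
# noise of known covariance Q.
d, n, sigma = 5, 50, 0.3
A = rng.standard_normal((n, d))
f_true = rng.standard_normal(d)
Q = sigma**2 * np.eye(n)            # noise covariance, assumed known


def sic(X, Xu, y, Q):
    """SIC-style unbiased estimate of E||X y - f_true||^2 for a linear
    estimator f_hat = X y, given an unbiased estimator f_hat_u = Xu y."""
    r = (X - Xu) @ y
    return (r @ r
            - np.trace((X - Xu) @ Q @ (X - Xu).T)
            + np.trace(X @ Q @ X.T))


# Unbiased reference estimator: ordinary least squares (pseudo-inverse).
Xu = np.linalg.pinv(A)


def ridge_matrix(lam):
    """Candidate model: ridge estimator with regularization lam."""
    return np.linalg.solve(A.T @ A + lam * np.eye(d), A.T)


# Monte Carlo check of unbiasedness: the average SIC value should track
# the average true generalization error ||f_hat - f_true||^2.
X = ridge_matrix(1.0)
sics, errs = [], []
for _ in range(2000):
    y = A @ f_true + sigma * rng.standard_normal(n)
    sics.append(sic(X, Xu, y, Q))
    errs.append(np.sum((X @ y - f_true) ** 2))
```

In a model-selection loop one would compute `sic(...)` for each candidate learning matrix (e.g., a range of ridge parameters) and pick the candidate with the smallest value; the true error is of course unavailable in practice and appears here only to check the estimate.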