Neural Computation

October 2020, Vol. 32, No. 10, Pages 1980-1997
(doi: 10.1162/neco_a_01313)
© 2020 Massachusetts Institute of Technology
Analysis of Regression Algorithms with Unbounded Sampling
Abstract
In this letter, we study a class of regularized regression algorithms in which the sampling process is unbounded. By choosing different loss functions, this class covers a wide range of commonly used regression algorithms. Unlike prior theoretical work on unbounded sampling, no boundedness constraint is imposed on the output variables in our setting. Through a careful error analysis, we prove consistency and finite-sample bounds on the excess risk of the proposed algorithms under mild regularity conditions.
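For orientation, a minimal sketch of the kind of regularized scheme the abstract refers to is given below; the hypothesis space \(\mathcal{H}_K\), sample \(\mathbf{z}=\{(x_i,y_i)\}_{i=1}^{m}\), loss \(\ell\), and regularization parameter \(\lambda\) are notation assumed for illustration and are not taken from the article itself:

\[
f_{\mathbf{z},\lambda} \;=\; \arg\min_{f \in \mathcal{H}_K} \left\{ \frac{1}{m} \sum_{i=1}^{m} \ell\bigl(f(x_i), y_i\bigr) \;+\; \lambda \,\lVert f \rVert_K^{2} \right\}.
\]

Taking \(\ell\) to be the least-squares loss recovers kernel ridge regression, while other convex losses yield other standard regression algorithms, consistent with the class described in the abstract. "Unbounded sampling" typically means the outputs \(y\) are not restricted to a bounded interval (for example, targets corrupted by Gaussian noise), which is the restriction the letter dispenses with.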