Neural Computation

September 2009, Vol. 21, No. 9, Pages 2667-2686
(doi: 10.1162/neco.2009.07-08-809)
© 2009 Massachusetts Institute of Technology
Limited Stochastic Meta-Descent for Kernel-Based Online Learning
Abstract

To improve the single-run performance of online learning and strengthen its stability, we consider online learning with a limited adaptive learning rate in this letter. The letter extends the convergence proofs for NORMA to a range of step sizes and then employs support vector learning with stochastic meta-descent (SVMD), with step-size adaptation restricted to that range, to obtain an online kernel algorithm that combines theoretical convergence guarantees with good practical performance. Experiments on several data sets corroborate the theoretical results and show that our method is a promising approach to online learning.
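
The idea of clamping an adapted step size to a range for which convergence is proved can be illustrated with a small sketch. The Python code below is reconstructed from the abstract alone, not from the authors' implementation: the RBF kernel, the hinge loss, the gradient-correlation gain adaptation (a simplified stand-in for SVMD), and all parameter values (lam, eta0, mu, eta_min, eta_max, gamma) are illustrative assumptions; only the clamping of the adapted step size to a fixed admissible range reflects the letter's stated approach.

import numpy as np


def rbf_kernel(x, z, gamma=1.0):
    # Gaussian RBF kernel; gamma is an assumed, illustrative bandwidth.
    x, z = np.asarray(x, dtype=float), np.asarray(z, dtype=float)
    return float(np.exp(-gamma * np.sum((x - z) ** 2)))


class LimitedAdaptiveKernelLearner:
    # NORMA-style online kernel learner for binary labels y in {-1, +1}.
    # The step size eta is adapted multiplicatively and then clamped to
    # [eta_min, eta_max]; the adaptation rule is a simplified
    # gradient-correlation heuristic standing in for SVMD.

    def __init__(self, lam=0.01, eta0=0.5, mu=0.05,
                 eta_min=0.05, eta_max=1.0, gamma=1.0):
        self.lam, self.eta, self.mu = lam, eta0, mu
        self.eta_min, self.eta_max = eta_min, eta_max
        self.gamma = gamma
        self.support, self.alpha = [], []  # kernel expansion f = sum_i alpha_i k(x_i, .)
        self.prev_grad = 0.0

    def predict(self, x):
        return sum(a * rbf_kernel(xi, x, self.gamma)
                   for a, xi in zip(self.alpha, self.support))

    def update(self, x, y):
        # Hinge-loss (sub)gradient with respect to the function value f(x).
        grad = -y if y * self.predict(x) < 1.0 else 0.0

        # Gain adaptation: grow eta when successive gradients agree,
        # shrink it when they oscillate, then clamp to the admissible range.
        self.eta *= float(np.exp(self.mu * self.prev_grad * grad))
        self.eta = float(np.clip(self.eta, self.eta_min, self.eta_max))
        self.prev_grad = grad

        # NORMA update: shrink all stored coefficients (regularization),
        # then add a new expansion term for the current example if needed.
        decay = 1.0 - self.eta * self.lam
        self.alpha = [decay * a for a in self.alpha]
        if grad != 0.0:
            self.support.append(np.asarray(x, dtype=float))
            self.alpha.append(-self.eta * grad)

Calling update(x_t, y_t) once per arriving example and predict(x) for the current hypothesis yields a single-pass learner; the clamp is what keeps the adapted step size inside the range covered by the convergence analysis.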