
Training Methods for Adaptive Boosting of Neural Networks

Holger Schwenk and Yoshua Bengio

Abstract:
Boosting is a general method for improving the performance of any learning algorithm that consistently generates classifiers which need to perform only slightly better than random guessing. A recently proposed and very promising boosting algorithm is AdaBoost. It has been applied with great success to several benchmark machine learning problems using rather simple learning algorithms, in particular decision trees. In this paper we use AdaBoost to improve the performance of neural networks applied to character recognition tasks. We compare training methods based on sampling the training set and on weighting the cost function. Our system achieves about a 1.4% error rate on a database of online handwritten digits from more than 200 writers. Adaptive boosting of a multi-layer network achieved less than 2% error on the UCI Letters offline character data set.
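The abstract contrasts two ways of making the learner respect the AdaBoost example weights: resampling the training set according to the weights, or weighting each example's term in the cost function. The sketch below illustrates both variants with a generic discrete-AdaBoost loop; it is not the authors' code, and the weak-learner factory, the `sample_weight` hook, and all function names are assumptions for illustration (the hook follows scikit-learn's convention, whereas the paper boosts neural networks).

import numpy as np

def adaboost(X, y, make_learner, n_rounds=10, resample=True, seed=0):
    """Discrete AdaBoost with two ways of using the example weights."""
    rng = np.random.default_rng(seed)
    n = len(X)
    w = np.full(n, 1.0 / n)              # example weights, initially uniform
    learners, alphas = [], []
    for _ in range(n_rounds):
        h = make_learner()               # hypothetical weak-learner factory
        if resample:
            # Variant 1 (sampling): train on a weighted bootstrap sample.
            idx = rng.choice(n, size=n, p=w)
            h.fit(X[idx], y[idx])
        else:
            # Variant 2 (weighting): weight each example's loss term directly
            # (assumes the learner accepts per-example weights, as with
            # scikit-learn's `sample_weight`).
            h.fit(X, y, sample_weight=w * n)
        pred = h.predict(X)
        err = w[pred != y].sum()         # weighted training error
        if err == 0.0 or err >= 0.5:     # weak-learning condition violated
            break
        alpha = 0.5 * np.log((1.0 - err) / err)
        w[pred != y] *= np.exp(alpha)    # emphasize misclassified examples
        w[pred == y] *= np.exp(-alpha)
        w /= w.sum()                     # renormalize to a distribution
        learners.append(h)
        alphas.append(alpha)
    return learners, alphas

def boosted_predict(learners, alphas, X, classes):
    """Classify by the alpha-weighted vote of the ensemble."""
    votes = np.zeros((len(X), len(classes)))
    for h, a in zip(learners, alphas):
        pred = h.predict(X)
        for k, c in enumerate(classes):
            votes[pred == c, k] += a
    return np.asarray(classes)[votes.argmax(axis=1)]

# Usage sketch, with decision stumps standing in for the neural network
# (scikit-learn assumed available):
#   from sklearn.tree import DecisionTreeClassifier
#   learners, alphas = adaboost(X, y, lambda: DecisionTreeClassifier(max_depth=1))
#   y_hat = boosted_predict(learners, alphas, X_test, classes=np.unique(y))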

 
 

