Global Optimisation of Neural Network Models Via Sequential Sampling

 Nando de Freitas, Mahesan Niranjan, Andrew Gee and Arnaud Doucet

Abstract:
We propose a novel strategy for training neural networks using sequential Monte Carlo algorithms. In particular, we discuss an efficient hybrid gradient descent/sampling importance resampling algorithm, which we recently developed under the name hybrid SIR. This global optimisation approach allows us to learn the probability distribution of the network weights and outputs in a sequential framework. It is well suited to applications involving on-line, nonlinear, non-Gaussian, or non-stationary signal processing. We show that the new algorithm outperforms extended Kalman filter training on several simulated examples and on a real application involving the pricing of option contracts traded in financial markets.
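Since the abstract only outlines the approach, the following is a minimal sketch of the sampling importance resampling (SIR) component applied to sequential neural network training. It treats each particle as a candidate weight vector for a small one-hidden-layer network, propagates the particles with a random-walk model, reweights them by the likelihood of each incoming observation, and resamples. The network size, noise levels, and toy data stream are illustrative assumptions, not the authors' settings, and the gradient descent moves of the full hybrid SIR algorithm are omitted for brevity.

```python
import numpy as np

rng = np.random.default_rng(0)
HIDDEN = 5                              # assumed tiny architecture
DIM = 3 * HIDDEN + 1                    # W1, b1, W2 weights plus output bias
SIGMA_W, SIGMA_Y = 0.05, 0.1            # assumed process and observation noise

def mlp(w, x):
    """One-hidden-layer MLP; w is a flat weight vector (assumed layout)."""
    W1 = w[:HIDDEN].reshape(HIDDEN, 1)
    b1 = w[HIDDEN:2 * HIDDEN]
    W2 = w[2 * HIDDEN:3 * HIDDEN]
    b2 = w[-1]
    h = np.tanh(W1 @ np.atleast_2d(x) + b1[:, None])
    return float(W2 @ h + b2)

def sir_step(particles, x_t, y_t):
    """One SIR update on the weight particles: jitter, reweight, resample."""
    # 1. Propagate: random-walk transition model on the network weights.
    particles = particles + SIGMA_W * rng.normal(size=particles.shape)
    # 2. Importance weights from a Gaussian observation likelihood.
    preds = np.array([mlp(w, x_t) for w in particles])
    logw = -0.5 * ((y_t - preds) / SIGMA_Y) ** 2
    weights = np.exp(logw - logw.max())
    weights /= weights.sum()
    # 3. Resample particles in proportion to their weights.
    idx = rng.choice(len(particles), size=len(particles), p=weights)
    return particles[idx]

# Run on a toy nonlinear regression stream.
particles = rng.normal(0.0, 1.0, size=(500, DIM))
for t in range(200):
    x_t = rng.uniform(-2, 2)
    y_t = np.sin(x_t) + SIGMA_Y * rng.normal()
    particles = sir_step(particles, x_t, y_t)

print("posterior-mean prediction at x=1:",
      np.mean([mlp(w, 1.0) for w in particles]))
```

Because each particle is a full weight vector, the particle set approximates the posterior distribution over network weights rather than a single point estimate, which is what suits the method to the on-line and non-stationary settings the abstract mentions.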

© 2010 The MIT Press