
Neural Computation

July 2008, Vol. 20, No. 7, Pages 1706-1716
(doi: 10.1162/neco.2008.10-06-351)
© 2008 Massachusetts Institute of Technology
Online Learning with Hidden Markov Models

We present an online version of the expectation-maximization (EM) algorithm for hidden Markov models (HMMs). The sufficient statistics required for parameter estimation are computed recursively in time, that is, in an online way, instead of with the batch forward-backward procedure. This computational scheme is generalized to the case where the model parameters change with time by introducing a discount factor into the recurrence relations. For an appropriate discount factor and schedule of parameter updates, the resulting algorithm is equivalent to the batch EM algorithm. At the same time, the online algorithm can deal with dynamic environments, that is, with observed data whose statistics change over time. The implications of the online algorithm for probabilistic modeling in neuroscience are briefly discussed.
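The flavor of the scheme can be illustrated with a highly simplified sketch: a filtered state posterior is propagated one observation at a time, the sufficient statistics (expected transition and emission counts) are updated recursively with a discount factor, and the parameters are re-estimated by normalizing those statistics. This is not the paper's exact recursion; all names (`A`, `B`, `S_trans`, `S_emit`, `gamma`) and the 2-state, 2-symbol setup are illustrative assumptions.

```python
import random

def normalize(v):
    # Renormalize a nonnegative vector to sum to 1 (uniform if all zero).
    s = sum(v)
    return [x / s for x in v] if s > 0 else [1.0 / len(v)] * len(v)

def online_em_step(alpha, obs, A, B, S_trans, S_emit, gamma):
    """One step of a discounted online-EM sketch (illustrative, not the
    paper's exact recursion).

    alpha:   filtered state posterior P(x_t | y_1..t)
    obs:     new observation symbol y_{t+1}
    A, B:    current transition and emission probabilities (row-stochastic)
    S_*:     running (discounted) sufficient statistics
    gamma:   discount factor / step size in (0, 1]
    """
    n, m = len(A), len(B[0])
    # E-step fragment: posterior over the joint (x_t, x_{t+1}) given obs.
    xi = [[alpha[i] * A[i][j] * B[j][obs] for j in range(n)] for i in range(n)]
    z = sum(sum(row) for row in xi)
    xi = [[x / z for x in row] for row in xi]
    new_alpha = normalize([sum(xi[i][j] for i in range(n)) for j in range(n)])
    # Discounted recursive update of the sufficient statistics: old values
    # are forgotten at rate gamma, so changing data statistics are tracked.
    for i in range(n):
        for j in range(n):
            S_trans[i][j] = (1 - gamma) * S_trans[i][j] + gamma * xi[i][j]
    for j in range(n):
        for k in range(m):
            inc = new_alpha[j] if k == obs else 0.0
            S_emit[j][k] = (1 - gamma) * S_emit[j][k] + gamma * inc
    # M-step: re-estimate parameters by normalizing the statistics.
    for i in range(n):
        A[i] = normalize(S_trans[i])
    for j in range(n):
        B[j] = normalize(S_emit[j])
    return new_alpha

# Usage: 2-state, 2-symbol HMM driven by random observations (illustrative).
random.seed(0)
A = [[0.5, 0.5], [0.5, 0.5]]
B = [[0.6, 0.4], [0.4, 0.6]]
S_trans = [[0.25, 0.25], [0.25, 0.25]]
S_emit = [[0.25, 0.25], [0.25, 0.25]]
alpha = [0.5, 0.5]
for t in range(500):
    obs = random.randint(0, 1)
    alpha = online_em_step(alpha, obs, A, B, S_trans, S_emit, gamma=0.05)
print([round(sum(row), 6) for row in A])  # rows stay normalized throughout
```

Setting `gamma` by a decreasing schedule recovers batch-EM-like behavior, while a constant `gamma` keeps forgetting old data, which is what lets the online version track nonstationary statistics.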