Abstract:
This paper presents a novel practical framework for Bayesian
model averaging and model selection in probabilistic graphical
models. Our approach approximates full posterior distributions over
model parameters and structures, as well as latent variables, in an
analytical manner. Unlike large-sample approximations, these
posteriors are generally non-Gaussian, and no Hessian needs to be
computed. The resulting algorithm generalizes the standard
Expectation Maximization algorithm, and its convergence is
guaranteed. We demonstrate that this approach can be applied to a
large class of graphical models in several domains, including
mixture models, hidden Markov models, and blind source
separation.