Abstract:
Graphical models provide a broad framework for probabilistic
inference, with applications to such diverse areas as speech
recognition (hidden Markov models), medical diagnosis (belief
networks) and artificial intelligence (Boltzmann machines).
However, exact inference typically requires computation time
exponential in the number of nodes in the graph. We present a
general framework for a class
of approximating models, based on the Kullback-Leibler divergence
between an approximating graph and the original graph. In addition
to unifying the node-elimination and structural variational
frameworks, we provide generalised mean-field equations for both
directed and undirected graphs. Simulation results on a small
benchmark problem suggest that this method compares favourably
against others previously reported in the literature.
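As a brief illustration (a minimal sketch; the symbols $P$, $Q$ and
$\mathcal{Q}$ are our own notation, not taken from the paper): the
structural variational approach fits a tractable approximating
distribution $Q$ to the original distribution $P$ by minimising the
Kullback-Leibler divergence,
% A minimal sketch of the variational objective, assuming the
% standard KL(Q||P) form; \mathcal{Q} denotes the family of
% distributions that factorise over the chosen approximating graph.
\[
  Q^{\ast} \;=\; \operatorname*{arg\,min}_{Q \in \mathcal{Q}}
  \operatorname{KL}(Q \,\|\, P)
  \;=\; \operatorname*{arg\,min}_{Q \in \mathcal{Q}}
  \sum_{x} Q(x) \ln \frac{Q(x)}{P(x)}.
\]
Setting the gradient of this objective to zero for each node of the
approximating graph yields the mean-field-style fixed-point equations
the abstract mentions.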