Abstract:
In this paper we show that a restricted class of
constrained minimum divergence problems, termed generalized
inference problems, can be solved by approximating the KL
divergence with a Bethe free energy. The resulting algorithm,
unified propagation and scaling, is closely related to both
loopy belief propagation and iterative scaling; when no
constraints are present, it reduces to a convergent alternative
to loopy belief propagation. Experiments demonstrate the
viability of the algorithm.