Abstract:
There has been much recent work on learning probability
distributions on images and on image statistics. We observe that
the mapping from images to statistics is many-to-one and show it
can be quantified by a phase space factor. This phase space
approach throws light on the Minimax Entropy technique for learning
Gibbs distributions on images with potentials derived from image
statistics and elucidates the ambiguities inherent in
determining the potentials. In addition, it shows that if the phase
factor can be approximated by an analytic distribution, then the
computation for Minimax Entropy learning can be vastly reduced. An
illustration of this concept, using a Gaussian to approximate the
phase factor, leads to a new algorithm called "Minutemax," which
gives a good approximation to the results of Zhu and Mumford in
just seconds of CPU time. The phase space approach also gives
insight into the multi-scale potentials found by Zhu and Mumford
and suggests that the forms of the potentials are greatly influenced
by phase space considerations. Finally, we prove that probability
distributions learned in feature space alone are equivalent to
Minimax Entropy learning with a multinomial approximation of the
phase factor.