Abstract:
We present a probabilistic approach for fusion of video
sequences produced by multiple imaging sensors. We model the sensor
images as noisy, locally affine functions of an underlying true
scene. Maximum likelihood estimates of the parameters of the local
affine functions are based on the local covariance of the image
data, and are therefore related to local principal component analysis.
With the model parameters estimated, a Bayesian framework provides
either maximum likelihood or maximum a posteriori estimates of the
true scene from the sensor images. These estimates constitute
the sensor fusion rules. We demonstrate the efficacy of
the approach on sequences of images from visible-band and infrared
sensors.
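The fusion rule described above can be sketched numerically. The following is a hypothetical minimal implementation, not the authors' code: within each local window, the leading eigenvector of the cross-sensor covariance matrix serves as the estimate of the affine gains, and the fused value is the least-squares (maximum likelihood) estimate of the underlying scene. The function name `fuse_local_pca`, the window size, and the block-based (non-overlapping window) processing are all illustrative assumptions.

```python
import numpy as np

def fuse_local_pca(images, win=8):
    """Fuse co-registered sensor images via local PCA (illustrative sketch).

    Assumed model: within each window, each sensor image is a noisy affine
    function of the true scene.  The leading eigenvector of the local
    covariance across sensors estimates the affine gains, and projecting the
    demeaned data onto those gains gives the ML scene estimate.
    """
    stack = np.stack([im.astype(float) for im in images])  # (k, H, W)
    k, H, W = stack.shape
    fused = np.zeros((H, W))
    for r in range(0, H, win):
        for c in range(0, W, win):
            bh, bw = min(win, H - r), min(win, W - c)
            block = stack[:, r:r + bh, c:c + bw].reshape(k, -1)
            mean = block.mean(axis=1, keepdims=True)
            cov = np.cov(block)                # k x k local covariance
            _, vecs = np.linalg.eigh(cov)
            a = vecs[:, -1]                    # leading eigenvector: gains
            if a.sum() < 0:                    # resolve eigenvector sign
                a = -a
            # Least-squares scene estimate within the window
            s = a @ (block - mean)
            fused[r:r + bh, c:c + bw] = s.reshape(bh, bw)
    return fused
```

Because each window is demeaned independently, the sketch recovers the scene up to a per-window offset and scale; a full implementation would also carry the estimated affine parameters into a Bayesian (MAP) combination, as the abstract describes.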