Abstract:
A latent variable generative model with finite noise is used
to describe several different algorithms for Independent Components
Analysis (ICA). In particular, the Fixed Point ICA algorithm is
shown to be equivalent to the Expectation-Maximization algorithm
for maximum likelihood under certain constraints, allowing the
conditions for global convergence to be elucidated. The algorithms
can also be explained by their generic behavior near a singular
point where the size of the optimal generative bases vanishes. An
expansion of the likelihood about this singular point indicates the
role of higher order correlations in determining the features
discovered by ICA. The application and convergence of these
algorithms are demonstrated on the learning of edge features as the
independent components of natural images.
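
For reference, the latent variable generative model with finite noise mentioned above is presumably the standard noisy-ICA form; a minimal sketch, assuming independent sources s, a basis matrix A, and isotropic Gaussian noise of variance \sigma^2:

    x = A s + \nu, \qquad \nu \sim \mathcal{N}(0, \sigma^2 I), \qquad p(s) = \prod_i p_i(s_i)

Under this assumed model, maximum likelihood estimation of A (e.g., by Expectation-Maximization over the posterior of s) is the setting in which the paper relates the Fixed Point ICA algorithm to EM.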