Neural Computation

November 1, 2004, Vol. 16, No. 11, Pages 2459-2481
(doi: 10.1162/0899766041941880)
© 2004 Massachusetts Institute of Technology
Principal Components Analysis Competitive Learning
We present a new neural model that extends classical competitive learning by performing a principal components analysis (PCA) at each neuron. This model improves on known local PCA methods because it does not need to be presented with the entire data set at each computing step. This allows fast execution while retaining the dimensionality-reduction properties of PCA. Furthermore, every neuron is able to adapt its behavior to the local dimensionality of the input distribution; hence, our model has a dimensionality estimation capability. The experimental results we present demonstrate the dimensionality-reduction capabilities of the model on multisensor images.
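As an illustration of the general idea, the following is a minimal sketch (not the authors' exact algorithm) of competitive learning in which each neuron maintains its own local PCA. The winning neuron is the one whose local subspace best reconstructs the input, and its mean and basis vectors are updated online with Sanger's rule, so the entire data set never needs to be revisited at each step. All names, dimensions, and learning rates here are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

D, K, M = 3, 2, 2                        # input dim, neurons, components per neuron
means = rng.normal(size=(K, D))          # each neuron's local mean (assumed init)
W = rng.normal(size=(K, M, D)) * 0.1     # each neuron's local basis vectors

def train_step(x, lr=0.05):
    """One online update from a single sample x (no full-data pass needed)."""
    # Competitive step: winner = neuron whose local subspace best reconstructs x.
    errs = []
    for k in range(K):
        c = x - means[k]
        proj = W[k].T @ (W[k] @ c)       # projection onto neuron k's local basis
        errs.append(np.sum((c - proj) ** 2))
    k = int(np.argmin(errs))
    # Local PCA step: move the winner's mean, then apply Sanger's rule
    # to its basis vectors on the centered input.
    means[k] += lr * (x - means[k])
    c = x - means[k]
    y = W[k] @ c                         # component activations
    for m in range(M):
        # Deflate: subtract the reconstruction from earlier components.
        c_res = c - W[k][:m].T @ y[:m] if m > 0 else c
        W[k][m] += lr * y[m] * (c_res - y[m] * W[k][m])
    return k

# Stream samples from two roughly planar clusters; each neuron can
# specialize on one cluster's local subspace.
data = np.vstack([
    rng.normal([3, 0, 0], [1, 1, 0.05], size=(200, 3)),
    rng.normal([-3, 0, 0], [0.05, 1, 1], size=(200, 3)),
])
rng.shuffle(data)
for x in data:
    train_step(x)
```

Because each sample triggers a single winner-take-all selection followed by a local Sanger update, the cost per step is independent of the data set size, which is the property the abstract contrasts with batch local PCA methods.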