Abstract:
In kernel-based learning, the data are mapped to a kernel
feature space whose dimension corresponds to the number of
training data points. In practice, however, the data form a
smaller submanifold in feature space, a fact that has been exploited,
e.g., by reduced-set techniques for SVMs. We propose a new
mathematical construction that makes it possible to adapt to the intrinsic
dimension and to find an orthonormal basis of this submanifold.
As a result, computations become much simpler and, more importantly, our
theoretical framework allows us to derive elegant kernelized blind
source separation (BSS) algorithms for arbitrary invertible
nonlinear mixings. Experiments demonstrate the good performance
and high computational efficiency of our kTDSEP algorithm for the
problem of nonlinear BSS.
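
The following is a minimal sketch of the general idea, not the authors' exact kTDSEP implementation: nonlinearly mixed signals are expressed in coordinates with respect to an orthonormal basis of the kernel feature-space submanifold (built from a small set of basis points), and a simple second-order temporal-decorrelation step (AMUSE-style, in place of the full TDSEP joint diagonalization) is then applied. The kernel width, basis size d, and lag tau are illustrative assumptions.

```python
# Illustrative sketch only: orthonormal feature-space basis + second-order
# temporal decorrelation; parameter choices (width, d, tau) are assumptions.
import numpy as np

rng = np.random.default_rng(0)

# --- toy sources and an invertible nonlinear mixture -----------------
T = 2000
t = np.arange(T)
s1 = np.sin(2 * np.pi * t / 200)                 # smooth source
s2 = np.sign(np.sin(2 * np.pi * t / 63))         # square-wave source
S = np.vstack([s1, s2])

x1 = np.tanh(0.8 * s1 + 0.6 * s2)                # nonlinear, invertible mixing
x2 = 0.6 * s1 - 0.8 * s2 + 0.2 * (0.8 * s1 + 0.6 * s2) ** 3
X = np.vstack([x1, x2]).T                        # observed mixtures, shape (T, 2)

# --- orthonormal basis of the feature-space submanifold --------------
def rbf(A, B, width):
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * width ** 2))

d = 20                                           # assumed size of the reduced basis
idx = rng.choice(T, size=d, replace=False)
V = X[idx]                                       # basis-defining points
width = 1.0

Kvv = rbf(V, V, width)
evals, evecs = np.linalg.eigh(Kvv)
keep = evals > 1e-10 * evals.max()
W = evecs[:, keep] / np.sqrt(evals[keep])        # thin Kvv^{-1/2}: Phi(V) @ W is orthonormal

Psi = rbf(X, V, width) @ W                       # coordinates of Phi(x_t) in that basis

# --- second-order temporal decorrelation (AMUSE-style) ---------------
Psi = Psi - Psi.mean(0)
C0 = Psi.T @ Psi / T
e0, U0 = np.linalg.eigh(C0)
keep0 = e0 > 1e-10 * e0.max()
whiten = U0[:, keep0] / np.sqrt(e0[keep0])
Z = Psi @ whiten                                 # whitened feature-space coordinates

tau = 5
Ct = Z[:-tau].T @ Z[tau:] / (T - tau)
Ct = 0.5 * (Ct + Ct.T)                           # symmetrized lagged covariance
_, R = np.linalg.eigh(Ct)
Y = Z @ R                                        # extracted components

# For checking only: which components best match the true sources?
corr = np.abs(np.corrcoef(np.vstack([S, Y.T]))[:2, 2:])
print("best |correlation| with s1, s2:", corr.max(axis=1).round(3))
```

Because the basis has only d elements rather than one per training point, all subsequent linear operations run in the reduced d-dimensional coordinate system, which is what makes the kernelized BSS step computationally cheap.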