Abstract:
Linear discriminant analysis (LDA) is a classical multivariate
technique for both dimension reduction and classification. The data
vectors are transformed into a low-dimensional subspace such that
the class centroids are spread out
as much as possible. In this subspace LDA works as a simple
prototype classifier. The resulting decision boundaries are linear.
However, in many applications the linear boundaries do not
adequately separate the classes, and the ability to model more
complex boundaries would be desirable. In this paper we
present a nonlinear generalization of discriminant analysis that
uses the kernel trick of representing dot products of pattern
vectors by kernel functions. This technique makes it possible to
compute discriminant functions efficiently in arbitrary feature spaces for
which such kernel representations exist.
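To make the kernel representation concrete, the following is a minimal
Python sketch of a two-class kernel Fisher discriminant in the spirit of
this approach. It is an illustration, not the paper's own algorithm or
notation: the RBF kernel choice, the regularization term `reg`, and the
function names (`rbf_kernel`, `fit_kfd`, `project`) are assumptions of
this example.

```python
import numpy as np

def rbf_kernel(A, B, gamma=1.0):
    # Gram matrix of the Gaussian kernel k(a, b) = exp(-gamma * ||a - b||^2).
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-gamma * d2)

def fit_kfd(X, y, gamma=1.0, reg=1e-3):
    # Coefficients alpha of the discriminant direction
    # w = sum_i alpha_i phi(x_i), expressed entirely through kernel
    # evaluations, so phi never has to be computed explicitly.
    n = len(X)
    K = rbf_kernel(X, X, gamma)
    M, N = [], np.zeros((n, n))
    for c in (0, 1):
        Kc = K[:, y == c]                      # kernel columns of class c
        nc = Kc.shape[1]
        M.append(Kc.mean(axis=1))              # dual image of the class centroid
        N += Kc @ (np.eye(nc) - np.full((nc, nc), 1.0 / nc)) @ Kc.T
    # Regularizing the within-class scatter keeps the linear solve well-posed.
    return np.linalg.solve(N + reg * np.eye(n), M[1] - M[0])

def project(X_train, alpha, X_new, gamma=1.0):
    # One-dimensional discriminant scores for new pattern vectors.
    return rbf_kernel(X_new, X_train, gamma) @ alpha
```

Classifying a new point then reduces to comparing its score under
`project` with the projected class centroids, i.e. the same simple
prototype classifier as in the linear case, but with nonlinear decision
boundaries in the original input space.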