Neural Computation

August 2006, Vol. 18, No. 8, Pages 1932-1950
(doi: 10.1162/neco.2006.18.8.1932)
© 2006 Massachusetts Institute of Technology
Kernel Projection Classifiers with Suppressing Features of Other Classes
Abstract

We propose a new kernel-based classification method, the suppressed kernel sample space projection classifier (SKSP), which extends the kernel sample space projection classifier (KSP).

In kernel methods, samples are classified after being mapped from the input space to a high-dimensional feature space. The subspace spanned by the mapped samples of a class is called that class's kernel sample space. In KSP, an unknown input vector is assigned to the class whose kernel sample space yields the largest projection norm. KSP can be interpreted as a special case of kernel principal component analysis (KPCA), which is also used for classification. However, KSP has more useful properties than KPCA, and its accuracy is comparable to or better than that of the KPCA classifier.
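The KSP decision rule described above can be sketched in a few lines of NumPy. This is an illustrative implementation assuming an RBF kernel and a pseudoinverse of the class kernel matrix; the squared projection norm of a mapped test point onto a class's kernel sample space is k_x^T K_c^+ k_x, where K_c is the class's kernel matrix and k_x the kernel vector between the test point and that class's samples. The class names and API are hypothetical.

```python
import numpy as np

def rbf_kernel(A, B, gamma=1.0):
    """RBF (Gaussian) kernel matrix between rows of A and rows of B."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

class KSPClassifier:
    """Kernel sample space projection classifier (sketch).

    For each class c with samples X_c, the kernel sample space is
    span{phi(x) : x in X_c}.  The squared projection norm of a mapped
    test point phi(x) onto that span is k_x^T K_c^+ k_x, where
    K_c = k(X_c, X_c) and k_x = k(X_c, x).  A test point is assigned
    to the class with the largest projection norm.
    """
    def __init__(self, gamma=1.0):
        self.gamma = gamma

    def fit(self, X, y):
        self.classes_ = np.unique(y)
        self.X_ = {c: X[y == c] for c in self.classes_}
        # Pseudoinverse handles rank-deficient class kernel matrices.
        self.Kinv_ = {c: np.linalg.pinv(rbf_kernel(self.X_[c], self.X_[c], self.gamma))
                      for c in self.classes_}
        return self

    def predict(self, X):
        norms = []
        for c in self.classes_:
            kx = rbf_kernel(X, self.X_[c], self.gamma)          # (n_test, n_c)
            # Squared projection norm per test point: k_x^T K_c^+ k_x
            norms.append(np.einsum('ij,jk,ik->i', kx, self.Kinv_[c], kx))
        return self.classes_[np.argmax(np.stack(norms, axis=1), axis=1)]
```

Because only self-class samples enter each class's kernel matrix, training decomposes per class, which is the single-class property the abstract refers to.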

Since KSP is a single-class classifier, it learns from self-class samples only; hence, in a multiclass problem, the computational cost per class does not grow with the number of classes, no matter how many there are. However, more useful features for classification can be obtained if samples from other classes are also used. SKSP extends KSP so that the effects of other classes are suppressed and such features are extracted with an oblique projection.
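The abstract does not give the SKSP formulas, so the following is only one plausible way to realize the suppression idea, not necessarily the paper's exact oblique projection: before projecting onto the self-class kernel sample space, remove the component of the mapped test point that lies in the other class's kernel sample space. All kernel algebra stays in terms of Gram matrices, and all names here are hypothetical.

```python
import numpy as np

def rbf_kernel(A, B, gamma=1.0):
    """RBF (Gaussian) kernel matrix between rows of A and rows of B."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def suppressed_projection_norms(X_test, X_self, X_other, gamma=1.0):
    """Squared norm of the projection of phi(x) onto the self-class kernel
    sample space, taken after removing the component of phi(x) lying in the
    other class's kernel sample space (an illustrative suppression scheme).
    """
    K_o = rbf_kernel(X_other, X_other, gamma)   # other-class Gram matrix
    K_s = rbf_kernel(X_self, X_self, gamma)     # self-class Gram matrix
    K_so = rbf_kernel(X_self, X_other, gamma)   # cross-kernel
    k_o = rbf_kernel(X_other, X_test, gamma)    # (n_other, n_test)
    k_s = rbf_kernel(X_self, X_test, gamma)     # (n_self, n_test)
    # Coefficients of the projection of phi(x) onto the other class's span.
    a = np.linalg.pinv(K_o) @ k_o
    # Project the residual phi(x) - P_other phi(x) onto the self-class span.
    b = np.linalg.pinv(K_s) @ (k_s - K_so @ a)
    # Squared norm per test point: b_t^T K_s b_t.
    return np.einsum('it,ij,jt->t', b, K_s, b)
```

A two-class classifier would then compare `suppressed_projection_norms` computed with the roles of the classes swapped and pick the larger score.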

Experiments on two-class classification problems indicate that SKSP achieves high accuracy on many of them.