Abstract:
The nearest neighbor technique is a simple and appealing
method to address classification problems. It relies on the
assumption of locally constant class conditional probabilities.
This assumption becomes invalid in high dimensions with a finite
number of examples due to the curse of dimensionality. We propose
a technique that computes a locally flexible metric by means of
Support Vector Machines (SVMs). The maximum margin boundary found
by the SVM is used to determine the most discriminant direction
over the query's neighborhood. This direction provides a local
weighting scheme for the input features. Using both simulated and
real data sets, we present experimental evidence that classification
performance improves over the SVM algorithm alone and over a variety
of adaptive learning schemes.
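To make the idea concrete, the sketch below shows one plausible realization
of such a scheme, not the implementation studied in the paper: the gradient
of an SVM decision function, estimated at the query by finite differences,
supplies per-feature weights for a nearest neighbor vote. The RBF kernel,
the exponential weighting, the finite-difference step, and the function name
svm_weighted_nn_predict are illustrative assumptions, and the snippet
assumes a binary classification problem.

import numpy as np
from sklearn.svm import SVC

def svm_weighted_nn_predict(X_train, y_train, query, k=5, eps=1e-3):
    """Classify `query` with k-NN under an SVM-informed local metric (sketch)."""
    X_train = np.asarray(X_train)
    y_train = np.asarray(y_train)

    # Fit an SVM whose maximum margin boundary encodes class discrimination.
    svm = SVC(kernel="rbf", gamma="scale").fit(X_train, y_train)

    # Approximate the gradient of the SVM decision function at the query via
    # central differences; large components indicate discriminant features.
    d = X_train.shape[1]
    grad = np.empty(d)
    for j in range(d):
        step = np.zeros(d)
        step[j] = eps
        plus = svm.decision_function([query + step])[0]
        minus = svm.decision_function([query - step])[0]
        grad[j] = (plus - minus) / (2.0 * eps)

    # Convert gradient magnitudes into feature weights (exponential weighting
    # is one plausible choice here, not necessarily the paper's exact scheme).
    w = np.exp(np.abs(grad))
    w /= w.sum()

    # Weighted Euclidean distance defines the locally adapted metric.
    dists = np.sqrt((w * (X_train - query) ** 2).sum(axis=1))

    # Majority vote among the k nearest neighbors under that metric.
    neighbors = y_train[np.argsort(dists)[:k]]
    labels, counts = np.unique(neighbors, return_counts=True)
    return labels[np.argmax(counts)]

In this sketch, features along which the SVM decision value changes rapidly
near the query receive larger weights, so the neighborhood is elongated along
locally non-discriminant directions, which is the effect the abstract describes.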