Abstract:
Guided by an initial idea of building a complex (non-linear) decision surface with maximal local margin in input space, we give a possible geometrical intuition as to why K-Nearest Neighbor (KNN) algorithms often perform more poorly than SVMs on classification tasks. We then propose modified K-Nearest Neighbor algorithms to overcome the perceived problem. The approach is similar in spirit to Tangent Distance, but with invariances inferred from the local neighborhood rather than from prior knowledge. Experimental results on real-world classification tasks suggest that the modified KNN algorithms often give a dramatic improvement over standard KNN and perform as well as or better than SVMs.
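The abstract does not spell out the modified algorithms, so the following is only a minimal sketch of one plausible instantiation of the stated idea, assuming the local invariances are captured by the affine hull spanned by each class's k nearest neighbors of a test point: the classifier measures the distance from the test point to each class's local hyperplane and picks the closest. All function names and parameters here are illustrative, not taken from the paper.

    import numpy as np

    def local_hyperplane_distance(x, neighbors):
        """Distance from point x to the affine hull spanned by `neighbors`.

        The affine hull of the k local neighbors acts as a first-order
        approximation of the class manifold near x, so directions within
        it are treated as local invariances (a hypothetical reading of
        the abstract, analogous in spirit to Tangent Distance).
        """
        origin = neighbors[0]
        # Difference vectors spanning the local hyperplane, shape (d, k-1).
        V = (neighbors[1:] - origin).T
        diff = x - origin
        if V.size == 0:  # single neighbor: fall back to plain distance
            return np.linalg.norm(diff)
        # Least-squares projection of (x - origin) onto span(V).
        coeffs, *_ = np.linalg.lstsq(V, diff, rcond=None)
        return np.linalg.norm(diff - V @ coeffs)

    def predict(x, X, y, k=5):
        """Assign x to the class whose local hyperplane lies closest."""
        best_class, best_dist = None, np.inf
        for c in np.unique(y):
            Xc = X[y == c]
            # k nearest same-class neighbors of x, by Euclidean distance.
            idx = np.argsort(np.linalg.norm(Xc - x, axis=1))[:min(k, len(Xc))]
            d = local_hyperplane_distance(x, Xc[idx])
            if d < best_dist:
                best_class, best_dist = c, d
        return best_class

Projecting onto the affine hull lets each local reference "slide" along directions spanned by its neighbors, which is the same effect Tangent Distance achieves with hand-crafted transformations, except that here the directions are estimated from the data itself.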