Abstract:
Boosting methods maximize a hard classification margin and
are known as powerful techniques that do not exhibit overfitting
in low-noise settings. On noisy data, however, boosting still tries
to enforce a hard margin and therefore assigns too much weight to
outliers, which leads to the dilemma of non-smooth fits and overfitting.
We therefore propose three algorithms that allow for soft margin
classification by introducing regularization with slack variables
into the boosting concept: (1) AdaBoost_Reg and regularized versions of
(2) linear and (3) quadratic programming AdaBoost. Experiments
demonstrate the usefulness of the proposed algorithms in comparison
to another soft margin classifier, the support vector machine.
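To make the soft-margin idea concrete, the following is a minimal, illustrative sketch (not the paper's exact algorithm): standard AdaBoost with decision stumps, where the exponential weight update is damped by a term proportional to each example's accumulated influence `mu`. The damping constant `C` and the influence bookkeeping are assumptions in the spirit of the slack-variable regularization described above; with `C = 0` the sketch reduces to plain AdaBoost.

```python
import numpy as np

def stump_predict(X, feat, thresh, sign):
    # A decision stump: predict +/-1 from a single feature vs. a threshold.
    return sign * np.where(X[:, feat] > thresh, 1.0, -1.0)

def best_stump(X, y, w):
    # Exhaustive search for the stump with the lowest weighted error.
    best, best_err = (0, 0.0, 1.0), np.inf
    for feat in range(X.shape[1]):
        for thresh in np.unique(X[:, feat]):
            for sign in (1.0, -1.0):
                err = np.sum(w * (stump_predict(X, feat, thresh, sign) != y))
                if err < best_err:
                    best_err, best = err, (feat, thresh, sign)
    return best, best_err

def adaboost_reg(X, y, rounds=20, C=0.1):
    # Boosting with a soft-margin-style damping term (illustrative):
    # examples that have already absorbed a lot of weight (influence mu)
    # are penalized, so outliers cannot dominate the sample distribution.
    n = len(y)
    w = np.full(n, 1.0 / n)
    mu = np.zeros(n)          # accumulated influence per example
    ensemble = []
    for _ in range(rounds):
        stump, err = best_stump(X, y, w)
        err = max(err, 1e-10)
        if err >= 0.5:        # no weak learner better than chance: stop
            break
        alpha = 0.5 * np.log((1 - err) / err)
        pred = stump_predict(X, *stump)
        mu += alpha * w
        # usual exponential update, damped by C * mu (slack-like term)
        w = w * np.exp(-alpha * y * pred - C * mu)
        w /= w.sum()
        ensemble.append((alpha, stump))
    return ensemble

def predict(ensemble, X):
    # Weighted majority vote of the stumps.
    f = sum(alpha * stump_predict(X, *stump) for alpha, stump in ensemble)
    return np.sign(f)
```

For noisy data, larger `C` keeps the sample weights of hard (possibly mislabeled) points from growing without bound, which is the soft-margin effect the abstract refers to.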