Abstract:
We present results on the learnability and the required complexity
of logical formulae for solving classification problems. These
results are obtained by linking propositional logic with kernel
machines. In particular, we show that decision trees and
disjunctive normal forms (DNF) can be represented with the help of
a special kernel, linking regularized risk to separation margin.
Subsequently, we derive a number of lower bounds on the required
complexity of logic formulae, using properties of algorithms for
generating linear estimators, such as perceptron and maximal
perceptron learning.
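
The abstract does not spell out the kernel construction. As a rough illustration only, the sketch below pairs a standard boolean "DNF" kernel from this literature, k(x, z) = 2^(#agreements) - 1 (which counts the non-empty conjunctions of literals satisfied by both inputs), with a dual-form kernel perceptron; the kernel choice, the toy target, and all identifiers are illustrative assumptions, not the paper's exact construction.

```python
import numpy as np

def dnf_kernel(x, z):
    # Boolean "DNF" kernel (an assumption, not necessarily the paper's kernel):
    # 2^(#positions where x and z agree) - 1, i.e. the number of non-empty
    # conjunctions of literals satisfied by both boolean vectors x and z.
    return 2.0 ** np.sum(x == z) - 1.0

def kernel_perceptron(X, y, kernel, epochs=100):
    # Dual-form perceptron: alpha[i] counts the mistakes made on example i.
    n = len(X)
    K = np.array([[kernel(X[i], X[j]) for j in range(n)] for i in range(n)])
    alpha = np.zeros(n)
    for _ in range(epochs):
        mistakes = 0
        for i in range(n):
            if y[i] * np.dot(alpha * y, K[:, i]) <= 0:
                alpha[i] += 1
                mistakes += 1
        if mistakes == 0:  # separated in the kernel's feature space
            break
    return alpha

# Toy target: the one-term DNF (x1 AND x2) over {0,1}^3.
X = np.array([[a, b, c] for a in (0, 1) for b in (0, 1) for c in (0, 1)])
y = np.where(X[:, 0] * X[:, 1] == 1, 1, -1)
alpha = kernel_perceptron(X, y, dnf_kernel)
scores = [np.dot(alpha * y, [dnf_kernel(x, z) for z in X]) for x in X]
print(np.sign(scores) == y)  # all True once the perceptron has converged
```

Because the kernel's implicit feature space has one coordinate per conjunction of literals, any DNF target is linearly separable there, so the classical perceptron mistake bound applies and the loop above terminates on consistent data.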