Neural Computation

August 2011, Vol. 23, No. 8, Pages 1935-1943
(doi: 10.1162/NECO_a_00152)
© 2011 Massachusetts Institute of Technology
A Simple Derivation of a Bound on the Perceptron Margin Using Singular Value Decomposition
Abstract

The perceptron is a simple supervised learning algorithm for training a linear classifier; it has been analyzed and used extensively. The classifier separates the data into two groups using a decision hyperplane, and the margin between the data and the hyperplane determines the classifier's ability to generalize and its robustness to input noise. Exact results for the maximal size of the separating margin are known for specific input distributions, and bounds exist for arbitrary distributions, but both rely on lengthy statistical mechanics calculations carried out in the limit of infinite input size. Here we present a short analysis of perceptron classification using singular value decomposition. We provide a simple derivation of a lower bound on the margin and an explicit formula for the perceptron weights that converges to the optimal result for large separating margins.
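To make the quantities in the abstract concrete, the following is a minimal sketch of the classical perceptron update rule and of the margin it achieves on toy linearly separable data. The data setup (a random Gaussian "teacher" hyperplane with a small gap enforced around the boundary) is illustrative only and is not the SVD-based analysis of the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy linearly separable data: labels +/-1 given by a random "teacher"
# hyperplane (illustrative assumption, not the paper's setup). Points too
# close to the boundary are discarded so the data have a positive margin
# and the perceptron is guaranteed to converge quickly.
d = 3
teacher = rng.normal(size=d)
X = rng.normal(size=(400, d))
s = (X @ teacher) / np.linalg.norm(teacher)  # signed distance to teacher plane
keep = np.abs(s) > 0.2
X, y = X[keep], np.sign(s[keep])

# Classical perceptron: on each misclassified point, update w += y_i * x_i.
w = np.zeros(d)
for _ in range(1000):  # epochs; separable data with a gap converges fast
    mistakes = 0
    for xi, yi in zip(X, y):
        if yi * (w @ xi) <= 0:
            w += yi * xi
            mistakes += 1
    if mistakes == 0:  # all points correctly classified
        break

# Margin of the learned classifier: the smallest distance from any
# training point to the decision hyperplane w.x = 0.
margin = np.min(y * (X @ w)) / np.linalg.norm(w)
print(margin > 0)  # → True: the data are separated with a positive margin
```

Note that the plain perceptron only finds *some* separating hyperplane; its margin is in general smaller than the maximal margin whose bounds the paper analyzes.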