Support Vector Machines meet Goldfarb Cubes

Bernd Gaertner
ETH Zurich

Abstract: Support vector machines are convex quadratic programs that have become a very popular machine-learning tool for classifying data items into "positives" and "negatives" after a training phase with items whose classification is known. A parameter controls how accurately the training items are taken into account; in concrete applications, the right value of this parameter is often found by experiment. In a seminal paper, Hastie et al. suggested facilitating such experiments by computing the entire solution path, i.e., the solution of the parameterized quadratic program as an explicit (piecewise-linear) function of the parameter.
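
For concreteness (the abstract does not spell this out), the standard soft-margin formulation over training items x_i with known labels y_i in {-1, +1} is the convex quadratic program below; the regularization parameter C is the parameter referred to above, trading off margin width against training accuracy. The notation is the textbook one, not taken from the talk:

\begin{align*}
\min_{w,\,b,\,\xi}\quad & \tfrac{1}{2}\lVert w\rVert^2 \;+\; C \sum_{i=1}^{n} \xi_i\\
\text{subject to}\quad & y_i\,(w^\top x_i + b) \;\ge\; 1 - \xi_i, \qquad i = 1,\dots,n,\\
& \xi_i \;\ge\; 0, \qquad i = 1,\dots,n.
\end{align*}

Hastie et al. observed that the optimal solution, viewed as a function of this parameter (in their parameterization, of lambda = 1/C), is piecewise linear, which is what makes computing the whole path feasible in principle.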

In this talk, I will first give a basic introduction to support vector machines. I will then show that solution paths can be exponentially long, which makes the approach of Hastie et al. unsuitable in the worst case. Finally, I will argue that this result is not hard to obtain from a 30-year-old construction of Goldfarb, once that construction is reinterpreted as a class of parameterized linear programs with exponentially long solution paths.
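
To make the notion of a solution path of a parameterized linear program concrete, here is a small illustrative sketch (not from the talk; all data are arbitrary toy values). It samples the parameter t of the program min (c + t*d)^T x subject to A x <= b and records each change of the optimal vertex; the path t -> x*(t) is piecewise linear, and its length is the number of pieces. On Goldfarb-style instances the path has exponentially many pieces in the dimension, which grid sampling like this would of course fail to resolve; the sketch only illustrates what a path is.

    # Toy illustration of a parametric LP solution path (grid-sampled,
    # not a path-following algorithm). All instance data are arbitrary.
    import numpy as np
    from scipy.optimize import linprog

    # Feasible region: the square [-1, 1]^2, written as A x <= b.
    A = np.array([[ 1.0,  0.0],
                  [-1.0,  0.0],
                  [ 0.0,  1.0],
                  [ 0.0, -1.0]])
    b = np.array([1.0, 1.0, 1.0, 1.0])
    c = np.array([1.0, 0.2])   # base objective (illustrative values)
    d = np.array([-1.0, 0.5])  # direction of the objective perturbation

    vertices = []
    for t in np.linspace(0.0, 2.0, 201):    # sample the parameter range
        res = linprog(c + t * d, A_ub=A, b_ub=b,
                      bounds=(None, None), method="highs")
        v = tuple(np.round(res.x, 6))
        if not vertices or vertices[-1] != v:
            vertices.append(v)             # record each new optimal vertex

    print("optimal vertices visited:", vertices)
    print("number of linear pieces observed:", len(vertices))

On this two-dimensional toy instance the path has only two pieces; the point of the Goldfarb construction is that, in dimension n, instances exist whose paths have a number of pieces exponential in n.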