Salle 5, Site Marcelin Berthelot
Open to all

Abstract

Linear regression algorithms can be made highly flexible by first applying a change of variables Φ(x). After this change of variables, the linear regression can be rewritten entirely in terms of the values taken by the kernel k(x, x') = ⟨Φ(x), Φ(x')⟩ on the training data. The regression that minimizes the empirical error is then computed by solving a convex optimization problem.
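As a concrete illustration (not taken from the lecture itself), here is a minimal kernel ridge regression sketch in Python. It assumes a Gaussian kernel and a quadratic (ridge) regularizer; the function names and the parameters sigma and lam are illustrative choices. The point is that the fitted regression depends on the data only through the kernel matrix K, never through Φ(x) explicitly, and that the convex problem has a closed-form solution.

```python
import numpy as np

def gaussian_kernel(X, Y, sigma=1.0):
    # k(x, x') = exp(-||x - x'||^2 / (2 sigma^2)), one common kernel choice
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * sigma ** 2))

def fit_kernel_ridge(X, y, lam=1e-2, sigma=1.0):
    # Minimize the convex regularized empirical error
    #   sum_i (f(x_i) - y_i)^2 + lam * ||f||^2
    # over functions f(x) = sum_i alpha_i k(x_i, x); the minimizer solves
    #   (K + lam * I) alpha = y,  with K_ij = k(x_i, x_j).
    K = gaussian_kernel(X, X, sigma)
    alpha = np.linalg.solve(K + lam * np.eye(len(X)), y)
    return alpha

def predict(X_train, alpha, X_new, sigma=1.0):
    # f(x) = sum_i alpha_i k(x_i, x): only kernel values are needed
    return gaussian_kernel(X_new, X_train, sigma) @ alpha

# Tiny usage example on synthetic 1-D data
rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(50, 1))
y = np.sin(X[:, 0]) + 0.1 * rng.normal(size=50)
alpha = fit_kernel_ridge(X, y)
y_hat = predict(X, alpha, X)
print("training MSE:", np.mean((y_hat - y) ** 2))
```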

The minimization of convex functions under constraints is at the heart of many learning problems. Convexity guarantees that the set of minima is convex; moreover, the minimum is unique if the convexity is strict. Minimizing a function under inequality constraints can be expressed by defining a Lagrangian that associates a dual variable with each constraint; these variables are called Lagrange multipliers. We show that a saddle point of the Lagrangian corresponds to a minimum that satisfies the constraints. If both the function being minimized and the constraints are convex, then the Lagrangian's saddle-point condition is necessary and sufficient for obtaining a solution of the optimization problem. The Kuhn and Tucker conditions are obtained by setting to zero the partial derivatives of the Lagrangian with respect to the primal variables, and by requiring that its derivatives with respect to the multipliers be non-positive.
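To make the statement above concrete, here is a standard formulation (a generic sketch, not notation taken from the course) for a convex objective f and convex inequality constraints g_i(x) ≤ 0, with one multiplier λ_i per constraint:

```latex
\begin{aligned}
  L(x,\lambda) &= f(x) + \sum_{i=1}^{m} \lambda_i\, g_i(x), \qquad \lambda_i \ge 0,\\[4pt]
  \nabla_x L(x^*,\lambda^*) &= \nabla f(x^*) + \sum_{i=1}^{m} \lambda_i^*\, \nabla g_i(x^*) = 0
    && \text{(derivatives w.r.t. the primal variables vanish)},\\
  \frac{\partial L}{\partial \lambda_i}(x^*,\lambda^*) &= g_i(x^*) \le 0
    && \text{(derivatives w.r.t. the multipliers are non-positive)},\\
  \lambda_i^* \ge 0, \qquad \lambda_i^*\, g_i(x^*) &= 0
    && \text{(dual feasibility and complementary slackness)}.
\end{aligned}
```

A saddle point (x*, λ*) of this Lagrangian minimizes L over x and maximizes it over λ ≥ 0; under the stated convexity assumptions, these conditions characterize the solutions of the constrained problem.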