Amphithéâtre Marguerite de Navarre, Site Marcelin Berthelot
Open to all

Abstract

Dimensionality reduction is at the heart of data modeling and analysis, whether the goal is to represent data x or functions f(x) of that data. The aim is to build models with as few variables as possible, which is known as a parsimonious representation. This corresponds to Occam's razor principle in science and philosophy: "the simplest explanations are the best". This introductory lecture recalls these notions and explains why dimensionality reduction is a key step in compressing a signal x with as few bits as possible, in suppressing noise added to an unknown signal x, and in solving inverse problems where x must be estimated from incomplete and noisy measurements. The same holds for classification and regression problems, where the aim is to approximate a function f(x) from a limited amount of training data. In all cases, dimensionality reduction reduces the number of parameters to be estimated, such as the weights of a neural network approximating f.
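As a toy illustration of these ideas (not part of the lecture itself), the following Python sketch builds a low-dimensional linear model of synthetic data via principal component analysis: the data are assumed to lie near a k-dimensional subspace, and keeping the k leading principal directions yields a parsimonious representation. All names and parameters (n, d, k, the noise level) are illustrative assumptions, not quantities from the lecture.

    import numpy as np

    rng = np.random.default_rng(0)
    n, d, k = 500, 64, 4                                  # illustrative sizes
    latent = rng.normal(size=(n, k))                      # k true degrees of freedom
    mixing = rng.normal(size=(k, d))
    x = latent @ mixing + 0.1 * rng.normal(size=(n, d))   # noisy high-dimensional data

    # PCA via the SVD of the centered data matrix.
    mean = x.mean(axis=0)
    U, s, Vt = np.linalg.svd(x - mean, full_matrices=False)

    # Parsimonious model: project every sample onto the k leading directions.
    P = Vt[:k]                                            # (k, d) orthonormal rows
    x_hat = (x - mean) @ P.T @ P + mean

    rel_err = np.linalg.norm(x - x_hat) / np.linalg.norm(x)
    print(f"relative error using {k} of {d} variables: {rel_err:.3f}")

The same projection viewpoint underlies compression (store only k coefficients per sample) and denoising (the discarded directions carry mostly noise).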

The lecture will approach this subject from three different but equivalent points of view: the existence of regularities, low-dimensional approximation, and the construction of parsimonious representations. These three notions are the vertices of the RAP triangle (Regularity, Approximation, Parsimony), whose mathematical properties the lecture will study with both linear and non-linear approaches; the toy sketch below contrasts the two.
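To make the linear/non-linear distinction concrete, here is a minimal hedged sketch (an assumed toy example, not the lecture's own construction). In an orthonormal basis such as the DCT, a linear approximation keeps the first M coefficients (a subspace fixed in advance), while a non-linear approximation keeps the M largest coefficients, letting the subspace adapt to the signal. The test signal and the budget M are illustrative choices; scipy's dct/idct with norm='ortho' provide the orthonormal transform.

    import numpy as np
    from scipy.fft import dct, idct

    # Toy signal: energy concentrated on a few frequencies, not all of them low.
    t = np.linspace(0, 1, 1024)
    x = np.sin(4 * np.pi * t) + 0.5 * np.cos(2 * np.pi * 200 * t)

    c = dct(x, norm='ortho')      # coefficients in an orthonormal cosine basis
    M = 64                        # budget: M of 1024 coefficients

    # Linear approximation: keep the first M coefficients (fixed subspace).
    c_lin = np.zeros_like(c)
    c_lin[:M] = c[:M]

    # Non-linear approximation: keep the M largest coefficients, wherever they are.
    thresh = np.sort(np.abs(c))[-M]
    c_nl = np.where(np.abs(c) >= thresh, c, 0.0)

    for name, ck in [("linear", c_lin), ("non-linear", c_nl)]:
        err = np.linalg.norm(x - idct(ck, norm='ortho')) / np.linalg.norm(x)
        print(f"{name:>10} approximation (M={M}): relative error {err:.4f}")

For this signal, whose significant coefficients are not all at low frequencies, the adaptive (non-linear) selection achieves a much smaller error for the same budget M, which is precisely the advantage of non-linear over linear approximation.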