Amphithéâtre Marguerite de Navarre, Site Marcelin Berthelot
Open to all

Abstract

Recent years have seen a boom in neural network-based techniques for image synthesis. These methods are most often based on the minimization of a function whose minimizers are assumed to be the solutions of the synthesis problem. In this talk, we investigate, both theoretically and experimentally, an alternative framework that approaches this problem via an alternating sampling and minimization scheme. First, we use results from information geometry to define a synthesis problem whose solution is a maximum entropy distribution under expectation constraints; samples from this distribution are the synthetic images. Next, we turn to the analysis of our method and show, using recent results from the Markov chain literature, that its error can be explicitly bounded with constants depending polynomially on the dimension, even in the non-convex setting. This includes the case where the constraints are defined via a differentiable neural network.
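The abstract does not spell out the algorithm, but the kind of alternating scheme it describes can be sketched in a minimal form: sample from the current exponential-family (maximum entropy) model with a Langevin step, then update the Lagrange multipliers so that the expected feature statistics move toward the prescribed expectation constraints. Everything below is an illustrative assumption, not the speaker's actual method: the feature map `features`, the target statistics, the step sizes, and the use of a 1-D signal in place of an image are all hypothetical choices made for the sketch.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical feature map f: low-order moment statistics of the signal.
# (In the talk's setting, f could instead be a differentiable neural network.)
def features(x):
    return np.array([x.mean(), (x ** 2).mean()])

target = np.array([0.0, 1.0])    # expectation constraints E[f(X)] = target (assumed)
theta = np.zeros(2)              # Lagrange multipliers of the max-entropy model
x = rng.standard_normal(64)      # current sample: a 1-D stand-in for an image

step_x, step_theta = 1e-2, 1e-1
for _ in range(2000):
    # Sampling step: one unadjusted Langevin move targeting
    # p_theta(x) ∝ exp(-<theta, f(x)>).
    grad_x = theta[0] / x.size + 2.0 * theta[1] * x / x.size
    x = x - step_x * grad_x + np.sqrt(2.0 * step_x) * rng.standard_normal(x.size)
    # Minimization step: push the empirical statistics toward the target
    # by a stochastic gradient update on the multipliers.
    theta = theta + step_theta * (features(x) - target)
```

Alternating a single (or a few) Markov chain step(s) with a multiplier update, rather than sampling to convergence at each iteration, is what makes a non-asymptotic error analysis with dimension-dependent constants relevant here; the sketch only illustrates the structure of such a scheme.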

Speaker(s)

Valentin De Bortoli

CNRS, École normale supérieure