[Image: Spherical virtual bouquet composed with the help of a generative AI system (DALL·E via ChatGPT from Bing, Nov. 2023), with helical curves and semi-random growth. Credit: Lamiot.]

The modeling of large-scale data is essentially probabilistic. Learning a model, performing inference, and generating new data all require sampling from these probability distributions. Impressive results have been achieved using neural networks to generate images, sounds, texts, and physical fields. We will follow a path from the mathematical foundations to the algorithmic frontiers of random generation.

The lecture introduces the mathematical framework of Monte Carlo learning and statistical inference, as well as random generation algorithms. It considers Markov random field models, which capture the conditional independence structure of variables and are characterized by Gibbs energies. It then introduces sampling algorithms based on Markov chains, in particular the Metropolis-Hastings algorithm and Gibbs sampling. Sampling with the Langevin equation, derived from the Fokker-Planck equation, is discussed next. The lecture concludes with a presentation of score-based diffusion generation algorithms, which sample complex probability distributions by estimating the score with neural networks.
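As an illustration of the Markov chain samplers mentioned above, here is a minimal sketch of the random-walk Metropolis-Hastings algorithm in NumPy. The function names, the Gaussian proposal, and the step size are illustrative choices, not part of the lecture material; the target is specified only through its unnormalized log-density.

```python
import numpy as np

def metropolis_hastings(log_p, x0, n_steps, step=0.5, seed=0):
    """Random-walk Metropolis-Hastings: propose x' = x + step*noise and
    accept with probability min(1, p(x')/p(x)). The Gaussian proposal is
    symmetric, so the proposal ratio cancels in the acceptance rule."""
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float)
    lp = log_p(x)
    samples = np.empty((n_steps,) + x.shape)
    for t in range(n_steps):
        proposal = x + step * rng.standard_normal(x.shape)
        lp_prop = log_p(proposal)
        # Accept/reject in log space to avoid overflow of the density ratio.
        if np.log(rng.random()) < lp_prop - lp:
            x, lp = proposal, lp_prop
        samples[t] = x
    return samples

# Example target: a standard 1-D Gaussian, given only up to normalization.
samples = metropolis_hastings(lambda x: -0.5 * float(x @ x), np.zeros(1), 20000)
```

After discarding an initial burn-in, the empirical mean and standard deviation of the chain approximate those of the target distribution.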
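The Langevin sampling mentioned above can likewise be sketched with the unadjusted Langevin algorithm, which discretizes the Langevin diffusion dX = ∇log p(X) dt + √2 dW. The function names and the step size below are illustrative assumptions; the sampler only needs the score ∇log p, which is what neural networks estimate in score-based diffusion methods.

```python
import numpy as np

def langevin_sampler(score, x0, n_steps, eps=0.05, seed=0):
    """Unadjusted Langevin algorithm: Euler-Maruyama discretization
    x <- x + eps * score(x) + sqrt(2*eps) * noise, whose stationary
    distribution approaches p as the step size eps tends to zero."""
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float)
    samples = np.empty((n_steps,) + x.shape)
    for t in range(n_steps):
        x = x + eps * score(x) + np.sqrt(2 * eps) * rng.standard_normal(x.shape)
        samples[t] = x
    return samples

# Example: the score of a standard Gaussian is grad log p(x) = -x.
samples = langevin_sampler(lambda x: -x, np.zeros(2), 50000, eps=0.05)
```

Unlike Metropolis-Hastings, this chain has no accept/reject step, so the discretization introduces a small bias that vanishes as the step size decreases (or that a Metropolis correction would remove).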

Program