Amphithéâtre Marguerite de Navarre, Site Marcelin Berthelot
Open to all

The Metropolis-Hastings algorithm samples a probability distribution by defining an ergodic Markov chain whose unique invariant measure is that distribution. We begin by introducing the main properties of the transition probabilities of a Markov chain, which define a stochastic matrix, and consider examples of random walks. A Markov chain is used to transport an initial probability distribution towards the invariant measure of the chain. The chain is assumed to be reversible, that is, to satisfy the detailed balance equation. Convergence to a unique invariant measure is guaranteed by a theorem establishing the ergodicity of the chain under conditions of irreducibility and aperiodicity. The Metropolis-Hastings algorithm defines the transition probability of such a Markov chain by augmenting a proposal kernel with an acceptance-rejection step. The acceptance probability is adjusted so that the detailed balance equation holds, which guarantees that the chain converges to the correct invariant measure. The algorithm is very flexible, but convergence can be slow when the target distribution is multimodal (its energy landscape has several minima) and the positions of the modes are not known in advance.
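
As a minimal illustration of the acceptance-rejection step described above, the sketch below implements a random-walk Metropolis-Hastings sampler: with a symmetric proposal, accepting a move with probability min(1, pi(x')/pi(x)) enforces detailed balance pi(x) P(x, x') = pi(x') P(x', x). The bimodal Gaussian-mixture target, the proposal width, and the function names are illustrative assumptions, not part of the lecture.

```python
import numpy as np

def metropolis_hastings(log_target, proposal_step, x0, n_steps, rng=None):
    """Random-walk Metropolis-Hastings sampler (illustrative sketch).

    log_target: log of the (unnormalized) target density pi(x).
    proposal_step: standard deviation of the symmetric Gaussian proposal.
    With a symmetric proposal q(x'|x) = q(x|x'), accepting with probability
    min(1, pi(x')/pi(x)) satisfies the detailed balance equation, so pi is
    an invariant measure of the chain.
    """
    rng = np.random.default_rng() if rng is None else rng
    x = x0
    samples = np.empty(n_steps)
    for t in range(n_steps):
        # Propose a move from the symmetric random-walk kernel.
        x_new = x + proposal_step * rng.standard_normal()
        # Accept with probability min(1, pi(x_new)/pi(x)), computed in log space.
        if np.log(rng.uniform()) < log_target(x_new) - log_target(x):
            x = x_new
        samples[t] = x
    return samples

# Example: a bimodal target (mixture of two Gaussians), illustrating why
# mixing can be slow when the positions of the modes are not known in advance.
def log_target(x):
    return np.logaddexp(-0.5 * (x - 3.0) ** 2, -0.5 * (x + 3.0) ** 2)

samples = metropolis_hastings(log_target, proposal_step=1.0, x0=0.0, n_steps=50_000)
print(samples.mean(), samples.std())
```

With a small proposal step the chain rarely crosses the low-probability region between the two modes, which is the slow-convergence behaviour mentioned above; the sketch makes this easy to observe by comparing the empirical mean and spread of the samples with the symmetric target.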