Amphithéâtre Marguerite de Navarre, Site Marcelin Berthelot
Open to all

Abstract

Bayesian statistical inference generally requires simulating the statistical model's parameter from its posterior distribution in order to approximate the integrals of interest by Monte Carlo. These simulations become difficult when the density of the posterior distribution is not numerically accessible, even when supplemented with auxiliary variables, because the standard panoply of simulation techniques (Devroye, 1986) is then no longer available.
This is the case, for example, with likelihoods whose normalization constant is implicit, as in Gibbs models. The ABC (approximate Bayesian computation) alternative proposed by Tavaré et al. (1998) consists in simulating realizations of the model parameter from its prior distribution, then simulating realizations of the statistical model, and accepting only those parameters that produce realizations sufficiently close to the observed sample (Rubin, 1984). While this solution may seem rudimentary, since it replaces the observed sample with a neighbourhood of that sample in the inference, it nevertheless makes statistical inference possible in these implicit-likelihood settings, as in population genetics (Pudlo et al., 2015). In this lecture we show how the justification of the ABC method has been refined over the last two decades, guaranteeing convergence to the posterior distribution provided the distance to the observed sample decreases sufficiently fast with the size of that sample (Fearnhead & Li, 2018; Frazier et al., 2018).
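The rejection scheme described above can be sketched in a few lines. The toy model below (a normal mean with a normal prior, and the sample mean as summary statistic) is an illustrative assumption, not part of the lecture; in this conjugate setting the likelihood is of course tractable, which makes it convenient for checking that the ABC output behaves sensibly.

```python
import numpy as np

rng = np.random.default_rng(0)

# Assumed toy setting for illustration: data x_i ~ N(theta, 1),
# prior theta ~ N(0, 10). Any simulable model would do.
n = 50
theta_true = 2.0
observed = rng.normal(theta_true, 1.0, size=n)

def summary(x):
    # Summary statistic: here, simply the sample mean.
    return x.mean()

def abc_rejection(observed, n_sims=100_000, eps=0.05):
    """ABC rejection sampler in the spirit of Tavaré et al. (1998):
    draw theta from the prior, simulate a dataset, keep theta only if
    the simulated summary falls within eps of the observed one."""
    s_obs = summary(observed)
    accepted = []
    for _ in range(n_sims):
        theta = rng.normal(0.0, np.sqrt(10.0))            # prior draw
        sim = rng.normal(theta, 1.0, size=len(observed))  # model simulation
        if abs(summary(sim) - s_obs) <= eps:              # tolerance check
            accepted.append(theta)
    return np.array(accepted)

post = abc_rejection(observed)
```

The accepted parameters form an approximate posterior sample; shrinking `eps` (together with a growing simulation budget) tightens the approximation, which is precisely the regime studied in the convergence results cited in the abstract.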

Speaker(s)

Christian Robert

Professor at Paris-Dauphine University