Amphithéâtre Marguerite de Navarre, Site Marcelin Berthelot
Open to all

Abstract

The microcanonical ensembles envisaged by Boltzmann and formally introduced by Gibbs define a statistical model of a particle system by fixing its volume, energy and number of particles. The entropy of an isolated system increases with time, and it is shown to reach its maximum when the probability distribution over microstates is uniform. Under the assumption of weak coupling, this allows us to introduce the temperature of a system at equilibrium through the variation of entropy with respect to energy.
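
To fix notation, a minimal sketch of these definitions in standard statistical-physics notation (the symbols Ω, k_B, E, V, N are the usual ones and are not taken verbatim from the lecture):

```latex
% Microcanonical entropy of an isolated system with \Omega(E, V, N) accessible microstates,
% reached when the distribution over microstates is uniform, p_i = 1/\Omega:
S(E, V, N) = k_B \ln \Omega(E, V, N)

% Temperature at equilibrium, under weak coupling, read off the variation
% of entropy with respect to energy at fixed volume and particle number:
\frac{1}{T} = \left. \frac{\partial S}{\partial E} \right|_{V, N}
```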

The second part of the lecture revisits the notion of entropy from the point of view of information theory, in connection with coding. Microcanonical ensembles are replaced by typical sets, where most of the probability mass of a distribution is concentrated. We start from a probability distribution over a set of independent variables, whose joint probability density can be written as a product of the probability densities of the individual variables. Entropy is defined as the expectation of the negative log of the probability density. Typical sets are defined by the concentration of this normalized log-probability around its mean value, the entropy.
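
In the discrete, independent and identically distributed case, these definitions can be written compactly as follows (standard information-theoretic notation; the tolerance ε is not specified in the abstract):

```latex
% Joint density of independent variables factorizing as a product:
p(x_1, \dots, x_n) = \prod_{i=1}^{n} p(x_i)

% Entropy as the expectation of the negative log-probability:
H(X) = -\mathbb{E}\left[ \log p(X) \right] = -\sum_{x} p(x) \log p(x)

% Typical set: sequences whose normalized log-probability concentrates around H(X):
A_\epsilon^{(n)} = \Big\{ (x_1, \dots, x_n) : \Big| -\tfrac{1}{n} \log p(x_1, \dots, x_n) - H(X) \Big| \le \epsilon \Big\}
```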

The asymptotic equipartition theorem shows that the probability distribution restricted to a typical set converges to a uniform distribution as the number of variables (the dimension) increases. It is proved in the case of independent variables taking values in a finite set. An optimal coding algorithm is deduced, whose average number of bits per variable converges to the entropy as the dimension increases.
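
As an illustration of both statements, a minimal numerical sketch for independent Bernoulli variables (the parameter q, the tolerance eps and the chosen block lengths are illustrative choices, not taken from the lecture):

```python
from math import comb, log2

# Illustrative parameters: i.i.d. Bernoulli(q) variables over {0, 1}.
q, eps = 0.2, 0.05
H = -(q * log2(q) + (1 - q) * log2(1 - q))  # entropy in bits per variable

for n in (10, 100, 1000):
    # A binary sequence with k ones has probability q**k * (1 - q)**(n - k),
    # so its normalized log-probability depends only on k; group sequences by k.
    size = 0     # number of typical sequences |A_eps^(n)|
    mass = 0.0   # probability of the typical set
    for k in range(n + 1):
        rate = -(k * log2(q) + (n - k) * log2(1 - q)) / n
        if abs(rate - H) <= eps:  # AEP concentration condition
            size += comb(n, k)
            mass += comb(n, k) * (q ** k) * ((1 - q) ** (n - k))
    # log2(size)/n is the number of bits per variable needed to index typical sequences.
    print(f"n={n:5d}  P(typical) = {mass:.3f}  log2|A|/n = {log2(size)/n:.3f}  H = {H:.3f}")
```

As n grows, the typical set captures almost all of the probability while its sequences have nearly equal probabilities, so indexing them costs close to H bits per variable, which is the coding consequence stated above.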