Abstract
The entropy rate of a Markov chain is calculated as a function of its transition matrix. It is shown that the Kullback-Leibler divergence (relative entropy) between two probability distributions evolving under the same Markov chain never increases. When the stationary distribution of the chain is uniform, it follows that entropy increases along the chain over time, which establishes the second law of thermodynamics under this assumption.
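As an illustration (not part of the lecture itself), here is a minimal numerical sketch in Python, assuming a hypothetical two-state transition matrix P. It computes the entropy rate from the stationary distribution, and checks numerically that the Kullback-Leibler divergence between two distributions evolving under P is non-increasing:

```python
import numpy as np

# Hypothetical two-state transition matrix (rows sum to 1).
P = np.array([[0.9, 0.1],
              [0.2, 0.8]])

# Stationary distribution mu: left eigenvector of P for eigenvalue 1.
eigvals, eigvecs = np.linalg.eig(P.T)
mu = np.real(eigvecs[:, np.argmax(np.real(eigvals))])
mu = mu / mu.sum()

# Entropy rate H = -sum_i mu_i sum_j P_ij log P_ij (in nats).
entropy_rate = -np.sum(mu[:, None] * P * np.log(P))
print(f"entropy rate: {entropy_rate:.4f} nats")

def kl(p, q):
    """Kullback-Leibler divergence D(p || q), in nats."""
    return np.sum(p * np.log(p / q))

# Two arbitrary initial distributions evolving under the same chain:
# D(p_n || q_n) is non-increasing in n.
p = np.array([0.99, 0.01])
q = np.array([0.30, 0.70])
for n in range(6):
    print(f"step {n}: D(p||q) = {kl(p, q):.5f}")
    p, q = p @ P, q @ P  # one step of the chain: p_{n+1} = p_n P
```

Running the loop shows D(p||q) shrinking at every step, as the monotonicity result predicts.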
The second part of the lecture introduces the canonical ensembles of statistical physics. Instead of fixing the system's energy, we fix its temperature, so that the energy can fluctuate. Temperature is then redefined through the derivative of entropy with respect to energy, 1/T = ∂S/∂E. We show that the resulting probability distribution of energies is an exponential law called the Gibbs distribution.
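As a hedged sketch (again not from the lecture), the following Python snippet illustrates the Gibbs distribution p_i proportional to exp(-E_i/T) over a hypothetical set of discrete energy levels, in units where the Boltzmann constant is 1. As the temperature rises, the distribution flattens and both the mean energy and the entropy increase, consistent with 1/T = ∂S/∂E:

```python
import numpy as np

def gibbs(energies, beta):
    """Gibbs distribution p_i = exp(-beta * E_i) / Z over discrete levels."""
    weights = np.exp(-beta * energies)
    return weights / weights.sum()  # normalizing by the partition function Z

# Hypothetical energy levels of a small system (arbitrary units, k_B = 1).
E = np.array([0.0, 1.0, 2.0, 3.0])

for T in (0.5, 1.0, 5.0):
    p = gibbs(E, beta=1.0 / T)
    mean_energy = p @ E
    entropy = -np.sum(p * np.log(p))
    print(f"T={T}: p={np.round(p, 3)}, <E>={mean_energy:.3f}, S={entropy:.3f}")
```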