Amphithéâtre Marguerite de Navarre, Site Marcelin Berthelot
Open to all

Abstract

The definitions of Shannon entropy, Kullback-Leibler divergence, and Fano mutual information were used by Shannon in operational form to solve coding problems (compression and transmission). Other types of problems call on other notions, such as Fisher information for parametric estimation. Choosing a criterion such as entropy a priori is therefore not necessarily optimal for solving a given problem.
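For the reader's convenience, the classical quantities named above have the following standard definitions (recalled here in LaTeX for discrete distributions; they are not part of the original announcement):

H(X) = -\sum_x p(x) \log p(x)   (Shannon entropy)

D(p \,\|\, q) = \sum_x p(x) \log \frac{p(x)}{q(x)}   (Kullback-Leibler divergence)

I(X;Y) = H(X) - H(X|Y) = D(p_{XY} \,\|\, p_X \, p_Y)   (mutual information)

J(\theta) = \mathbb{E}\big[ \big( \partial_\theta \log p_\theta(X) \big)^2 \big]   (Fisher information, scalar parameter)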

The seminar focuses on measuring information leakage in cryptographic systems. Here the relevant criterion is the maximum a posteriori (MAP) probability of success, for which classical entropy has no real operational definition. We therefore present a theory of "alpha-information" based on Rényi entropy, Arimoto conditional entropy, and Sibson information. It encompasses classical information theory for alpha = 1, Hartley information for alpha = 0, and the MAP criterion in the limit of infinite alpha, while preserving the essential data-processing and Fano inequalities. These inequalities make it possible to bound the success of any attack exploiting measurements leaked through a side channel, relative to the attacker's a priori knowledge of the secret (without access to the measurements).
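As a minimal sketch (standard definitions, not part of the original announcement), for a secret X and leakage Y with joint distribution p(x, y), and alpha positive and different from 1, the alpha-quantities named above are:

H_\alpha(X) = \frac{1}{1-\alpha} \log \sum_x p(x)^\alpha   (Rényi entropy)

H_\alpha(X|Y) = \frac{\alpha}{1-\alpha} \log \sum_y \Big( \sum_x p(x,y)^\alpha \Big)^{1/\alpha}   (Arimoto conditional entropy)

I_\alpha(X;Y) = \frac{\alpha}{\alpha-1} \log \sum_y \Big( \sum_x p(x) \, p(y|x)^\alpha \Big)^{1/\alpha}   (Sibson information)

As alpha tends to 1, these recover H(X), H(X|Y) and I(X;Y); for alpha = 0, H_0(X) = \log |\{x : p(x) > 0\}| is the Hartley information; and as alpha tends to infinity, H_\infty(X|Y) = -\log P_s, where P_s = \sum_y \max_x p(x,y) is the success probability of a MAP attacker observing Y. The data-processing inequality takes the form I_\alpha(X;Z) \le I_\alpha(X;Y) for any Markov chain X - Y - Z. One instance of the Fano-type bound, following directly from the definitions in the limiting case alpha = infinity, is

P_s \le \big( \max_x p(x) \big) \exp\!\big( I_\infty(X;Y) \big),

i.e., the side-channel measurements can improve the blind MAP guess \max_x p(x) (made without access to the measurements) by a factor of at most \exp(I_\infty(X;Y)).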

Speaker(s)

Olivier Rioul

Professor, Telecom Paris