Amphithéâtre Maurice Halbwachs, Site Marcelin Berthelot
Open to all

Computational complexity is the study of the resources required to achieve specified computational goals. Perhaps because the subject had its roots in logic, it was decision problems that classically provided the focus for study, with the theory of NP-completeness being a notable highlight. Very soon, prompted by applications, the scope of computational complexity widened to encompass, for example, optimization problems.

Many computational tasks involve measurement in some form. Motivated by applications in statistical physics and Bayesian statistics, for example, computational complexity turned its attention to counting problems, widely interpreted. Counting problems can rarely be solved exactly, so it is more fruitful to examine the possibility of approximate counting, and the intimately related task of sampling (the computational complexity of sampling configurations of a physical model is tightly related to that of estimating its partition function). I will argue that the subject is at an interesting juncture: there are many notable successes to celebrate, but at the same time many open problems that appear embarrassingly simple remain to taunt us.

Speaker(s)

Mark Jerrum

Queen Mary, University of London