
The lecture "Thinking, modeling and mastering computation" was given as part of the annual Computer Science Chair. It was the very first lecture of this Chair, created following my previous lecture "Why and how the world is going digital", itself given in 2007-2008 as part of the annual Liliane Bettencourt Chair of Technological Innovation. The lecture was devoted to the fundamentals of automatic computation, the kind performed by computers large and small in all digitized devices, from mega-computers and networks to the microchips embedded in objects of all kinds. As I explained in 2007-2008, computing is a difficult art, at least if one wants to distinguish between applications that really work and those that only just about work. The latter are characterized by residual bugs that induce aberrant behavior, with more or less serious consequences depending on the type of system. To build applications that really work, we need to reason about the calculations they perform using mathematical models, just as we reason about physical objects.

However, because of the size and complexity of circuits and programs, we quickly realize that reasoning on paper is not enough. We need to introduce the key notion of calculating about calculating, so that the reasoning can be carried out, or at least verified, by the machines themselves, with the aim of automatic or assisted formal verification of programs and circuits. But to work in this way, we must first formalize the notion of calculation itself, or rather the notions of calculation, given the recent progress made in the discipline.
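To give a concrete flavor of what machine-checked reasoning about a program looks like, here is a minimal sketch in the Lean 4 proof assistant, assuming its Mathlib library; the tool and the example are illustrative choices of mine, not ones prescribed by the lecture. A small recursive program and its specification are both written down formally, and the machine itself checks the inductive proof that the program meets the specification.

```lean
import Mathlib.Tactic

-- A small recursive program computing 0 + 1 + ... + n.
def sum : Nat → Nat
  | 0     => 0
  | n + 1 => sum n + (n + 1)

-- Its specification, stated over Nat without division:
-- twice the sum equals n * (n + 1).
theorem two_mul_sum (n : Nat) : 2 * sum n = n * (n + 1) := by
  induction n with
  | zero => rfl                       -- base case: 2 * 0 = 0 * 1
  | succ k ih =>                      -- inductive step, with hypothesis ih
    calc 2 * sum (k + 1)
        = 2 * sum k + 2 * (k + 1)   := by simp only [sum]; ring
      _ = k * (k + 1) + 2 * (k + 1) := by rw [ih]
      _ = (k + 1) * (k + 1 + 1)     := by ring
```

The point is precisely the "calculating about calculating" mentioned above: the proof checker is itself a program that computes over the formal proof, so the verification is performed by the machine rather than on paper.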

Program