Belgium
Royal College of Belgium
On January 15, 2020, a lecture on "Multiscale models and convolutional neural networks".
Deep neural networks have made spectacular progress in solving complex problems such as image, sound and language recognition, or problems in physics. They have played an important role in the revival of artificial intelligence. However, we understand very little about how they work, which raises numerous issues of robustness and explainability.
Recognizing or classifying data amounts to approximating phenomena that depend on a very large number of variables. The combinatorial explosion of possibilities makes this problem extremely difficult, and potentially impossible. If learning is possible at all, it is because there is underlying structure that limits this complexity, and neural networks seem able to capture it. Understanding this "architecture of complexity" calls on many branches of mathematics. Multiscale and sparse representations and the existence of symmetries play an important role, which will be discussed in connection with the wavelet transform. These problems are illustrated by applications in physics, classification and image generation.
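To give a concrete sense of the multiscale, sparse representations mentioned above, here is a minimal sketch (not taken from the lecture itself) of a one-level 2D Haar wavelet transform, written in Python with only NumPy. The function name haar_dwt2 and the random test image are illustrative choices, and the orientation labels of the detail bands follow one common convention.

import numpy as np

def haar_dwt2(x):
    """One level of the 2D Haar wavelet transform.

    Splits an image into a coarse approximation and three oriented detail
    bands. For natural images the detail bands are typically sparse, one
    face of the structure that limits complexity.
    """
    # Average and difference adjacent pixel pairs along rows;
    # dividing by sqrt(2) keeps the transform orthonormal.
    lo_r = (x[:, 0::2] + x[:, 1::2]) / np.sqrt(2)
    hi_r = (x[:, 0::2] - x[:, 1::2]) / np.sqrt(2)
    # Repeat along columns to obtain the four half-resolution bands.
    approx   = (lo_r[0::2, :] + lo_r[1::2, :]) / np.sqrt(2)
    detail_1 = (hi_r[0::2, :] + hi_r[1::2, :]) / np.sqrt(2)
    detail_2 = (lo_r[0::2, :] - lo_r[1::2, :]) / np.sqrt(2)
    detail_3 = (hi_r[0::2, :] - hi_r[1::2, :]) / np.sqrt(2)
    return approx, (detail_1, detail_2, detail_3)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    image = rng.standard_normal((8, 8))   # placeholder for a real image
    approx, details = haar_dwt2(image)
    # Each band has half the resolution of the input: 4x4 here.
    print(approx.shape, [d.shape for d in details])

Applying haar_dwt2 recursively to the approximation band yields the multiscale pyramid that wavelet representations, and by analogy the successive layers of a convolutional network, build on.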