
Deep neural networks have spectacular applications in a wide variety of fields, including computer vision, speech recognition, natural language processing, robotics, the prediction of physical phenomena, medical diagnostics, and strategy games such as Go. This first lecture course on neural networks presents their applications, the architectures of these networks, the algorithms for optimizing their parameters, and finally the mathematical questions raised by optimization and by the generalization ability of neural networks. We will see that known theorems answer these questions only in simplified cases, often far removed from the conditions under which neural networks are actually used. The mathematical understanding of deep neural networks therefore remains essentially an open problem. In addition to the data challenges, the seminars are devoted to specific applications of deep neural networks.

The lecture will cover the following topics in turn:

  • Neural network applications in vision, hearing, physics, natural language, etc.
  • Dimensionality reduction: symmetries, multiscale decompositions and sparsity.
  • The origins of neural networks: cybernetics and the perceptron.
  • Universality of a two-layer network.
  • Approximating functions: the curse of dimensionality.
  • Approximations with multi-layer networks.
  • Learning a network: cost functions.
  • Stochastic gradient descent optimization.
  • Backpropagation algorithm.
  • Convolutional network architecture.
  • Multiscale analysis and wavelets.
  • Symmetries, invariants and sparsity in deep networks.
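Several of the topics above (two-layer networks, cost functions, stochastic gradient descent, backpropagation) can be illustrated together in a minimal sketch: a two-layer network trained by SGD with hand-written backpropagation on the XOR problem, which no single linear layer can represent. The architecture, learning rate and step count below are illustrative choices, not taken from the lecture.

```python
# Minimal sketch: a two-layer network (hidden tanh layer + linear output)
# trained on XOR with a squared-error cost, stochastic gradient descent,
# and manually derived backpropagation. Hyperparameters are illustrative.
import numpy as np

rng = np.random.default_rng(0)

# XOR data: not representable by any single linear (one-layer) model.
X = np.array([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])
y = np.array([[0.], [1.], [1.], [0.]])

# Parameters of the two-layer network.
W1 = rng.normal(0.0, 1.0, (2, 8)); b1 = np.zeros(8)
W2 = rng.normal(0.0, 1.0, (8, 1)); b2 = np.zeros(1)

lr = 0.2
for step in range(5000):
    i = rng.integers(0, 4)              # "stochastic": one sample per step
    x, t = X[i:i + 1], y[i:i + 1]
    # Forward pass.
    h = np.tanh(x @ W1 + b1)            # hidden layer
    out = h @ W2 + b2                   # linear output
    # Backpropagation of the squared-error cost L = 0.5 * (out - t)^2.
    d_out = out - t                     # dL/d(out)
    d_W2 = h.T @ d_out
    d_b2 = d_out.sum(axis=0)
    d_h = d_out @ W2.T
    d_pre = d_h * (1.0 - h ** 2)        # tanh'(u) = 1 - tanh(u)^2
    d_W1 = x.T @ d_pre
    d_b1 = d_pre.sum(axis=0)
    # SGD update.
    W2 -= lr * d_W2; b2 -= lr * d_b2
    W1 -= lr * d_W1; b1 -= lr * d_b1

pred = np.tanh(X @ W1 + b1) @ W2 + b2   # predictions on the four inputs
```

After training, `pred` should be close to the XOR targets (0, 1, 1, 0); with more input dimensions, the number of samples needed to approximate a generic function this way grows rapidly, which is one face of the curse of dimensionality discussed in the lecture.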

Program