Amphithéâtre Marguerite de Navarre, Site Marcelin Berthelot
Open to all

Abstract

Neural network learning generalizes remarkably well on problems as diverse as image, sound and language recognition, prediction in physics, and medical diagnosis. This suggests that these problems share similar regularities. In 1962, Herbert Simon's paper "The Architecture of Complexity" defined some generic properties of complex systems. The lecture began with a presentation of this article, whose analysis stems from cybernetics and is based on a model of a dynamical system with a feedback loop. The essential properties it highlights are the existence of a hierarchical organization, the regularity imposed by stability over the course of temporal evolution, and the separability of system components that interact only weakly. These properties can be found in neurophysiological, biological, physical and symbolic systems.

In connection with this analysis, the lecture addressed the hierarchical, multi-scale architecture of convolutional neural networks, which are themselves structured by a priori information. Understanding these networks requires analyzing three aspects: estimation, optimization and approximation. Estimation consists of identifying the best model within a class defined a priori by the architecture; the properties of estimators are briefly reviewed. Estimation is performed by minimizing a risk function, and this minimization is carried out by an optimization algorithm, in particular stochastic gradient descent, which adjusts the network parameters.
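As a rough illustration of how these three aspects fit together, the sketch below is a minimal example, not taken from the lecture, written assuming PyTorch. The network architecture is a hypothetical hierarchical, multi-scale convolutional model (it defines the approximation class a priori), the empirical risk is a cross-entropy loss, and stochastic gradient descent adjusts the parameters; the data here is random and stands in for real images and labels.

```python
# Minimal sketch (assumed PyTorch API): approximation class fixed by the
# architecture, empirical risk, and stochastic gradient descent.
import torch
import torch.nn as nn

# Hypothetical hierarchical, multi-scale architecture: each convolution
# plus pooling stage halves the spatial resolution, extracting features
# scale by scale.
model = nn.Sequential(
    nn.Conv2d(1, 8, kernel_size=3, padding=1), nn.ReLU(), nn.AvgPool2d(2),
    nn.Conv2d(8, 16, kernel_size=3, padding=1), nn.ReLU(), nn.AvgPool2d(2),
    nn.Flatten(),
    nn.Linear(16 * 7 * 7, 10),  # assumes 28x28 inputs and 10 classes
)

risk = nn.CrossEntropyLoss()                       # empirical risk function
opt = torch.optim.SGD(model.parameters(), lr=0.1)  # stochastic gradient descent

# Dummy mini-batch standing in for real training data.
x = torch.randn(32, 1, 28, 28)
y = torch.randint(0, 10, (32,))

for step in range(100):
    opt.zero_grad()
    loss = risk(model(x), y)  # risk of the current parameter estimate
    loss.backward()           # backpropagation computes the gradient
    opt.step()                # one gradient step adjusts the parameters
```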
