Data Science Seminar Series: Romain Couillet from CentraleSupélec
on December 14, 2017
15:30 to 17:30
The Data Science Seminar Series brings high-profile researchers to Université Grenoble Alpes to show the increasing role of data science in modern research. It is open to all researchers of Université Grenoble Alpes. For the 5th seminar, we will welcome Romain Couillet from CentraleSupélec, Université Paris-Saclay.
Romain Couillet will give a talk on “Neural Networks and Random Matrix Theory”. The event will take place on Thursday 14 December at the IMAG building and will be followed by a coffee break.
In this talk, some elementary properties and basic tools of random matrix theory will be introduced, with emphasis on the relevance of this theory to statistical learning in high dimension. These notions will be illustrated through the performance analysis of simple neural networks. This will shed new light on the impact of the nonlinear activation functions used in the network, as well as on some fundamental limits of elementary model settings in high-dimensional spaces.
Related articles
- Application de la théorie des grandes matrices aléatoires à l'apprentissage pour les mégadonnées
- Une Analyse des Méthodes de Projections Aléatoires par la Théorie des Matrices Aléatoires
- A random matrix and concentration inequalities framework for neural networks analysis
- Harnessing neural networks: a random matrix approach
- Signal Processing in Large Systems: A New Paradigm
- Neural networks and random matrix theory (slides)
Published on December 18, 2017
Practical information
Location
IMAG building, seminar room 1