CREST Working Papers Series No. 2021-14
by Enzo Dia and Jacques Melitz
FUNDAMENTAL LIMITS OF DEEP NEURAL NETWORK LEARNING – Helmut Bölcskei, ETH Zurich
SCHEDULE:
Thursday 6 January 2022, from 2 PM to 4:30 PM
Tuesday 11 January 2022, from 3 PM to 5:30 PM
Thursday 13 January 2022, from 2 PM to 4:30 PM
Tuesday 18 January 2022, from 3 PM to 5:30 PM
SUMMARY:
This short course develops the fundamental limits of deep neural network learning from first principles by characterizing what is possible when no constraints are imposed on the learning algorithm or on the amount of training data. Concretely, we consider Kolmogorov-optimal approximation through deep neural networks, with the guiding theme being the relation between the complexity of the function (class) to be approximated and the complexity of the approximating network, measured in terms of connectivity and the memory required to store the network topology and the associated quantized weights.

The theory we develop reveals remarkable universality properties of deep networks. Specifically, deep networks are optimal approximants for markedly different function classes such as affine (i.e., wavelet-like) systems and Weyl-Heisenberg systems. This universality is afforded by a concurrent invariance of deep networks to time-shifts, scalings, and frequency-shifts. In addition, deep networks provide exponential approximation accuracy, meaning that the approximation error decays exponentially in the number of non-zero weights in the network, for the multiplication operation, polynomials, sinusoidal functions, certain smooth functions, and even one-dimensional oscillatory textures and fractal functions such as the Weierstrass function; for the latter two, no previously known method achieves exponential approximation accuracy. We also show that in the approximation of sufficiently smooth functions, finite-width deep networks require strictly smaller connectivity than finite-depth wide networks.
The mathematical concepts forming the basis of this theory, namely metric entropy, linear and nonlinear approximation theory, best M-term approximation, and the theory of frames, will all be developed in the course.
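As a concrete illustration of the exponential approximation accuracy mentioned above, the sketch below (a NumPy illustration, not part of the course materials) implements the well-known sawtooth construction due to Yarotsky for approximating x² on [0, 1]: composing a piecewise-linear "hat" function with itself m times yields an approximant whose uniform error is at most 4^-(m+1), i.e., the error decays exponentially in the depth of the composition.

```python
import numpy as np

def hat(x):
    # Triangular "tooth" g(x) = 2*min(x, 1-x) on [0, 1]; it is piecewise
    # linear, so a small ReLU network realizes it exactly.
    return 2.0 * np.minimum(x, 1.0 - x)

def approx_square(x, m):
    # Sawtooth construction: x^2 ~ x - sum_{s=1}^m g_s(x) / 4^s,
    # where g_s is the s-fold composition of the hat function.
    # The result is the piecewise-linear interpolant of x^2 on the
    # dyadic grid k / 2^m, with uniform error at most 4^-(m+1).
    g, out = x.copy(), x.copy()
    for s in range(1, m + 1):
        g = hat(g)
        out = out - g / 4.0**s
    return out

x = np.linspace(0.0, 1.0, 2001)
err_by_m = {m: float(np.max(np.abs(approx_square(x, m) - x**2)))
            for m in (2, 4, 6)}
# Each additional composition step shrinks the error by a factor of 4,
# so accuracy improves exponentially in depth (and hence in the number
# of non-zero weights).
```

Since each composition step adds only a constant number of neurons, the error decays exponentially in the network size, which is the phenomenon the summary refers to for multiplication and polynomials.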
LITERATURE:
https://www.mins.ee.ethz.ch/pubs/p/deep-it-2019
https://www.mins.ee.ethz.ch/pubs/p/deep-approx-18
https://www.mins.ee.ethz.ch/pubs/p/frameschapter
How Serious is the Measurement-Error Problem in Risk-Aversion Tasks?
CREST Working Papers Series No. 2021-13
by Fabien Perez and Guillaume Hollard and Radu Vranceanu
Legislators in the Crossfire: The Effect of Transparency on Parliamentary Voting
CREST Working Papers Series No. 2021-12
Note IPP n°74: Tax Reforms and Political Feasibility
Note written by Felix Bierbrauer, Pierre C. Boyer, Andrew Lonsdale, Andreas Peichl – September 2021
Michelangelo Rossi receives the CCIA & CRESSE & CPI Award for his paper
Washington — The Computer & Communications Industry Association presented an award to a young researcher at the CRESSE 2021 Conference that took place in Crete Sept. 3-5.
CRESSE & CPI announced the award went to Michelangelo Rossi, Digital Economist and Assistant Professor at Télécom Paris, Institut Polytechnique de Paris. Rossi received the award for the Best Paper on the Digital Economy at the conference dinner Sunday. Rossi’s paper, “Competition and Reputation in a Congested Marketplace: Theory and Evidence from Airbnb,” explores how regulatory changes to Airbnb entry costs in San Francisco affected competition and hosts’ reputational incentives to exert effort.
https://ccianet.org/news/2021/09/ccia-sponsors-an-award-at-the-cresse-conference/
Dynamic assignment without money: Optimality of spot mechanisms
CREST Working Papers Series No. 2021-11
by Julien Combe and Vladyslav Nora and Olivier Tercieux
It’s a match! How Economists improve mechanisms for kidney exchange
An interview with Julien Combe by TWS Partners – July 2021 – 37 min
ICML 2021: Congratulations to our Researchers
CREST papers accepted at the International Conference on Machine Learning (ICML)
CREST is pleased to present the work of its researchers and professors at the 38th International Conference on Machine Learning (ICML), held this week (July 18-24, 2021).
The ICML conference is world-renowned for presenting and publishing cutting-edge research on all aspects of machine learning, and is one of the fastest growing AI conferences in the world.
Congratulations to our researchers Marco Cuturi, Anna Korba, Vianney Perchet, Flore Sentenac, Meyer Scetbon (ENSAE Paris, Institut Polytechnique de Paris) and Romaric Gaudel (ENSAI Rennes).
https://www.hi-paris.fr/2021/07/10/hi-paris-at-icml-2021/
Note IPP n°73: Understanding Secular Stagnation
Note written by Jean-Baptiste Michau – July 2021