FUNDAMENTAL LIMITS OF DEEP NEURAL NETWORK LEARNING – Helmut Bölcskei, ETH Zurich



SCHEDULE
Thursday, 6 January 2022, from 2 PM to 4:30 PM
Tuesday, 11 January 2022, from 3 PM to 5:30 PM
Thursday, 13 January 2022, from 2 PM to 4:30 PM
Tuesday, 18 January 2022, from 3 PM to 5:30 PM

SUMMARY:
This short course develops the fundamental limits of deep neural network learning from first principles by characterizing what is possible when no constraints are imposed on the learning algorithm or on the amount of training data. Concretely, we consider Kolmogorov-optimal approximation through deep neural networks, with the guiding theme being a relation between the complexity of the function (class) to be approximated and the complexity of the approximating network, measured in terms of connectivity and memory requirements for storing the network topology and the associated quantized weights.

The theory we develop reveals remarkable universality properties of deep networks. Specifically, deep networks are optimal approximants for markedly different function classes such as affine (i.e., wavelet-like) systems and Weyl-Heisenberg systems. This universality is afforded by a concurrent invariance property of deep networks to time shifts, scalings, and frequency shifts. In addition, deep networks provide exponential approximation accuracy, meaning that the approximation error decays exponentially in the number of nonzero weights in the network, for the multiplication operation, polynomials, sinusoidal functions, certain smooth functions, and even one-dimensional oscillatory textures and fractal functions such as the Weierstrass function; for the latter two, no methods achieving exponential approximation accuracy were previously known. We also show that, in the approximation of sufficiently smooth functions, finite-width deep networks require strictly smaller connectivity than finite-depth wide networks.
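To make the exponential-accuracy claim concrete, the sketch below emulates the standard sawtooth construction (going back to Telgarsky and Yarotsky; constructions of this type underlie the multiplication results in the papers listed under Literature) for approximating x^2 on [0, 1] with a deep ReLU network. This is an illustrative NumPy emulation of the network's input-output map, not code from the course.

```python
import numpy as np

def relu(x):
    return np.maximum(x, 0.0)

def hat(x):
    # Tent function on [0, 1], realized exactly by one ReLU layer:
    # g(x) = 2 ReLU(x) - 4 ReLU(x - 1/2) + 2 ReLU(x - 1)
    return 2 * relu(x) - 4 * relu(x - 0.5) + 2 * relu(x - 1.0)

def square_approx(x, k):
    # Depth-k ReLU approximation of x^2 on [0, 1] via the sawtooth
    # construction: f_k(x) = x - sum_{j=1}^k g^{(j)}(x) / 4^j, where
    # g^{(j)} is the j-fold composition of the hat function. The
    # sup-norm error is bounded by 4^{-(k+1)}, i.e., it decays
    # exponentially in depth, while the number of nonzero weights in
    # the underlying network grows only linearly in k.
    out = x.copy()
    g = x.copy()
    for j in range(1, k + 1):
        g = hat(g)  # j-fold composition: sawtooth with 2^(j-1) teeth
        out -= g / 4.0**j
    return out

x = np.linspace(0.0, 1.0, 10001)
for k in (2, 4, 6, 8):
    err = np.max(np.abs(x**2 - square_approx(x, k)))
    print(f"k = {k}: sup-error = {err:.2e} (bound 4^-(k+1) = {4.0**-(k+1):.2e})")
```

Increasing k by one adds only a constant number of nonzero weights to the underlying network yet divides the sup-norm error by four, which is exactly the trade-off between approximation error and connectivity described above.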

The mathematical concepts forming the basis of this theory, namely metric entropy, linear and nonlinear approximation theory, best M-term approximation, and the theory of frames, will all be developed in the course.
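For orientation, two of these quantities admit compact definitions; the notation below is generic and chosen for illustration rather than taken from the course notes.

```latex
% Best M-term approximation error of f in a dictionary D = (phi_i)_{i in N}
% of a Hilbert space H: the smallest error achievable using at most M
% dictionary elements with arbitrary coefficients.
\Gamma_M(f, \mathcal{D}) := \inf_{\substack{\Lambda \subset \mathbb{N},\, |\Lambda| \le M \\ (c_i)_{i \in \Lambda} \subset \mathbb{C}}}
  \Big\lVert f - \sum_{i \in \Lambda} c_i \varphi_i \Big\rVert_{\mathcal{H}}

% Metric entropy of a compact function class C in a metric space (X, d):
% the binary logarithm of the covering number N(eps; C), the minimal
% number of balls of radius eps needed to cover C.
H(\varepsilon; \mathcal{C}) := \log_2 N(\varepsilon; \mathcal{C})
```

Kolmogorov-optimal approximation then relates the fastest possible decay of the approximation error in the number of nonzero network weights to the metric entropy of the function class under consideration.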

LITERATURE:
https://www.mins.ee.ethz.ch/pubs/p/deep-it-2019
https://www.mins.ee.ethz.ch/pubs/p/deep-approx-18
https://www.mins.ee.ethz.ch/pubs/p/frameschapter

Michelangelo Rossi receives the CCIA & CRESSE & CPI Award for his paper


Washington — The Computer & Communications Industry Association (CCIA) presented an award to a young researcher at the CRESSE 2021 Conference, which took place in Crete, Sept. 3-5.

CRESSE & CPI announced that the award went to Michelangelo Rossi, digital economist and Assistant Professor at Télécom Paris, Institut Polytechnique de Paris. Rossi received the award for the Best Paper on the Digital Economy at the conference dinner on Sunday. His paper, “Competition and Reputation in a Congested Marketplace: Theory and Evidence from Airbnb,” explores how San Francisco regulatory changes to Airbnb entry costs affect competition and the reputational incentives for hosts to exert effort.

https://ccianet.org/news/2021/09/ccia-sponsors-an-award-at-the-cresse-conference/

ICML 2021: Congratulations to our Researchers


CREST papers accepted at the International Conference on Machine Learning (ICML)

CREST is pleased to present the work of its researchers and professors at the 38th International Conference on Machine Learning (ICML), held this week (July 18-24, 2021).

The ICML conference is world-renowned for presenting and publishing cutting-edge research on all aspects of machine learning, and it is one of the fastest-growing AI conferences in the world.

Congratulations to our researchers Marco Cuturi, Anna Korba, Vianney Perchet, Flore Sentenac, and Meyer Scetbon (ENSAE Paris, Institut Polytechnique de Paris), and Romaric Gaudel (ENSAI Rennes).

https://www.hi-paris.fr/2021/07/10/hi-paris-at-icml-2021/