FUNDAMENTAL LIMITS OF DEEP NEURAL NETWORK LEARNING – Helmut Bölcskei, ETH Zurich



SCHEDULE
Thursday,  6 January 2022, 2 PM to 4:30 PM
Tuesday,  11 January 2022, 3 PM to 5:30 PM
Thursday, 13 January 2022, 2 PM to 4:30 PM
Tuesday,  18 January 2022, 3 PM to 5:30 PM

SUMMARY:
This short course develops the fundamental limits of deep neural network learning from first principles by characterizing what is possible when no constraints are imposed on the learning algorithm or on the amount of training data. Concretely, we consider Kolmogorov-optimal approximation through deep neural networks, with the guiding theme being a relation between the complexity of the function (class) to be approximated and the complexity of the approximating network, measured in terms of connectivity and the memory required to store the network topology and the associated quantized weights. The theory we develop educes remarkable universality properties of deep networks. Specifically, deep networks are optimal approximants for markedly different function classes such as affine (i.e., wavelet-like) systems and Weyl-Heisenberg systems. This universality is afforded by a concurrent invariance property of deep networks to time-shifts, scalings, and frequency-shifts. In addition, deep networks provide exponential approximation accuracy (i.e., the approximation error decays exponentially in the number of non-zero weights in the network) for the multiplication operation, polynomials, sinusoidal functions, certain smooth functions, and even one-dimensional oscillatory textures and fractal functions such as the Weierstrass function; for the latter two, no methods achieving exponential approximation accuracy were previously known. We also show that, in the approximation of sufficiently smooth functions, finite-width deep networks require strictly smaller connectivity than finite-depth wide networks.
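
As a concrete illustration of the exponential-accuracy claim for the multiplication operation, the following minimal NumPy sketch implements the well-known sawtooth ReLU construction for approximating x^2 on [0, 1] (squaring yields multiplication via the polarization identity xy = ((x+y)^2 - (x-y)^2)/4). The script and its function names are illustrative only and are not taken from the course materials or the referenced papers.

    import numpy as np

    def relu(x):
        return np.maximum(x, 0.0)

    def hat(x):
        # Triangle ("hat") function on [0, 1], realized with three ReLU units:
        # hat(x) = 2x on [0, 1/2] and 2(1 - x) on [1/2, 1].
        return 2.0 * relu(x) - 4.0 * relu(x - 0.5) + 2.0 * relu(x - 1.0)

    def square_approx(x, depth):
        # Sawtooth-type ReLU approximation of x**2 on [0, 1]:
        #   x**2  ~  x - sum_{s=1}^{depth} g_s(x) / 4**s,
        # where g_s is the s-fold composition of the hat function. The uniform
        # error is bounded by 2**(-2*depth - 2), i.e., it decays exponentially
        # in the number of nonzero weights, which grows only linearly in depth.
        x = np.asarray(x, dtype=float)
        out = x.copy()
        g = x
        for s in range(1, depth + 1):
            g = hat(g)                  # sawtooth with 2**s teeth
            out = out - g / 4.0 ** s    # subtract the s-th correction term
        return out

    if __name__ == "__main__":
        xs = np.linspace(0.0, 1.0, 1001)
        for depth in (2, 4, 8):
            err = np.max(np.abs(square_approx(xs, depth) - xs ** 2))
            print(f"depth {depth:2d}: max error {err:.2e} "
                  f"(bound {2.0 ** (-2 * depth - 2):.2e})")

Running the script shows the maximum error dropping by roughly a factor of 16 each time the depth is increased by 2, in line with the stated exponential decay.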

The mathematical concepts forming the basis of this theory, namely metric entropy, linear and nonlinear approximation theory, best M-term approximation, and the theory of frames, will all be developed in the course.
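
For orientation, here is a compact statement of two of these notions in standard form; the notation is illustrative and not necessarily that used in the course.

    For a compact set $\mathcal{C}$ in a metric space $(\mathcal{X}, \rho)$, the covering number
    $N(\epsilon; \mathcal{C}, \rho)$ is the smallest number of $\rho$-balls of radius $\epsilon$
    needed to cover $\mathcal{C}$, and the metric entropy is
    \[
      H(\epsilon; \mathcal{C}, \rho) := \log_2 N(\epsilon; \mathcal{C}, \rho).
    \]
    Given a dictionary $\mathcal{D} = (\varphi_i)_{i \in \mathbb{N}} \subset L^2(\Omega)$, the best
    $M$-term approximation error of $f \in L^2(\Omega)$ in $\mathcal{D}$ is
    \[
      \Gamma_M(f, \mathcal{D}) := \inf_{\substack{I \subset \mathbb{N},\, |I| = M \\ (c_i)_{i \in I} \subset \mathbb{R}}}
      \Big\| f - \sum_{i \in I} c_i \varphi_i \Big\|_{L^2(\Omega)}.
    \]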

LITERATURE:
https://www.mins.ee.ethz.ch/pubs/p/deep-it-2019
https://www.mins.ee.ethz.ch/pubs/p/deep-approx-18
https://www.mins.ee.ethz.ch/pubs/p/frameschapter