ADVANCED METHODS IN COMPUTATIONAL ECONOMICS – Simon Scheidegger, HEC Lausanne
RISK SHARING AND (RE)INSURANCE – Mario Ghossoub, University of Waterloo
SCHEDULE:
- Thursday, 7 April 2022, from 1:30 PM to 4:45 PM
- Monday, 11 April 2022, from 1:30 PM to 4:45 PM
- Thursday, 14 April 2022, from 1:30 PM to 4:45 PM
SUMMARY:
This course will deal with the state of the art in the theory of Pareto-optimal (re)insurance design under model uncertainty and/or non-Expected-Utility preferences, and will provide an introduction to Pareto optimality in problems of peer-to-peer collaborative insurance. Specifically:
- We will start with some background material on the theory of decision-making under uncertainty, from the classical work on Expected-Utility Theory (EUT) of von Neumann and Morgenstern, Savage, and De Finetti, to the more recent work on ambiguity and probability distortions (Quiggin, Yaari, Schmeidler, Gilboa, Amarante, Maccheroni-Marinacci).
- We will cover some required mathematical tools, such as non-additive measure theory, probability distortions, Choquet integration, the theory of equimeasurable rearrangements, as well as risk measures, their properties, and their representations. We will pay special attention to distortion risk measures and spectral risk measures.
- We will then formally introduce a general model of the insurance market, following the work of Carlier and Dana [8], and study the existence and characterization of Pareto optima in this general setup. As a special case, we will consider the classical formulation of the optimal (re)insurance problem due to Arrow, in the framework of EUT, as well as more recent work extending Arrow’s setting to situations of belief heterogeneity between the two agents.
- We will then proceed to formulate several problems that extend the insurance model above to more general settings with non-EUT preferences, distortion risk measures, or situations of model uncertainty.
- Finally, we will conclude with a brief introduction to conditional mean risk sharing and peer-to-peer collaborative insurance.
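As a concrete illustration of the distortion risk measures mentioned above, the following sketch (our own, not taken from the course materials) evaluates the Choquet integral of an empirically distributed loss under a distortion function g. The standard textbook choices g(u) = u and g(u) = min(u/α, 1) recover the expected value and the Tail Value-at-Risk at level α, respectively.

```python
# Sketch (illustrative, not from the course): an empirical distortion risk
# measure, i.e., the Choquet integral of a loss sample with respect to the
# distorted probability g ∘ P, for an equally weighted sample.

def distortion_risk_measure(losses, g):
    """Choquet integral of the empirical loss distribution under a
    distortion function g (increasing, with g(0) = 0 and g(1) = 1)."""
    n = len(losses)
    ordered = sorted(losses, reverse=True)  # largest losses receive distorted weight first
    return sum(x * (g(i / n) - g((i - 1) / n))
               for i, x in enumerate(ordered, start=1))

# g(u) = u recovers the plain expected value;
# g(u) = min(u / alpha, 1) yields the Tail Value-at-Risk at level alpha.
mean_g = lambda u: u
tvar_g = lambda u: min(u / 0.5, 1.0)  # alpha = 0.5

losses = [4.0, 3.0, 2.0, 1.0]
print(distortion_risk_measure(losses, mean_g))  # 2.5 (sample mean)
print(distortion_risk_measure(losses, tvar_g))  # 3.5 (mean of the worst half)
```

Concave distortions g overweight the tail of the loss distribution, which is what makes this family (including the spectral risk measures covered in the course) suitable for pricing and capital requirements.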
FUNDAMENTAL LIMITS OF DEEP NEURAL NETWORK LEARNING – Helmut Bölcskei, ETH Zurich
SCHEDULE:
- Thursday, 6 January 2022, from 2 PM to 4:30 PM
- Tuesday, 11 January 2022, from 3 PM to 5:30 PM
- Thursday, 13 January 2022, from 2 PM to 4:30 PM
- Tuesday, 18 January 2022, from 3 PM to 5:30 PM
SUMMARY:
This short course develops the fundamental limits of deep neural network learning from first principles by characterizing what is possible when no constraints are imposed on the learning algorithm or on the amount of training data. Concretely, we consider Kolmogorov-optimal approximation through deep neural networks, with the guiding theme being the relation between the complexity of the function (class) to be approximated and the complexity of the approximating network, measured in terms of connectivity and the memory required to store the network topology and the associated quantized weights. The theory we develop educes remarkable universality properties of deep networks. Specifically, deep networks are optimal approximants for markedly different function classes, such as affine (i.e., wavelet-like) systems and Weyl-Heisenberg systems. This universality is afforded by a concurrent invariance property of deep networks to time shifts, scalings, and frequency shifts. In addition, deep networks achieve exponential approximation accuracy (i.e., the approximation error decays exponentially in the number of non-zero weights in the network) for the multiplication operation, polynomials, sinusoidal functions, certain smooth functions, and even one-dimensional oscillatory textures and fractal functions such as the Weierstrass function; for the latter two, no approximation methods achieving exponential accuracy were previously known. We also show that, in the approximation of sufficiently smooth functions, finite-width deep networks require strictly smaller connectivity than finite-depth wide networks.
The mathematical concepts forming the basis of this theory, namely metric entropy, linear and nonlinear approximation theory, best M-term approximation, and the theory of frames, will all be developed in the course.
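One way to make the exponential-accuracy claim concrete is the well-known sawtooth construction for squaring (due to Yarotsky), sketched below in plain Python as an illustration; the function names and the depth parameter m are our own, not from the course. A depth-m composition of ReLU "teeth" approximates x² on [0, 1] with uniform error 4^{-(m+1)}, i.e., the error decays exponentially in the depth (and hence in the number of non-zero weights).

```python
# Illustrative sketch: deep ReLU networks approximate x^2 on [0, 1]
# with error decaying exponentially in depth (sawtooth construction).

def relu(x):
    return max(x, 0.0)

def hat(x):
    # The "tooth" h(x) = 2*relu(x) - 4*relu(x - 1/2) + 2*relu(x - 1),
    # realizable by a single width-3 ReLU layer; h maps [0, 1] onto [0, 1].
    return 2 * relu(x) - 4 * relu(x - 0.5) + 2 * relu(x - 1.0)

def square_approx(x, depth):
    # f_m(x) = x - sum_{s=1}^m h^{(s)}(x) / 4^s interpolates x^2 at the
    # points k / 2^m, giving uniform error 4^{-(m+1)} on [0, 1].
    out, tooth = x, x
    for s in range(1, depth + 1):
        tooth = hat(tooth)          # h composed with itself s times
        out -= tooth / 4 ** s
    return out

m = 8
err = max(abs(square_approx(i / 1000, m) - (i / 1000) ** 2) for i in range(1001))
print(err <= 4 ** -(m + 1))  # True: error stays below 4^{-9} ≈ 3.8e-6
```

Since multiplication follows from squaring via xy = ((x + y)² − x² − y²)/2, this single construction underpins the exponential approximation rates for polynomials and smooth functions mentioned in the summary.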
LITERATURE:
https://www.mins.ee.ethz.ch/pubs/p/deep-it-2019
https://www.mins.ee.ethz.ch/pubs/p/deep-approx-18
https://www.mins.ee.ethz.ch/pubs/p/frameschapter