ANR Grants


The French National Research Agency (ANR)

The French National Research Agency (ANR) is a public administrative institution under the authority of the French Ministry of Higher Education, Research and Innovation. The agency funds project-based research carried out by public operators cooperating with each other or with private companies.
The Agency’s missions, defined in the decree of 1 August 2006 amended on 24 March 2014, are:

  • To fund and promote the development of basic and targeted research, technological innovation, technology transfer and public-private partnerships
  • To implement the Work Programme approved by the French Minister of Research, following consultation with the supervisory ministers of France’s research bodies and public higher education institutions
  • To manage major government investment programmes in the fields of higher education and research and to oversee their implementation
  • To strengthen scientific cooperation across Europe and worldwide by aligning its Work Programme with European and international initiatives
  • To analyse trends in the research offering and assess the impact of the funding it allocates on scientific output in France

ANR Funding

Research at the CREST laboratory benefits from the crucial financial support of the French National Research Agency (ANR). A major player in the French research landscape, the ANR funds a variety of innovative projects within CREST that push the boundaries of knowledge in diverse fields. Browse our directory of ANR-supported projects to see how strategic funding catalyzes scientific breakthroughs.

List of CREST’s ongoing ANR projects

Author(s)
Geoffrey Barrows

Year
2022/2026

Submission summary
A large body of research indicates that air pollution affects human health and productivity (see Graff Zivin & Neidell (2013) for a review). If these pollution-induced health shocks adversely affect labor productivity, then standard microeconomic theory suggests they should increase costs and prices, and lower gross output, revenues, profits, wages, consumer surplus, and total social welfare, depending on structural elements of supply and demand. While there exists a nascent literature that aims to connect pollution-induced productivity shocks to economic outcomes (see for example Fu et al., 2017 and Dechezleprêtre et al., 2019), there is still very little work on the effect of these shocks on the operations of firms. To what extent do firms’ costs depend on local air pollution concentrations? Do firms pass the cost of pollution shocks on to consumers and workers? And if so, do these costs cross national boundaries? The goal of PRODPOLU is to link geolocalized data on French manufacturing plants with detailed information on plant- and firm-level outcomes and high-spatial-resolution pollution data to study, for the first time, the productivity, price, wage, and output effects of air pollution in a unified empirical framework.

URL
https://anr.fr/Project-ANR-22-CE26-0008

Author(s)
Anna Korba

Year
2022/2025

Submission summary
An important problem in machine learning and computational statistics is to sample from an intractable target distribution. In Bayesian inference for instance, the latter corresponds to the posterior distribution of the parameters, which is known only up to an intractable normalisation constant, and is needed for predictive inference. In deep learning, optimizing the parameters of a big neural network can be seen as the search for an optimal distribution over the parameters of the network.
This sampling problem can be cast as the optimization of a dissimilarity functional (the loss) over the space of probability measures. As in optimization, a natural idea is to start from an initial distribution and apply a descent scheme to this problem. In particular, one can leverage the geometry of optimal transport and consider Wasserstein gradient flows, which define continuous paths of probability distributions that decrease the loss functional. Different algorithms to approximate the target distribution result from the choice of a loss functional and of a time and space discretization, and in practice amount to the simulation of interacting particle systems. This optimization point of view has recently led to new sampling algorithms, but has also shed light on the analysis of existing schemes in Bayesian inference and neural network optimization.
However, many theoretical and practical aspects of these approaches remain unclear. First, their non-asymptotic properties, which quantify the quality of the approximate distribution at a finite time and for a finite number of particles, are not well understood. Second, their convergence is open when the target is not log-concave (the analogue of the non-convex optimization setting). Motivated by the machine learning applications mentioned above, the goal of this project is to investigate these questions by leveraging recent techniques from the optimization, optimal transport, and partial differential equations literatures.
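The descent viewpoint above can be made concrete with a minimal, purely illustrative sketch (our example, not code from the project): the unadjusted Langevin algorithm, a particle discretization of the Wasserstein gradient flow of the KL divergence, here with a standard Gaussian as a stand-in target.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical target: standard Gaussian, so grad log p(x) = -x.
def grad_log_target(x):
    return -x

def langevin_particles(n_particles=2000, n_steps=2000, step=0.01):
    """Unadjusted Langevin algorithm: a time discretization of the
    Wasserstein gradient flow of KL(. || target), run as an
    interacting-free particle system."""
    # Initial distribution deliberately far from the target.
    x = rng.normal(loc=5.0, scale=0.1, size=n_particles)
    for _ in range(n_steps):
        noise = rng.normal(size=n_particles)
        # Gradient step on log-density plus diffusion term.
        x = x + step * grad_log_target(x) + np.sqrt(2 * step) * noise
    return x

samples = langevin_particles()
```

After enough steps, the empirical distribution of the particles approximates the target; the finite-time, finite-particle error is exactly the kind of non-asymptotic quantity the project studies.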

URL
https://anr.fr/Project-ANR-22-CE23-0030

Author(s)
Victor-Emmanuel Brunel

Year
2022/2024

Submission summary
Riemannian data are ubiquitous in modern statistics and machine learning: low-rank matrix completion, dictionary learning, matrix factorization, computer vision, shape statistics, optimal transport, etc. Moreover, in an era where extraordinarily rich, yet also complex and heterogeneous datasets become available to practitioners, it is becoming urgent to develop tools for extracting information that are accurate, robust, computationally tractable and adapted to the geometry of the data. By exploring and further developing the interplay between geometry, curvature, probability theory and data analysis, DataGC will set the grounds for a non-asymptotic and non-parametric theory of location estimation on Riemannian manifolds. We will focus on barycenters, centrality regions and supports of probability distributions, which are paramount in descriptive statistics, data visualization and statistical inference.
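As a toy illustration of location estimation on a manifold (our own sketch, not the project's code), the Fréchet mean — the barycenter minimizing the sum of squared geodesic distances — of points on the unit circle can be computed by Riemannian gradient descent: average the log-map images of the data at the current estimate, then move along that tangent vector.

```python
import numpy as np

def wrap(theta):
    """Wrap an angle to (-pi, pi]: the log map on the circle."""
    return (theta + np.pi) % (2 * np.pi) - np.pi

def frechet_mean_circle(angles, step=1.0, n_iter=100):
    """Riemannian gradient descent for the circular Fréchet mean."""
    m = angles[0]
    for _ in range(n_iter):
        # Average of log-map images = negative Riemannian gradient (up to 2).
        tangent = np.mean(wrap(angles - m))
        m = wrap(m + step * tangent)  # exponential map: move along geodesic
    return m

points = np.array([0.1, 0.3, -0.2, 0.4])
mean = frechet_mean_circle(points)  # for clustered data, the angular mean
```

For data concentrated on one arc this coincides with the ordinary average of angles; curvature effects, which DataGC studies in general, only bite when the data spread around the manifold.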

URL
https://anr.fr/Project-ANR-22-ERCS-0015

Author(s)
Etienne Ollion

Year
2021/2025

Submission summary
In a context of destabilization of the public space linked to digital technology, the MEDIALEX project aims to renew the understanding of the dynamics of influence between parliamentary, media and public agendas. Its main objective is to better understand how parliamentarians, the media and the public influence each other in the definition of priority topics for public debate. To achieve this, the project intends to develop new computational methods to track statements in different layers of the public space. By bringing together sociologists, political scientists, computer scientists and computational linguistics researchers, this interdisciplinary project aims to (1) understand the dynamics of influence between the agendas of parliamentarians, the media and the public, (2) develop original methods to identify media events and reported utterances in large heterogeneous corpora, and (3) study the effects of the digitization of the public space on the legislator’s ability to impose the topics of public discussion.

The project mobilizes methods from the computational social sciences, taking advantage of new analytical frameworks that reconcile social science approaches with new computational tools. The methodological challenge of the project consists in developing natural language processing methods that allow the identification of themes crossing the public debate in a more refined way and on a larger scale. These methods concern the identification of events on Twitter, on the one hand, and the identification of reported discourse in voluminous and heterogeneous corpora (newspapers, television, radio, parliamentary questions and debates, Twitter and Facebook), on the other.

The scientific program of MEDIALEX is divided into four work packages. The first gathers the tasks of coordination, corpus management and dissemination (WP1). The three other work packages explore the influence of parliamentary, media and public agendas in three complementary ways. WP2 considers influence in a structural way, aiming to identify over the long term which large category of actors (parliamentarians, media, public) manages to impose priority topics of attention on the others. WP3 considers influence at a finer scale, by studying the mechanisms of circulation of discourses between parliamentary, media and public spaces. Finally, WP4 focuses on the interpretations that the media and the public produce from parliamentary work.

MEDIALEX’s purpose is essentially scientific, but given its object and the techniques implemented, the project also aims to contribute to public debate. Through “datasprints” and “workshops”, the project intends to bring actors and experts together around new ways of representing the public space and political activity. MEDIALEX’s approach is thus in line with the major concerns of our societies regarding the role of Parliament, the production of public policies, the role of the media and the renewal of the forms of democracy.

URL
https://anr.fr/Project-ANR-21-CE38-0016

Author(s)
Anna Simoni

Year
2021/2025

Submission summary
Economic models for policy evaluation and labor markets often imply restrictions on observable and unobservable quantities, and on a structural parameter, that are written in the form of moment conditions (MCs). The structural parameter of the MC model has a causal interpretation, and the social planner wants to know its value in order to decide which policy to undertake. This project develops Bayesian causal inference and prediction for this type of conditional and unconditional MC models while making minimal assumptions on the data distribution. Our procedure is based on the Exponentially Tilted Empirical Likelihood, and we will show it is valid for both Bayesian and frequentist inference. Estimating causal effects is important in socio-economic situations of scarce resources, in order to know the best treatment to administer to achieve a given goal. In addition to theoretical econometric tools, we will provide the computational tools to easily implement our procedure.

URL
https://anr.fr/Project-ANR-21-CE26-0003

Author(s)
Caroline Hillairet

Year
2021/2025

Submission summary
In the presence of abrupt (financial crises or epidemics) or long-term (environmental or demographic) changes, one needs dynamic tools to detect such changes from observable data and to re-estimate models and risk quantification parameters based on a dynamic and long-term view. Classical decision theory relies on a backward approach with given deterministic utility criteria. The drawbacks are twofold: first, it does not incorporate any changes in the agents’ preferences, or any uncertain evolution of the environment variables. Furthermore, it leads to time-inconsistency and to optimal choices that depend on a fixed time horizon related to the optimization problem. The framework of dynamic utilities addresses the issues raised above by taking into account various risks and by proposing long-term, time-coherent policies. Dynamic utilities allow us to define adaptive strategies adjusted to the information flow in a non-stationary and uncertain environment. The dynamic preferences framework therefore provides a general and flexible way to evaluate the impacts of short- and long-term changes and to combine various risk parameters. Members of the team have worked on this notion of dynamic utilities for several years and are now recognized as experts in this field.
In a complex and random environment, decision rules cannot be based on too simple criteria, and some economic approximations lead to optimal choices that are based on linear, or at best quadratic, cost-benefit analysis over time, and which can result in an underestimation of extreme risks. A general stochastic formulation and numerical estimation is useful to question the robustness of the theory.
The aim of this research project consists in proposing efficient numerical methods based on this theoretical framework. The main objectives are optimal detection of tendency changes in the environment, and optimization of economic actors’ decisions using dynamic preference criteria.
1) First, we aim at simulating dynamic utilities, which raises various numerical challenges related to non-linear forward second-order HJB stochastic partial differential equations (SPDEs), for which standard numerical schemes are complex and unstable. We propose different methods for simulating these SPDEs, based on the stochastic characteristics method and on neural networks.
2) We detect the transition point and study extreme scenarios in a context of multivariate risks. One possibility to overcome the short-term view of insurance and financial regulations is to consider hitting probabilities over a long-term or infinite horizon. In a multivariate setting, one quickly faces problems of estimating the dependence structure between risks, as well as heavy computation times. It is non-trivial to detect changes in the risk processes as quickly as possible in the presence of multiple sensors. We develop computing algorithms for hitting probabilities and other risk measures in a multivariate setting and in the presence of changes in the parameters. We also obtain optimal risk mitigation techniques, using numerical methods.
3) We aim at calibrating dynamic utilities, which should be adapted to an evolving environment characterized by multivariate sources of risk. This consists in learning the decision maker’s preferences and predicting her behavior based on an observed sequence of decisions. In the meantime, one also needs to implement advanced statistical tools to calibrate the multivariate stochastic processes governing the environment.
4) We develop robust decision-making tools, for better handling model uncertainty in the worst case, including uncertainties on volatilities and correlations as well as jumps and moral hazard. We aim to study theoretical and numerical aspects for dynamic utilities under model uncertainty. It addresses the issues of moral hazard and ambiguity in model specification as well as in preferences and investment horizon specification.
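To fix ideas on point 2), here is a small, purely illustrative Monte Carlo sketch (our own example with arbitrary parameters, not the project's algorithms): estimating the probability that a correlated two-dimensional Brownian motion hits an upper barrier in either component before a fixed horizon.

```python
import numpy as np

rng = np.random.default_rng(1)

def hitting_probability(barrier=2.0, T=1.0, rho=0.5, n_paths=20000, n_steps=200):
    """Monte Carlo estimate of P(max over [0,T] of either component >= barrier)
    for a 2-D Brownian motion with correlation rho, monitored on a grid."""
    dt = T / n_steps
    # Cholesky factor of the correlation matrix [[1, rho], [rho, 1]].
    chol = np.linalg.cholesky(np.array([[1.0, rho], [rho, 1.0]]))
    x = np.zeros((n_paths, 2))
    hit = np.zeros(n_paths, dtype=bool)
    for _ in range(n_steps):
        z = rng.normal(size=(n_paths, 2)) @ chol.T  # correlated increments
        x += np.sqrt(dt) * z
        hit |= (x >= barrier).any(axis=1)  # record first barrier crossing
    return hit.mean()

p = hitting_probability()
```

The dependence structure (here a single correlation) directly drives the joint hitting probability; with many risk factors, estimating that structure and keeping the computation tractable are the difficulties the project addresses.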

URL
https://anr.fr/Project-ANR-21-CE46-0002

Author(s)
Nicolas Chopin

Year
2021/2025

Submission summary
EPIVASCAGE (EPIdemiology of VASCular AGEing) is a 4-year PRC project aiming to examine the association of baseline vascular ageing and its progression with incident cardiovascular disease (CVD) and mortality in the community. To this end, EPIVASCAGE will conduct deep and non-invasive phenotyping of the vascular ageing of large (carotid artery) and small-to-medium sized arteries (radial artery) and will examine radiomics features in these arterial segments. EPIVASCAGE will rely on the Paris Prospective Study III, an ongoing French community-based prospective study following n=10,157 men and women aged 50-75 years since 2008. A total of 773 CVD events and 473 deaths are expected by the end of EPIVASCAGE in 2025. A budget of 666 k€ is requested from the ANR.

EPIVASCAGE will include 6 work packages (WP). WP1 will be dedicated to the coordination of EPIVASCAGE. In WP2, we will examine the predictive value of already existing and usable structural and functional carotid ageing biomarkers measured at baseline for incident CVD events (n=498 as of June 2020) (manuscript 1). This WP will also be dedicated to the validation of new CVD events, and access to the national health data hub as a complementary source of information is expected to be obtained by month 3. In WP3, we will perform a radiomics analysis on the raw and stored baseline carotid echo-tracking data, which contain images as well as spectral data. The main steps will include data segmentation, extraction of image (texture, shape and gray scale) and spectral data using pre-defined matrices, and then data reduction (clustering methods). We will then examine radiomics signatures and their association with incident CVD events (manuscript 2), together with the joint association of structural/functional carotid ageing biomarkers and radiomics signatures with incident CVD (manuscript 3). WP4 will be dedicated to the second PPS3 physical examination (Examination 2, January 2022 to December 2024, 7,000 participants anticipated, 75% participation rate expected) and to data quality assessment. Carotid echo-tracking will be performed as per the baseline assessment, and an ultrasound of the radial artery will be newly added to assess vascular ageing of medium-to-small sized arteries. WP5 will be dedicated to carotid ageing progression using carotid ultrasound data measured at baseline and at Examination 2. We will then identify actionable determinants of carotid ageing progression for the structural/functional biomarkers (manuscript 4) and for the radiomics features (delta radiomics, manuscript 5). WP6 will be dedicated to the vascular ageing of the small-to-medium sized radial artery using data collected at Examination 2. Structural and functional biomarkers together with radiomics features will be extracted.
Actionable determinants of structural/functional biomarkers (manuscript 6) and of the radiomics signatures (manuscript 7) will then be determined.
EPIVASCAGE will be led by JP Empana, INSERM Research Director, team leader (U970, Team 4, Integrative Epidemiology of Cardiovascular Diseases) and PI of the Paris Prospective Study III, and his team. EPIVASCAGE involves a multidisciplinary team of experts in CVD epidemiology (Partner 1, P1, Empana’s team), arterial wall mechanics (P2, P Boutouyrie, RM Bruno and F Poli, INSERM U970, Team 7), high-dimensional statistics (P3, N Chopin and Y Youssfi, Centre for Research in Economics and Statistics, CREST) and ultrasound imaging signal processing (P4, E Bianchini and F Faita, Institute of Clinical Physiology, University of Pisa, Italy). Strong and established collaborative relationships already exist between team members.

The findings from EPIVASCAGE may support a new paradigm shift in the primary prevention of CVD by suggesting that large and small-medium sized arteries may be new and complementary targets for the primary prevention of CVD.

URL
https://anr.fr/Project-ANR-21-CE36-0010

Author(s)
Jean-Michel Zakoian

Year
2021/2025

Submission summary
The growing use of artificial intelligence and machine learning (ML) by banks and fintech companies is one of the most significant technological changes in the financial industry over the past decades. These new technologies hold great promise for the future of financial services, but also raise new challenges. In this context, the MLEforRisk project aims to provide a better understanding of the usefulness of combining econometrics and ML for financial risk measurement. It will deliver a rigorous study of the benefits and limitations of these two approaches in the field of risk management, which is the core business of the financial industry. MLEforRisk is a multidisciplinary project in the fields of finance and financial econometrics which brings together junior and senior researchers in management, economics, applied mathematics, and data science.
The project has five methodological objectives related to credit, market, and liquidity risks. In the context of credit risk, ML methods are known to provide good classification performance. However, these methods are often black boxes, which is particularly problematic for both clients and regulators. Thus, our objective is to develop hybrid approaches to credit risk modeling that combine econometrics and ML to overcome the trade-off between interpretability and predictive performance. At the same time, the use of ML in the field of credit risk has led to a debate on the potential discrimination biases that these algorithms could generate. Here, our objective is to develop statistical methods to test the algorithmic fairness of credit risk models and to mitigate these biases.
In the area of market risk, the project aims to combine ML techniques and advanced econometric modeling to improve the estimation of conditional risk measures associated with portfolio returns. Our objective is to propose new hybrid approaches for modeling the conditional variance matrix of returns or its inverse, called the precision matrix. Since these risk measures are the key input of trading strategies, the accuracy of their estimation is essential for the asset management industry. These estimation methods will be designed with large portfolios in mind, for which the number of assets can far exceed the number of time observations available to estimate the moments. A second objective is to take into account the asymmetry of the conditional distribution of returns when modeling conditional risk with ML methods.
Concerning liquidity risk, we observe that the development of alternative market indices and factorial investment significantly modify the dynamics of traded volumes on the markets by increasing dependencies and network effects. Our objective is to take these effects into account when measuring liquidity risk, while reducing the dimension of the parameter set used in the network with ML methods.
The MLEforRisk project aims at creating a doctoral training network for young researchers specialized in financial econometrics. It also aims to promote reproducible research. All code and data produced within the project will be archived on RunMyCode, and the reproducibility of the numerical results will be certified by cascad, the first certification agency for scientific code and data.

URL
https://anr.fr/Project-ANR-21-CE26-0007

Author(s)
Edouard Challe

Year
2021/2024

Submission summary
Labor income risks, namely unemployment risk and wage risk, are a major concern for many workers, essentially because they are imperfectly insured (that is, insurance markets against idiosyncratic labor-income shocks are “incomplete”). As a result, these risks generate significant ex post inequalities across agents as well as an inefficient precautionary motive for saving, whose instability over the business cycle may greatly amplify economic crises. This source of inequality and aggregate instability is a recurrent phenomenon, one dramatically illustrated by the ongoing worldwide economic collapse. The purpose of the project is to (i) quantify how aggregate shocks are amplified under incomplete markets; (ii) clarify the transmission channels of alternative economic policies in these circumstances; and (iii) design macroeconomic policies (monetary policy, fiscal policy, labor-market policies, etc.) capable of optimally stabilizing economic crises in the presence of uninsured labor-income risk.

The project will be composed of two main parts: one that will focus on understanding the transmission mechanisms of aggregate shocks and policies under incomplete markets; and another that will analyze the optimality of macroeconomic policies (i.e., monetary, fiscal, tax, labor-market policies) in response to aggregate shocks. The focus will be on the way different types of aggregate shocks alter the amount of idiosyncratic risk and the rise in inequality faced by households. Given these propagation mechanisms, we will investigate the transmission and the optimality of alternative macro and insurance policies following sharp and brutal declines in economic activity, such as those triggered worldwide by the 2008 financial crisis or the current Covid-19 crisis. Both aspects of the study, the positive one and the normative one, will require the development of new models and methods and will be divided into several subprojects involving members of the research team and possibly outside co-authors.

To sum up, the purpose of the overall project is to revisit the transmission channel and optimality of a variety of policy instruments, under the assumption that individual risks are uninsured and households are heterogeneous. These policy tools include:
• conventional monetary policy (i.e., changes in nominal interest rates by the central bank);
• unconventional monetary policy (i.e., forward guidance about future policy rates; large-scale asset purchases; money-financed fiscal stimulus; etc.);
• transitory expansions in government spending or reductions in taxes;
• public debt policies (i.e., optimal public debt in the presence of liquidity demand);
• changes in the level, cyclicality and duration of unemployment benefit payments and short-time work arrangements;
• changes in the level, cyclicality and persistence of tariffs on traded goods.
This is a thriving area of macroeconomics in which several teams are currently competing worldwide. We aim to be one of these teams and would like to rely on the support of the ANR to achieve this. We stress that we will pay special attention to the euro area, which is currently facing a number of macroeconomic policy challenges. Indeed, in the euro area monetary policy is centralized but constrained (by the zero lower bound on nominal interest rates), while fiscal policy is decentralized and, overall, non-cooperative. Unemployment insurance is also decentralized, hence with no cross-country risk sharing. Our project will thus help better understand how monetary and fiscal policies should be designed in a context where the institutional features of the euro area may aggravate the lack of insurance across households.

URL
https://anr.fr/Project-ANR-20-CE26-0018

Author(s)
Ivaylo Petev

Year
2020/2023

Submission summary
The motherhood penalty on wages and employment is a major source of gender inequality in the labour market, whose reduction is a stated aim of the European Parliament and the Council on the implementation of the principle of equal opportunities and equal treatment for men and women. We propose studying its causes, using large administrative data for France and Germany that allow us to link employers to employees and look at micro mechanisms in a comparative setup. Specifically, we aim to jointly study the role of firms, human capital depreciation and gender norms in shaping the labour market effects of children in different institutional and policy contexts.
Our research project has two main objectives: (1) synchronizing and harmonizing high-quality administrative data that exist in relatively similar forms in Germany and France, and preparing replication tools for the scientific community; (2) using the resulting database to compare both countries with regard to family-related employment interruptions and subsequent maternal career and income developments.
As there are almost no registry datasets prepared for comparative cross-national research, the resulting data will be of high value to the research community. Comparatively analysing the drivers of the motherhood wage penalty in France and Germany illustrates the potential of this data and meaningfully contributes to the literature on gender inequality in the labour market.
Registry data allows us to be the first to look at how mothers sort into firms in different countries and to thus directly compare if labour market specific mechanisms through which childbirth affects economic gender inequalities differ according to national context.
France and Germany represent a compelling case study, as the two countries followed different paths in how fast they integrated women into the labour force, in implementing family policies and in supporting dual-earner couples. France provides extensive all-day childcare services, enabling women to re-enter the labour market quickly, and has a considerably lower motherhood wage and employment penalty than Germany.
To understand the mechanisms creating the motherhood penalty, we use linked employer-employee data to estimate exact employment and wage penalties for children on a year-by-year basis after childbirth. We aim to look at the extent to which wage and employment reductions are the result of mothers sorting into more low-wage, part-time oriented and gender-segregated firms. We expect that firm effects and maternity-induced gender-workplace segregation matter more in producing a high motherhood penalty in Germany, where a long detachment from the labour market and part-time work are more common. Finally, we aim to look at how local differences in gender norms affect the wage penalty, expecting to find a greater influence on careers in Germany, whose institutional setting makes returning to full-time work after birth less of a societal norm.

URL
https://anr.fr/Project-ANR-20-FRAL-0007

Author(s)
Pierre Boyer

Year
2020/2024

Submission summary
Tax reforms: Finding the balance between efficiency and political feasibility

Questions linked to the design and implementation of redistributive tax policies have occupied a growing position on the public agenda over recent years. Moreover, the fiscal pressures brought on by the current coronavirus crisis will ensure that these issues maintain considerable political significance for years to come.

New design of redistributive tax policies
The design of redistributive tax policies is an evergreen in the public discourse. Research on these questions has led to a well-developed “theory of optimal taxation” with seminal contributions by Mirrlees (1971), Piketty (1997), Diamond (1998) and Saez (2001; 2002). These contributions have in common that they characterize an optimal tax system that takes account of the behavioral responses of taxpayers and the public sector’s budget constraint. This theory is institution-free, which is both a strength and a weakness. It is a strength as it delivers clarity on how incentive effects shape welfare-maximizing taxes. It is a weakness because incentive effects are not the only forces that are relevant for the design of tax policies.
The research in this proposal develops a conceptual framework to analyze “Tax reforms and revolts in democracies”. It delivers a theory that takes account of an essential constraint that emerges in a democracy: tax policies have to find sufficient political support (i.e., be politically feasible), and this has implications for the design of tax systems (see, e.g., Martin and Gabay, 2018; Passarelli and Tabellini, 2017). The gilets jaunes demonstrations are a reminder that large mobilizations can lead to the cancellation of announced tax reforms (see Boyer et al. (2020a) for a descriptive and geographical analysis of the determinants of the movement). The current coronavirus crisis will put unprecedented pressure on public finances. Raising revenues will be a priority once the virus recedes, and political feasibility and fairness issues will be crucial. Indeed, tax systems have been redesigned after major events such as the World Wars, and our ability to take these constraints into account will be severely tested (see Scheve and Stasavage, 2016). Revolts could occur if tax reforms are not perceived to satisfy fairness and political constraints (on fiscal revolts after World War I see, e.g., Delalande, 2009).

The approach in “Tax reforms and revolts in democracies” will open new directions to scholars in the social sciences, both theoretically and empirically. It makes it possible to identify reforms that are appealing from a social welfare perspective and, moreover, politically feasible.

URL
https://anr.fr/Project-ANR-20-CE41-0013

Author(s)
Peter Tankov

Year
2019/2023

Submission summary
The European energy sector is undergoing a major transition towards a carbon-free system. According to the International Energy Agency, to limit the global temperature increase to 1.75°C by 2100 (the midpoint of the Paris Agreement range), the energy sector must reach carbon neutrality by 2060. This can only be achieved through massive deployment of renewable energy, at a scale not sustainable within the present structure of electricity markets, networks and incentives. On the one hand, such massive deployment will put excessive strain on state budgets. On the other hand, the near-zero marginal cost of electricity from renewable sources will drive down electricity prices, pushing conventional producers vital for system stability out of the market.

At the same time, new market mechanisms are being introduced in Europe to guarantee network security under conditions of increased renewable penetration and the multiplication of new electricity use patterns. These new opportunities create feedback effects on market prices, affecting the business models of the agents and, ultimately, network security and renewable penetration.

In this context, the EcoREES project is a multidisciplinary research partnership involving economists, applied mathematicians, statisticians and research engineers from industry. The consortium includes the Centre de Recherche en Economie et Statistique of ENSAE (CREST), the Laboratoire d’Economie de Dauphine (LEDa), the Centre de Mathématiques Appliquées of Ecole Polytechnique (CMAP), the Laboratoire de Signaux et Systèmes of Centrale-Supélec (L2S) and the OSIRIS research department of Electricité de France. Our aim is to build game-theoretical models for simulating future electricity markets; to use these models to understand the effect of market structure and incentives on the behavior of agents; and to find optimal designs of market mechanisms and optimal policies of the regulator achieving specific objectives in terms of CO2 emissions, renewable penetration and system stability in the future, highly renewable, European energy system.

To this end we shall (i) evaluate the business models and determine the strategies of individual agents in the new multi-market environment; (ii) understand price formation as the result of interaction between agents; (iii) validate the model predictions through empirical analysis of electricity markets; and (iv) create the first building blocks of an integrated modeling framework for simulating prices and energy production, depending on the structure of markets and incentives. Unlike models from agent-based computational economics, which rely on full-scale simulations of the economic system, we will adopt the semi-analytic framework of mean-field games, sufficiently rich to allow for realistic interactions between agents yet sufficiently tractable to allow for mathematical analysis of the model and fast computations.
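To give a flavor of the mean-field approach described above, the following is a minimal, self-contained sketch, with entirely hypothetical numbers and a stylized model that is not the project's: a continuum of identical consumers each chooses one of a few time slots for electricity use, the cost of a slot being its base price plus a congestion term proportional to the mass of agents choosing it. A mean-field equilibrium is a distribution of choices that is a (smoothed) best response to itself, found here by damped fixed-point iteration.

```python
import math

# Hypothetical exogenous base prices for 4 time slots (illustrative only).
base_price = [1.0, 0.5, 0.8, 0.3]
alpha = 2.0  # congestion sensitivity: cost of a slot rises with its crowding
K = len(base_price)

def best_response(mass, temperature=0.5):
    """Smoothed (logit) best response to the current crowd distribution."""
    costs = [p + alpha * m for p, m in zip(base_price, mass)]
    weights = [math.exp(-c / temperature) for c in costs]
    total = sum(weights)
    return [w / total for w in weights]

# Damped fixed-point iteration: an equilibrium distribution reproduces itself.
mass = [1.0 / K] * K
for _ in range(200):
    br = best_response(mass)
    mass = [0.5 * m + 0.5 * b for m, b in zip(mass, br)]

# Congestion partially equalizes realized costs across slots.
equilibrium_costs = [p + alpha * m for p, m in zip(base_price, mass)]
```

The real project works with far richer dynamics (probabilistic forecasts, multiple markets, entry/exit decisions), but the same fixed-point logic, agents optimizing against the aggregate behavior of the population, is the core of the mean-field-game framework.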

The main innovations of the proposed project are: (i) taking into account the very large dimension of the optimization problems faced by players in the energy industry (probabilistic forecasts, multiple markets), in particular through machine learning approaches; (ii) an integrated approach linking electricity price, renewable penetration and CO2 emission scenarios to the operational strategies and entry/exit decisions of individual agents through game-theoretical models (mean-field games) that take the interactions between players into account; (iii) a systematic approach to defining optimal subsidies and market designs via principal-agent models; and (iv) an econometric study of the effect of existing subsidies and market designs on industry dynamics.

The project results will be disseminated to energy sector stakeholders and policy makers via policy papers and specialized workshops.

URL
https://anr.fr/Project-ANR-19-CE05-0042

Author(s)
Victor-Emmanuel Brunel

Year
2019/2023

Submission summary
ADDS: Algorithms for Multi-Dimensional Data via Sketches

Massive data presents particularly vexing challenges for algorithmic processing: not only are most of the commonly encountered problems NP-hard, but one also cannot afford to spend much running time or space. Even worse, for approximation algorithms, the dependency on the dimension is often exponential. Overcoming these challenges requires the development of a new generation of algorithms and techniques.


One key to effectively addressing these challenges is the notion of sketches: extracting a small (ideally constant-sized) subset of the input data that captures, approximately with respect to a given parameter epsilon, key aspects of the entire data. Given a family of optimization problems, the goal is to construct sketches whose size is independent of the size of the input data, while minimizing the dependence on the dimension and on the approximation parameter epsilon.

Sketches are related to more general succinct approximations of data, such as epsilon-nets and epsilon-approximations. One example of a sketch is a coreset: a subset such that solving the given problem on the coreset yields an approximate solution to the problem on the entire data.
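The coreset idea above can be illustrated with a deliberately simple toy example, using uniform random sampling as the sketch (the simplest possible construction; the sketches studied in the project rely on much stronger combinatorial tools such as epsilon-approximations and importance sampling): solve the 1-dimensional 1-median problem on a fixed-size sample and compare its cost, on the full data, with the optimum.

```python
import random
import statistics

random.seed(0)
data = [random.gauss(0.0, 1.0) for _ in range(100_000)]

# Sketch: a uniform random sample whose size is fixed in advance,
# independent of the input size.
sketch = random.sample(data, 500)

def cost(center, points):
    """1-median objective: total distance from the points to the center."""
    return sum(abs(x - center) for x in points)

exact = statistics.median(data)     # the median minimizes the 1-median cost
approx = statistics.median(sketch)  # solve the same problem on the sketch only

# Evaluating both solutions on the FULL data: the sketch-based solution
# is near-optimal despite using 0.5% of the input.
ratio = cost(approx, data) / cost(exact, data)
```

Here `ratio` stays very close to 1, the hallmark of a coreset: the small subset preserves enough structure that optimizing over it approximately optimizes over everything.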

While great progress has been achieved for non-geometric problems, for many fundamental problems on geometric data the construction and existence of near-optimal sketches remain open. Our research is divided into three parts, requiring expertise in statistics, computational geometry, learning, combinatorics, and algorithms. First, we consider the combinatorial properties of geometric data that are relevant to building compact sketches. Second, we consider the time and space complexities of constructing accurate sketches of data in high dimensions, based on the combinatorial and geometric understanding. Finally, we show how to use the small sketches in order to improve the accuracy and running time of optimization algorithms.

URL
https://anr.fr/Project-ANR-19-CE48-0005

Author(s)
Vianney Perchet

Year
2019/2023

Submission summary
Beyond Online Learning for better Decision making
There are currently three main barriers to a broader development of online learning:
1) The classical "one step, one decision, one reward" paradigm is unfit.
2) Optimality is defined with respect to worst-case generic lower bounds, and the mechanics behind online learning are not fully understood.
3) Algorithms were designed for environments that are neither strategic nor interactive.

General Objectives
Reactive ML algorithms adapt to data-generating processes, typically do not require large computational power and can, moreover, be translated into offline (as opposed to online) algorithms if needed. Introduced in the 1930s in the context of clinical trials, online ML algorithms have attracted considerable theoretical interest over the last 15 years because of their applications to the optimization of recommender systems, click-through rates and planning in congested networks, to name just a few. In practice, however, such algorithms are not used as much as they should be, because the traditional low-level modelling assumptions they rest on turn out to be inappropriate.
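The classical "one step, one decision, one reward" setting mentioned above can be made concrete with a minimal toy sketch of the standard UCB1 bandit algorithm; the two "arms" and their click probabilities are entirely hypothetical (think of two competing recommendations whose click-through rates are unknown to the algorithm).

```python
import math
import random

random.seed(1)
click_prob = [0.3, 0.6]   # unknown Bernoulli reward probabilities
counts = [0, 0]           # number of times each arm was played
sums = [0.0, 0.0]         # total reward collected on each arm

T = 5000
for t in range(1, T + 1):
    if t <= len(click_prob):
        arm = t - 1  # play each arm once to initialize
    else:
        # UCB1: empirical mean plus an optimism bonus that shrinks
        # as an arm is played more often.
        ucb = [sums[a] / counts[a] + math.sqrt(2 * math.log(t) / counts[a])
               for a in range(len(click_prob))]
        arm = ucb.index(max(ucb))
    reward = 1.0 if random.random() < click_prob[arm] else 0.0
    counts[arm] += 1
    sums[arm] += reward

# After T rounds, the better arm has been played far more often, while the
# worse arm was still sampled enough to estimate its value.
```

Each round is exactly one step, one decision, one reward; the project's point is that many modern applications (delayed feedback, strategic users, interacting learners) do not fit this template, which is what the extensions aim to address.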

Instead of arbitrarily complicating and generalising a framework unfit for potential applications, we will tackle this problem from another perspective: we will seek a better understanding of the simple original problem and extend it in the appropriate directions.

URL
https://anr.fr/Project-ANR-19-CE23-0026

Author(s)
Pierre Boyer

Year
2020/2024

Submission summary
Middle classes, taxation and democracy in a globalized world
This research project mobilizes the instruments of political economy and optimal tax theory to shed light on the link between inequalities, migration and democracy.

Redistribution, inequalities and populism
Part 1 explores methods of wealth redistribution, with a particular focus on the tax burden weighing on the middle classes. Part 2 aims to estimate the impact of the evolution of the socio-fiscal system, both real and as relayed by the media, on the far-right vote. This analysis will shed light on the recent rise of the far right among the middle classes in the Western world.

URL
https://anr.fr/Project-ANR-19-CE41-0011