OptInfinite – Efficient infinite-dimensional optimization over measures
Optimization over probability measures has become a powerful approach for solving complex problems that involve probabilistic modeling. Advanced by the PI and collaborators, it extends finite-dimensional optimization to the infinite-dimensional space of probability measures. In particular, this framework provides a principled way to address sampling, i.e., the task of drawing samples from a complex probability distribution, either to approximate it or to generate new data. In Bayesian machine learning, for instance, prediction uncertainty can be modeled by sampling a model's parameters.

Similarly, in generative modeling, sampling is crucial for producing new data such as images or text. Existing methods struggle with complex measures, are difficult to evaluate, and are limited to Euclidean spaces, making them unsuitable for infinite-dimensional ones such as spaces of functions or operators. Moreover, their lack of computational efficiency limits their use in sequential sampling tasks. My goal is to create a unified framework to design and evaluate efficient methods for sampling measures over general spaces.
Central to my approach is the use of tools from optimal transport and information geometry, which will help compare measures and design optimization dynamics. OptInfinite will advance optimization over measures by addressing two key challenges:
- Developing optimization objectives and geometries suited to the space of measures over general (possibly infinite-dimensional) spaces, in order to design tractable schemes and metrics, and
- Learning to solve advanced optimization problems, such as sequences of optimization tasks.
This framework will yield novel sampling methods whose efficiency can be evaluated using optimization tools. It will also make it possible to compare measures and reveal where and how they differ. Ultimately, our work will provide a clear methodology and toolset applicable across multiple domains.
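As one concrete, classical instance of this viewpoint (not the project's own method): the unadjusted Langevin algorithm (ULA) can be read as a time discretization of the Wasserstein gradient flow of the KL divergence KL(· | π) towards a target π. The minimal NumPy sketch below, with illustrative function names and step size, samples from a standard Gaussian this way.

```python
# Minimal sketch: unadjusted Langevin algorithm (ULA), a classical instance
# of sampling seen as optimization over measures -- its continuous-time limit
# is the Wasserstein gradient flow of KL(. | pi) towards the target pi.
import numpy as np

def ula_sample(grad_log_pi, x0, step=1e-2, n_steps=5000, seed=0):
    """ULA iteration: x_{k+1} = x_k + step * grad log pi(x_k) + sqrt(2*step) * noise."""
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float)
    samples = []
    for _ in range(n_steps):
        x = x + step * grad_log_pi(x) + np.sqrt(2 * step) * rng.standard_normal(x.shape)
        samples.append(x.copy())
    return np.array(samples)

# Illustrative target: standard 2D Gaussian, so grad log pi(x) = -x.
samples = ula_sample(lambda x: -x, x0=[3.0, -3.0])
print(samples[-1000:].mean(axis=0))  # close to the target mean (0, 0)
```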
Funded by the European Union (ERC, OptInfinite, 101220742). Views and opinions expressed are however those of the author(s) only and do not necessarily reflect those of the European Union or the European Research Council Executive Agency. Neither the European Union nor the granting authority can be held responsible for them.
Contact

Anna Korba – Principal investigator
Anna Korba is an Assistant Professor at CREST-Groupe ENSAE-ENSAI, ENSAE Paris.
Her main line of research is machine learning. She has been working on kernel methods, optimal transport, optimization, particle systems and preference learning. At the moment, she is particularly interested in sampling and optimization methods.

Paul Caucheteux – PhD candidate
This PhD project investigates flow-based generative modeling on the space of probability measures. It develops both theoretical and practical methodologies to build and analyze flow-driven generative models. Applications and evaluations will focus primarily on visual tasks.

Marguerite Petit–Talamon – PhD candidate
This PhD project investigates variational inference where the parametric family is that of mixtures of Gaussians. This family is expressive enough to approximate complicated posterior distributions, but designing efficient algorithms for it raises technical challenges. Applications include approximating posterior distributions in Bayesian inference, which makes it possible to solve machine learning tasks (regression, classification) while quantifying the uncertainty of the models' predictions.
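For illustration, here is a minimal Monte Carlo sketch (with hypothetical function names, not the algorithm of the paper listed below) of the evidence lower bound (ELBO), the objective that variational inference maximizes, for a mixture of isotropic Gaussians q against an unnormalized target log-density log p.

```python
# Minimal sketch: Monte Carlo estimate of the ELBO for a variational family
# q(x) = sum_j w_j N(x; mu_j, s_j^2 I) of isotropic Gaussians.
import numpy as np

def mixture_logpdf(x, weights, means, stds):
    """log q(x) for a mixture of isotropic Gaussians; x has shape (n, d)."""
    d = x.shape[1]
    diffs = x[:, None, :] - means[None, :, :]              # (n, k, d)
    log_comp = (-0.5 * np.sum(diffs**2, axis=2) / stds[None, :] ** 2
                - 0.5 * d * np.log(2 * np.pi * stds[None, :] ** 2))  # (n, k)
    return np.logaddexp.reduce(np.log(weights)[None, :] + log_comp, axis=1)

def elbo_estimate(log_p, weights, means, stds, n=2000, seed=0):
    """ELBO = E_q[log p(x) - log q(x)], estimated with n samples drawn from q."""
    rng = np.random.default_rng(seed)
    comps = rng.choice(len(weights), size=n, p=weights)    # component indices
    x = means[comps] + stds[comps, None] * rng.standard_normal((n, means.shape[1]))
    return np.mean(log_p(x) - mixture_logpdf(x, weights, means, stds))

# Illustrative check: a symmetric 2-component mixture against a standard
# Gaussian target (the ELBO is at most 0, with equality iff q = p).
log_p = lambda x: -0.5 * np.sum(x**2, axis=1) - 0.5 * x.shape[1] * np.log(2 * np.pi)
print(elbo_estimate(log_p, weights=np.array([0.5, 0.5]),
                    means=np.array([[1.0, 0.0], [-1.0, 0.0]]),
                    stds=np.array([1.0, 1.0])))
```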
Publications
- Variational Inference with Mixtures of Isotropic Gaussians, with Marguerite Petit–Talamon and Marc Lambert, in Advances in Neural Information Processing Systems (NeurIPS), 2025