Talks

Conference presentations, seminars, and workshop talks on optimization, machine learning, and LLM research.

Invited Seminars

Morgan Stanley Machine Learning Research (December 2025) What is the long-run behaviour of SGD? — New York, USA Materials: slides

Inria Argo Team Seminar (December 2025) What is the long-run behaviour of SGD? — Grenoble, France Materials: slides

Université de Nice (December 2024) What is the long-run distribution of SGD? — Nice, France Materials: seminar page, slides

LPSM, Université de Paris (October 2024) What is the long-run distribution of SGD? — Paris, France Materials: seminar page, slides

Thoth Team Seminar, Inria (October 2024) What is the long-run distribution of SGD? — Grenoble, France Materials: seminar page, slides

SMAI MODE Conference (April 2024) Mirror methods: deterministic analysis — Marseille, France Materials: slides

Thoth Team Seminar, Inria (April 2021) Expressive power of invariant and equivariant graph neural networks — Grenoble, France Materials: slides

Conference and Workshop Presentations

ICML 2025 (upcoming) The global convergence time of stochastic gradient descent in non-convex landscapes Materials: poster, paper

ICML 2024 (Montreal, Canada) What is the long-run distribution of stochastic gradient descent? A large deviations analysis Materials: poster, paper

NeurIPS@Paris 2023 (Paris, France) Exact generalization guarantees for (regularized) Wasserstein distributionally robust models Materials: workshop page, slides, paper

FoCM 2023 (Paris, France) Exact generalization guarantees for (regularized) Wasserstein distributionally robust models Materials: conference page, poster, paper

ICCOPT 2022 (Bethlehem, USA) Mirror methods in stochastic settings Materials: slides

Workshop in Erice 2022 (Sicily, Italy) Regularized WDRO and generalization guarantees Materials: workshop page, slides

COLT 2021 (Boulder, USA) The last-iterate convergence rate of optimistic mirror descent in stochastic variational inequalities Materials: slides, poster, paper

MIPT-UGA Workshop 2021 (online) Expressive power of invariant and equivariant graph neural networks Materials: workshop page, slides, paper

AISTATS 2020 (online) A tight and unified analysis of gradient-based methods for a whole spectrum of differentiable games Materials: slides, paper