Wasserstein distributionally robust optimization

Regularization schemes and generalization guarantees for Wasserstein DRO models.

Wasserstein Distributionally Robust Optimization (WDRO) has emerged as a powerful framework for designing models that are robust to uncertainty in the data distribution, with applications to robustness and fairness in machine learning and operations research. Given n training samples, WDRO minimizes the worst-case expected loss over all distributions within a specified Wasserstein radius of the empirical distribution.
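Concretely, writing \(\widehat{P}_n\) for the empirical distribution of the n samples, \(W\) for the Wasserstein distance, and \(f_\theta\) for the loss of a model \(\theta\), the WDRO problem can be sketched as follows (notation ours, for illustration):

```latex
% WDRO: minimize the worst-case expected loss over the Wasserstein
% ball of radius \rho centered at the empirical distribution
\min_{\theta}\;
\sup_{Q \,:\, W(Q,\,\widehat{P}_n)\,\le\,\rho}\;
\mathbb{E}_{\xi\sim Q}\!\left[f_\theta(\xi)\right]
```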

Entropic regularization in WDRO. Although WDRO has been successfully applied in various contexts, it remains computationally challenging and its objective is often poorly conditioned. Inspired by the success of entropic regularization in optimal transport, we proposed regularizing the inner maximization problem of WDRO with an entropic penalty. This yields a smoothed formulation that is easier to solve and to deploy in broader machine learning contexts (ESAIM COCV).
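To give a flavor of why the entropic penalty smooths the problem: by duality, the regularized inner maximum becomes a log-sum-exp of the loss minus the transport cost, which can be estimated by sampling perturbations of each data point. The sketch below is a minimal Monte-Carlo illustration for scalar data with squared-distance cost and a fixed dual multiplier `lam`; it is an assumption-laden toy, not the formulation of the paper or the `skwdro` API.

```python
import numpy as np

def entropic_wdro_objective(loss, xs, lam, rho, eps=0.1, sigma=0.1,
                            n_draws=1000, seed=0):
    """Monte-Carlo sketch of an entropy-smoothed WDRO dual objective
    (scalar data, squared-distance transport cost; illustrative only).

    For each sample x, candidate perturbations z ~ N(x, sigma^2) are drawn
    and the smoothed worst case is a log-mean-exp of
    (loss(z) - lam * (z - x)**2) / eps.  Here `lam` is the dual multiplier
    of the radius constraint; a full solver would minimize over lam >= 0.
    """
    rng = np.random.default_rng(seed)
    total = lam * rho
    for x in xs:
        z = x + sigma * rng.standard_normal(n_draws)
        vals = (loss(z) - lam * (z - x) ** 2) / eps
        m = vals.max()  # shift for a numerically stable log-mean-exp
        total += eps * (m + np.log(np.mean(np.exp(vals - m)))) / len(xs)
    return total

# Toy usage: quadratic loss on two scalar samples.  The smoothed objective
# is a differentiable surrogate for the hard worst-case value.
obj = entropic_wdro_objective(lambda z: z ** 2,
                              xs=np.array([0.0, 1.0]),
                              lam=5.0, rho=0.01)
```

As the regularization strength `eps` tends to zero, the log-mean-exp collapses back to the hard maximum of the unregularized inner problem, at the price of losing smoothness.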

Generalization and robustness guarantees. A key appeal of WDRO is its strong generalization guarantees: the WDRO objective provides an exact upper bound on the true risk when the sample size is large enough. Classical versions of this guarantee, however, suffer from the curse of dimensionality: their sample complexity scales exponentially with the data dimension. We provided new generalization bounds for WDRO that avoid this curse, showing that the WDRO training objective upper-bounds the true risk with high probability for general model classes, even under distribution shifts at inference time, with no exponential dependence on the dimension (NeurIPS 2023, slides).
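Schematically, such an exact-upper-bound guarantee reads as follows, with constants, regularity conditions, and the precise radius regime omitted (notation ours):

```latex
% With high probability, for a radius \rho of order 1/\sqrt{n}
% (independent of the data dimension), simultaneously over models \theta:
\mathbb{E}_{\xi\sim P}\!\left[f_\theta(\xi)\right]
\;\le\;
\sup_{Q \,:\, W(Q,\,\widehat{P}_n)\,\le\,\rho}\;
\mathbb{E}_{\xi\sim Q}\!\left[f_\theta(\xi)\right]
```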

Early iterations were shared at the Erice 2022 workshop (slides), then refined for FOCM 2023 (poster) and NeurIPS@Paris 2023 (slides).

publications

  1. Regularization for Wasserstein distributionally robust optimization
    Waïss Azizian, Franck Iutzeler, and Jérôme Malick
    ESAIM: Control, Optimisation and Calculus of Variations, 2023
  2. Exact generalization guarantees for (regularized) Wasserstein distributionally robust models
    Waïss Azizian, Franck Iutzeler, and Jérôme Malick
    In NeurIPS, 2023
  3. skwdro: a library for Wasserstein distributionally robust machine learning
    Florian Vincent, Waïss Azizian, Franck Iutzeler, and 1 more author
    arXiv: 2410.21231, 2024