
NeurIPS 2025

Bootstrap Your Uncertainty: Adaptive Robust Classification Driven by Optimal-Transport

Conference Paper Main Conference Track Artificial Intelligence · Machine Learning

Abstract

Deep learning models often struggle with distribution shifts between training and deployment environments. Distributionally Robust Optimization (DRO) offers a promising framework by optimizing worst-case performance over a set of candidate distributions, called the \emph{uncertainty set}. However, the efficacy of DRO depends heavily on the design of this uncertainty set, and existing methods often perform suboptimally because their uncertainty sets are inappropriate or inflexible. In this work, we first propose a novel perspective that casts entropy-regularized Wasserstein DRO as a dynamic process of distributional exploration and semantic alignment, both driven by optimal transport (OT). This unified viewpoint yields two new techniques: \emph{semantic calibration}, which bootstraps semantically meaningful transport costs via inverse OT, and \emph{adaptive refinement}, which adjusts the uncertainty set using OT-driven feedback. Together, these components form an exploration-and-feedback system in which the transport costs and the uncertainty set evolve jointly during training, enabling the model to better adapt to potential distribution shifts. Moreover, we provide an in-depth analysis of this adaptive process and prove a theoretical convergence guarantee. Finally, we present experimental results across diverse distribution-shift scenarios, demonstrating that our approach significantly outperforms existing methods and achieves state-of-the-art robustness.
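The abstract's driving mechanism is entropy-regularized optimal transport. The paper's actual algorithm is not reproduced on this page, but a minimal Sinkhorn iteration sketches the entropic OT plan that such methods build on; the histograms, cost matrix, regularization strength `eps`, and iteration count below are illustrative assumptions, not the authors' setup.

```python
import numpy as np

def sinkhorn(a, b, C, eps=0.5, n_iters=500):
    """Entropy-regularized OT plan between histograms a and b under cost C.

    Returns the coupling P with (approximately) row marginals a and
    column marginals b, computed via alternating Sinkhorn scalings.
    """
    K = np.exp(-C / eps)   # Gibbs kernel from the transport cost
    u = np.ones_like(a)
    for _ in range(n_iters):
        v = b / (K.T @ u)  # match column marginals
        u = a / (K @ v)    # match row marginals
    return u[:, None] * K * v[None, :]

# Toy example: two 3-bin histograms on the line, squared-distance cost.
a = np.array([0.5, 0.3, 0.2])
b = np.array([0.2, 0.3, 0.5])
x = np.arange(3.0)
C = (x[:, None] - x[None, :]) ** 2
P = sinkhorn(a, b, C)
```

In an entropy-regularized Wasserstein DRO loop of the kind the abstract describes, a plan like `P` would mediate how probability mass is moved to adversarial distributions, while the cost `C` itself would be learned (here it is fixed by assumption).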

Authors

Keywords

No keywords are indexed for this paper.

Context

Venue
Annual Conference on Neural Information Processing Systems
Archive span
1987-2025
Indexed papers
30776
Paper id
520148354508672154