
NeurIPS 2025

Adaptive Stochastic Coefficients for Accelerating Diffusion Sampling

Conference Paper Main Conference Track Artificial Intelligence · Machine Learning

Abstract

Diffusion-based generative processes, formulated as differential equation solving, frequently trade computational speed against sample quality. Our theoretical investigation of ODE- and SDE-based solvers reveals complementary weaknesses: ODE solvers accumulate irreducible gradient error along deterministic trajectories, while SDE methods suffer from amplified discretization errors when the step budget is limited. Building on this insight, we introduce AdaSDE, a novel single-step SDE solver that aims to unify the efficiency of ODEs with the error resilience of SDEs. Specifically, we introduce a single per-step learnable coefficient, estimated via lightweight distillation, that dynamically regulates the error-correction strength to accelerate diffusion sampling. Notably, our framework can be integrated with existing solvers to enhance their capabilities. Extensive experiments demonstrate state-of-the-art performance: at 5 NFE, AdaSDE achieves FID scores of $4.18$ on CIFAR-10, $8.05$ on FFHQ and $6.96$ on LSUN Bedroom. Code is available at https://github.com/WLU-wry02/AdaSDE.
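To make the ODE/SDE interpolation the abstract alludes to concrete, the sketch below sets up a toy 1-D variance-exploding diffusion with a closed-form score and takes reverse steps whose stochasticity is controlled by a per-step coefficient `lam` (`lam = 0` recovers a deterministic probability-flow ODE step, `lam = 1` a full reverse-SDE step). This is an illustrative assumption-laden sketch of the general idea, not the AdaSDE update rule or its distillation procedure; all names here are hypothetical.

```python
import math
import random

# Toy setup: data ~ N(0, 1); variance-exploding forward process
# x_sigma = x_0 + sigma * z, so the marginal at noise level sigma is
# N(0, 1 + sigma^2) and the exact score is available in closed form.

def score(x, sigma):
    """Exact score d/dx log p_sigma(x) for the N(0, 1 + sigma^2) marginal."""
    return -x / (1.0 + sigma * sigma)

def blended_step(x, sigma, sigma_next, lam, rng=None):
    """One Euler step of the lam-interpolated reverse process:
    lam = 0 -> probability-flow ODE step (deterministic),
    lam = 1 -> reverse-time SDE step (full stochasticity)."""
    var_gap = sigma * sigma - sigma_next * sigma_next   # g(t)^2 * dt, > 0
    drift = 0.5 * (1.0 + lam * lam) * var_gap * score(x, sigma)
    noise = lam * math.sqrt(var_gap) * rng.gauss(0.0, 1.0) if lam > 0 else 0.0
    return x + drift + noise

def sample(lam, n_steps=100, sigma_max=10.0, sigma_min=0.01, rng=None):
    """Draw one sample by integrating from sigma_max down to sigma_min
    on a geometric noise schedule."""
    rng = rng or random.Random()
    sigmas = [sigma_max * (sigma_min / sigma_max) ** (i / n_steps)
              for i in range(n_steps + 1)]
    x = math.sqrt(1.0 + sigma_max ** 2) * rng.gauss(0.0, 1.0)  # prior draw
    for s, s_next in zip(sigmas, sigmas[1:]):
        x = blended_step(x, s, s_next, lam, rng)
    return x
```

With `lam` held fixed this reduces to standard deterministic or ancestral samplers; the abstract's proposal, as stated, is to learn one such coefficient per step via lightweight distillation so the stochasticity adapts to the step budget.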

Authors

Keywords

No keywords are indexed for this paper.

Context

Venue
Annual Conference on Neural Information Processing Systems
Archive span
1987-2025
Indexed papers
30776
Paper id
142130327022479756