GAS: Improving Discretization of Diffusion ODEs via Generalized Adversarial Solver
- URL: http://arxiv.org/abs/2510.17699v1
- Date: Mon, 20 Oct 2025 16:14:38 GMT
- Title: GAS: Improving Discretization of Diffusion ODEs via Generalized Adversarial Solver
- Authors: Aleksandr Oganov, Ilya Bykov, Eva Neudachina, Mishan Aliev, Alexander Tolmachev, Alexander Sidorov, Aleksandr Zuev, Andrey Okhotin, Denis Rakitin, Aibek Alanov
- Abstract summary: We introduce a simple parameterization of the ODE sampler that does not require additional training tricks and improves quality over existing approaches. We further combine the original distillation loss with adversarial training, which mitigates artifacts and enhances detail fidelity.
- Score: 120.67680383929081
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: While diffusion models achieve state-of-the-art generation quality, they still suffer from computationally expensive sampling. Recent works address this issue with gradient-based optimization methods that distill a few-step ODE diffusion solver from the full sampling process, reducing the number of function evaluations from dozens to just a few. However, these approaches often rely on intricate training techniques and do not explicitly focus on preserving fine-grained details. In this paper, we introduce the Generalized Solver: a simple parameterization of the ODE sampler that does not require additional training tricks and improves quality over existing approaches. We further combine the original distillation loss with adversarial training, which mitigates artifacts and enhances detail fidelity. We call the resulting method the Generalized Adversarial Solver and demonstrate its superior performance compared to existing solver training methods under similar resource constraints. Code is available at https://github.com/3145tttt/GAS.
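For intuition, here is a minimal sketch of the idea the abstract describes: replace a hand-derived ODE update rule with a small set of learnable per-step coefficients, fit them so the few-step sampler matches a many-step teacher, and add an adversarial term to recover fine detail. All names (`GeneralizedSolver`, `gas_loss`, the coefficient layout) are illustrative assumptions, not the authors' implementation; the actual code is in the linked repository.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class GeneralizedSolver(nn.Module):
    """Hypothetical sketch: learnable per-step ODE update coefficients."""
    def __init__(self, num_steps: int, history: int = 2):
        super().__init__()
        # One coefficient per retained model output, per sampling step.
        self.coeffs = nn.Parameter(torch.zeros(num_steps, history + 1))
        with torch.no_grad():
            self.coeffs[:, 0] = 1.0  # initialize to an Euler-like scheme

    def step(self, x, outputs, i, dt):
        # outputs: the most recent model evaluations, newest first.
        c = self.coeffs[i]
        drift = sum(c[k] * outputs[k] for k in range(len(outputs)))
        return x + dt * drift

def gas_loss(student_x0, teacher_x0, disc, adv_weight=0.1):
    # Distillation term: match the few-step endpoint to the teacher's.
    distill = (student_x0 - teacher_x0).pow(2).mean()
    # Non-saturating adversarial term against a discriminator `disc`,
    # which the abstract credits with reducing artifacts.
    adv = -F.logsigmoid(disc(student_x0)).mean()
    return distill + adv_weight * adv
```

In this reading, training touches only a handful of solver coefficients while the diffusion network stays frozen, which is consistent with the abstract's claim that no additional training tricks are required.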
Related papers
- Test-Time Scaling with Diffusion Language Models via Reward-Guided Stitching [66.39914384073145]
We propose a self-consistency framework that turns cheap diffusion-sampled reasoning into a reusable pool of step-level candidates. We find that step-level recombination is most beneficial on harder problems. Our training-free framework improves average accuracy by up to 2 across six math and coding tasks.
arXiv Detail & Related papers (2026-02-26T11:08:39Z) - Parallel Diffusion Solver via Residual Dirichlet Policy Optimization [88.7827307535107]
Diffusion models (DMs) have achieved state-of-the-art generative performance but suffer from high sampling latency due to their sequential denoising nature. Existing solver-based acceleration methods often face significant image quality degradation under a low-latency budget. We propose the Ensemble Parallel Direction solver (dubbed EPD-EPr), a novel ODE solver that mitigates these errors by incorporating multiple parallel gradient evaluations in each step.
arXiv Detail & Related papers (2025-12-28T05:48:55Z) - Equivariant Sampling for Improving Diffusion Model-based Image Restoration [25.06154860408637]
We introduce EquS, a DMIR method that imposes equivariant information through dual sampling trajectories. To further boost EquS, we propose the Timestep-Aware Schedule (TAS) and introduce EquS+. Our method is compatible with previous problem-agnostic DMIR methods and significantly boosts their performance without increasing computational costs.
arXiv Detail & Related papers (2025-11-13T04:56:53Z) - Distilling Parallel Gradients for Fast ODE Solvers of Diffusion Models [53.087070073434845]
Diffusion models (DMs) have achieved state-of-the-art generative performance but suffer from high sampling latency due to their sequential denoising nature. Existing solver-based acceleration methods often face image quality degradation under a low-latency budget. We propose the Ensemble Parallel Direction solver, a novel ODE solver that mitigates truncation errors by incorporating multiple parallel gradient evaluations in each ODE step (a hedged sketch of this parallel-evaluation idea appears after this list).
arXiv Detail & Related papers (2025-07-20T03:08:06Z) - TADA: Improved Diffusion Sampling with Training-free Augmented Dynamics [40.75121059939763]
We introduce a new sampling method that is up to 186% faster than the current state-of-the-art solver at comparable FID on ImageNet512. The key to our method resides in using higher-dimensional initial noise, allowing it to produce more detailed samples.
arXiv Detail & Related papers (2025-06-26T20:30:27Z) - ReGuidance: A Simple Diffusion Wrapper for Boosting Sample Quality on Hard Inverse Problems [10.698572109242434]
We devise a simple wrapper, ReGuidance, for boosting both the sample realism and reward achieved by training-free methods. We evaluate our wrapper on hard inverse problems such as large-box inpainting and super-resolution with high upscaling factors.
arXiv Detail & Related papers (2025-06-12T17:55:17Z) - DGSolver: Diffusion Generalist Solver with Universal Posterior Sampling for Image Restoration [49.16449955997501]
DGSolver is a diffusion generalist solver with universal posterior sampling. Code and models will be available at https://github.com/MiliLab/DGr.
arXiv Detail & Related papers (2025-04-30T10:12:48Z) - Fast Samplers for Inverse Problems in Iterative Refinement Models [19.099632445326826]
We propose a plug-and-play framework for constructing efficient samplers for inverse problems.
Our method can generate high-quality samples in as few as 5 conditional sampling steps and outperforms competing baselines requiring 20-1000 steps.
arXiv Detail & Related papers (2024-05-27T21:50:16Z) - Gaussian Mixture Solvers for Diffusion Models [84.83349474361204]
We introduce a novel class of SDE-based solvers for diffusion models, called Gaussian Mixture Solvers (GMS).
Our solver outperforms numerous SDE-based solvers in terms of sample quality in image generation and stroke-based synthesis.
arXiv Detail & Related papers (2023-11-02T02:05:38Z) - Distilling ODE Solvers of Diffusion Models into Smaller Steps [32.49916706943228]
We introduce Distilled-ODE solvers, a straightforward distillation approach grounded in ODE solver formulations.
Our method seamlessly integrates the strengths of both learning-free and learning-based sampling.
Our method incurs negligible computational overhead compared to previous distillation techniques.
arXiv Detail & Related papers (2023-09-28T13:12:18Z) - Improving Diffusion Models for Inverse Problems using Manifold Constraints [55.91148172752894]
We show that current solvers throw the sample path off the data manifold, and hence the error accumulates.
To address this, we propose an additional correction term inspired by the manifold constraint (a sketch of this kind of correction also appears after the list).
We show that our method is superior to the previous methods both theoretically and empirically.
arXiv Detail & Related papers (2022-06-02T09:06:10Z)
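As referenced in the "Distilling Parallel Gradients" entry above, the parallel-gradient idea can be pictured as replacing the single drift evaluation of an ODE step with a learned convex combination of several evaluations that can run in parallel. A hedged sketch under that reading; `EPDStep`, `taus`, and `weights` are illustrative names, and evaluating all directions at the same state is a simplification of what the paper actually distills.

```python
import torch
import torch.nn as nn

class EPDStep(nn.Module):
    """Sketch: one ODE step built from several parallel drift evaluations."""
    def __init__(self, num_dirs: int = 3):
        super().__init__()
        # Learned intermediate time offsets and combination weights.
        self.taus = nn.Parameter(torch.linspace(0.0, 1.0, num_dirs))
        self.weights = nn.Parameter(torch.full((num_dirs,), 1.0 / num_dirs))

    def forward(self, x, t, dt, model):
        # The evaluations are mutually independent, so they can be
        # batched or dispatched to parallel workers.
        drifts = [model(x, t + tau * dt) for tau in self.taus]
        w = torch.softmax(self.weights, dim=0)  # keep the combination convex
        drift = sum(wi * di for wi, di in zip(w, drifts))
        return x + dt * drift
```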
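Likewise, the manifold-constraint correction from the last entry amounts to one extra gradient step per reverse iteration: estimate the clean sample, measure its consistency with the observation, and nudge the update along that gradient. A minimal sketch assuming a DDPM-style schedule and a differentiable measurement operator `A`; the variable names are illustrative, not the paper's code.

```python
import torch

def manifold_corrected_step(x_t, y, t, model, A, alphas, alpha_bar, step_size=1.0):
    """One reverse-diffusion step with a manifold-constraint correction (sketch)."""
    x_t = x_t.detach().requires_grad_(True)
    eps = model(x_t, t)
    # Tweedie / posterior-mean estimate of the clean sample x0.
    x0_hat = (x_t - (1 - alpha_bar[t]).sqrt() * eps) / alpha_bar[t].sqrt()
    # Measurement-consistency residual, differentiated w.r.t. the noisy state.
    grad = torch.autograd.grad((y - A(x0_hat)).pow(2).sum(), x_t)[0]
    # Standard DDPM mean update (noise term omitted for brevity).
    mean = (x_t - (1 - alphas[t]) / (1 - alpha_bar[t]).sqrt() * eps) / alphas[t].sqrt()
    # The correction pulls the trajectory back toward the data manifold.
    return mean - step_size * grad
```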
This list is automatically generated from the titles and abstracts of the papers on this site.