Align Your Steps: Optimizing Sampling Schedules in Diffusion Models
- URL: http://arxiv.org/abs/2404.14507v1
- Date: Mon, 22 Apr 2024 18:18:41 GMT
- Title: Align Your Steps: Optimizing Sampling Schedules in Diffusion Models
- Authors: Amirmojtaba Sabour, Sanja Fidler, Karsten Kreis
- Abstract summary: Diffusion models (DMs) have established themselves as the state-of-the-art generative modeling approach in the visual domain and beyond.
A crucial drawback of DMs is their slow sampling speed, relying on many sequential function evaluations through large neural networks.
We propose a general and principled approach to optimizing the sampling schedules of DMs for high-quality outputs.
- Score: 63.927438959502226
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Diffusion models (DMs) have established themselves as the state-of-the-art generative modeling approach in the visual domain and beyond. A crucial drawback of DMs is their slow sampling speed, relying on many sequential function evaluations through large neural networks. Sampling from DMs can be seen as solving a differential equation through a discretized set of noise levels known as the sampling schedule. While past works primarily focused on deriving efficient solvers, little attention has been given to finding optimal sampling schedules, and the entire literature relies on hand-crafted heuristics. In this work, for the first time, we propose a general and principled approach to optimizing the sampling schedules of DMs for high-quality outputs, called $\textit{Align Your Steps}$. We leverage methods from stochastic calculus and find optimal schedules specific to different solvers, trained DMs and datasets. We evaluate our novel approach on several image, video as well as 2D toy data synthesis benchmarks, using a variety of different samplers, and observe that our optimized schedules outperform previous hand-crafted schedules in almost all experiments. Our method demonstrates the untapped potential of sampling schedule optimization, especially in the few-step synthesis regime.
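The abstract describes sampling as solving a differential equation through a discretized set of noise levels. As a concrete illustration of what a "hand-crafted sampling schedule" looks like, here is a minimal sketch of one widely used heuristic, the rho-spaced schedule of Karras et al. (EDM); the function name and defaults are illustrative, not taken from this paper, whose contribution is precisely to replace such heuristics with optimized schedules.

```python
import numpy as np

def karras_schedule(n_steps: int, sigma_min: float = 0.002,
                    sigma_max: float = 80.0, rho: float = 7.0) -> np.ndarray:
    """Hand-crafted noise-level schedule (Karras et al., EDM): interpolate
    linearly in sigma^(1/rho) space, then raise back to the rho-th power.
    This is the kind of heuristic schedule that Align Your Steps proposes
    to replace with solver- and dataset-specific optimized schedules."""
    ramp = np.linspace(0.0, 1.0, n_steps)
    inv_rho = 1.0 / rho
    sigmas = (sigma_max**inv_rho
              + ramp * (sigma_min**inv_rho - sigma_max**inv_rho))**rho
    return sigmas  # monotonically decreasing from sigma_max to sigma_min

sigmas = karras_schedule(10)
```

A solver then steps through these noise levels in order; with few steps, exactly where the levels are placed strongly affects sample quality, which is why schedule optimization pays off most in the few-step regime.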
Related papers
- Multi-Agent Sampling: Scaling Inference Compute for Data Synthesis with Tree Search-Based Agentic Collaboration [81.45763823762682]
This work aims to bridge the gap by investigating the problem of data synthesis through multi-agent sampling.
We introduce Tree Search-based Orchestrated Agents (TOA), where the workflow evolves iteratively during the sequential sampling process.
Our experiments on alignment, machine translation, and mathematical reasoning demonstrate that multi-agent sampling significantly outperforms single-agent sampling as inference compute scales.
arXiv Detail & Related papers (2024-12-22T15:16:44Z)
- Optimizing Few-Step Sampler for Diffusion Probabilistic Model [0.0]
Diffusion Probabilistic Models (DPMs) have demonstrated exceptional capability of generating high-quality and diverse images.
However, their practical application is hindered by the intensive computational cost during inference.
arXiv Detail & Related papers (2024-12-14T10:47:52Z)
- Arbitrary-steps Image Super-resolution via Diffusion Inversion [68.78628844966019]
This study presents a new image super-resolution (SR) technique based on diffusion inversion, aiming at harnessing the rich image priors encapsulated in large pre-trained diffusion models to improve SR performance.
We design a Partial noise Prediction strategy to construct an intermediate state of the diffusion model, which serves as the starting sampling point.
Once trained, this noise predictor can be used to initialize the sampling process partially along the diffusion trajectory, generating the desirable high-resolution result.
arXiv Detail & Related papers (2024-12-12T07:24:13Z)
- Score-Optimal Diffusion Schedules [29.062842062257918]
An appropriate discretisation schedule is crucial for obtaining high-quality samples.
This paper presents a novel algorithm for adaptively selecting an optimal discretisation schedule.
We find that our learned schedule recovers performant schedules previously only discovered through manual search.
arXiv Detail & Related papers (2024-12-10T19:26:51Z)
- Adaptive Non-Uniform Timestep Sampling for Diffusion Model Training [4.760537994346813]
As data distributions grow more complex, training diffusion models to convergence becomes increasingly intensive.
We introduce a non-uniform timestep sampling method that prioritizes the more critical timesteps.
Our method shows robust performance across various datasets, scheduling strategies, and diffusion architectures.
arXiv Detail & Related papers (2024-11-15T07:12:18Z)
- $\textit{Jump Your Steps}$: Optimizing Sampling Schedule of Discrete Diffusion Models [16.738569359216438]
We present $\textit{Jump Your Steps}$ (JYS), a novel approach that optimizes the allocation of discrete sampling timesteps by minimizing CDE without extra computational cost.
In experiments across image, music, and text generation, JYS significantly improves sampling quality.
arXiv Detail & Related papers (2024-10-10T09:44:25Z)
- Accelerating Diffusion Sampling with Optimized Time Steps [69.21208434350567]
Diffusion probabilistic models (DPMs) have shown remarkable performance in high-resolution image synthesis.
However, their sampling efficiency still leaves much to be desired due to the typically large number of sampling steps.
Recent advancements in high-order numerical ODE solvers for DPMs have enabled the generation of high-quality images with much fewer sampling steps.
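To make the connection between a noise-level schedule and an ODE solver concrete, here is a minimal sketch of Euler integration of a probability-flow ODE along a given schedule. The toy score function assumes the data distribution is a standard Gaussian (so the noisy marginal is $N(0, 1+\sigma^2)$ and the score is known in closed form); real samplers would call a trained network instead. All names here are illustrative.

```python
import numpy as np

def toy_score(x, sigma):
    # Exact score of N(0,1) data convolved with N(0, sigma^2) noise:
    # the noisy marginal is N(0, 1 + sigma^2), whose score is -x / (1 + sigma^2).
    return -x / (1.0 + sigma**2)

def euler_sample(sigmas, n_samples=5000, seed=0):
    # Probability-flow ODE in the sigma parameterization:
    #   dx/dsigma = -sigma * score(x, sigma)
    # integrated with Euler steps along the discretized schedule `sigmas`.
    rng = np.random.default_rng(seed)
    x = rng.normal(0.0, np.sqrt(1.0 + sigmas[0]**2), size=n_samples)
    for s_cur, s_next in zip(sigmas[:-1], sigmas[1:]):
        d = -s_cur * toy_score(x, s_cur)   # ODE drift at the current noise level
        x = x + (s_next - s_cur) * d       # one Euler step to the next level
    return x

# A simple hand-crafted schedule: geometric spacing from high to low noise.
sigmas = np.concatenate([np.geomspace(80.0, 0.002, 20), [0.0]])
samples = euler_sample(sigmas)
```

With the true score the final samples approximate the data distribution (here, a unit Gaussian); with few steps, the placement of the intermediate noise levels dominates the discretization error, which is what schedule optimization targets.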
arXiv Detail & Related papers (2024-02-27T10:13:30Z)
- Model ensemble instead of prompt fusion: a sample-specific knowledge transfer method for few-shot prompt tuning [85.55727213502402]
We focus on improving the few-shot performance of prompt tuning by transferring knowledge from soft prompts of source tasks.
We propose Sample-specific Ensemble of Source Models (SESoM).
SESoM learns to adjust the contribution of each source model for each target sample separately when ensembling source model outputs.
arXiv Detail & Related papers (2022-10-23T01:33:16Z)
- AutoSampling: Search for Effective Data Sampling Schedules [118.20014773014671]
We propose an AutoSampling method to automatically learn sampling schedules for model training.
We apply our method to a variety of image classification tasks, illustrating the effectiveness of the proposed method.
arXiv Detail & Related papers (2021-05-28T09:39:41Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.