GENIE: Higher-Order Denoising Diffusion Solvers
- URL: http://arxiv.org/abs/2210.05475v1
- Date: Tue, 11 Oct 2022 14:18:28 GMT
- Title: GENIE: Higher-Order Denoising Diffusion Solvers
- Authors: Tim Dockhorn, Arash Vahdat, Karsten Kreis
- Abstract summary: Denoising diffusion models (DDMs) have emerged as a powerful class of generative models.
A forward diffusion process slowly perturbs the data, while a deep model learns to gradually denoise.
Solving the differential equation (DE) defined by the learnt model requires slow iterative solvers for high-quality generation.
We propose a novel higher-order solver that significantly accelerates synthesis.
- Score: 19.79516951865819
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Denoising diffusion models (DDMs) have emerged as a powerful class of
generative models. A forward diffusion process slowly perturbs the data, while
a deep model learns to gradually denoise. Synthesis amounts to solving a
differential equation (DE) defined by the learnt model. Solving the DE requires
slow iterative solvers for high-quality generation. In this work, we propose
Higher-Order Denoising Diffusion Solvers (GENIE): Based on truncated Taylor
methods, we derive a novel higher-order solver that significantly accelerates
synthesis. Our solver relies on higher-order gradients of the perturbed data
distribution, that is, higher-order score functions. In practice, only
Jacobian-vector products (JVPs) are required and we propose to extract them
from the first-order score network via automatic differentiation. We then
distill the JVPs into a separate neural network that allows us to efficiently
compute the necessary higher-order terms for our novel sampler during
synthesis. We only need to train a small additional head on top of the
first-order score network. We validate GENIE on multiple image generation
benchmarks and demonstrate that GENIE outperforms all previous solvers. Unlike
recent methods that fundamentally alter the generation process in DDMs, our
GENIE solves the true generative DE and still enables applications such as
encoding and guided sampling. Project page and code:
https://nv-tlabs.github.io/GENIE.
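The mechanics can be sketched compactly: GENIE takes a truncated Taylor step of the generative ODE, and the required higher-order term is the total derivative of the drift, obtainable as a single Jacobian-vector product through the first-order score model. The snippet below is a minimal sketch, not the released implementation: `score` and `drift` are toy stand-ins, and the real drift combines the score with the diffusion schedule's coefficients.

```python
import jax
import jax.numpy as jnp

def score(x, t):
    # Toy stand-in for the trained first-order score network s_theta(x, t).
    return -x / (1.0 + t)

def drift(x, t):
    # Simplified ODE drift; GENIE's drift mixes the score with the
    # schedule's drift and diffusion coefficients.
    return score(x, t)

def second_order_taylor_step(x, t, h):
    # Truncated Taylor step of order 2:
    #   x_{t+h} ~= x_t + h*f + (h^2 / 2) * df/dt,
    # where the total derivative df/dt = f_t + f_x * f comes from a
    # single jvp call via automatic differentiation.
    f = drift(x, t)
    _, df_dt = jax.jvp(drift, (x, t), (f, jnp.ones_like(t)))
    return x + h * f + 0.5 * h**2 * df_dt

x = jnp.ones((4,))                                    # toy state
x_next = second_order_taylor_step(x, jnp.asarray(0.8), -0.05)
```

Per the abstract, the full method does not run this automatic differentiation at sampling time: the JVP is distilled into a small additional head on the score network, which is what makes the higher-order terms cheap during synthesis.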
Related papers
- Learning-Order Autoregressive Models with Application to Molecular Graph Generation [52.44913282062524]
We introduce a variant of ARM that generates high-dimensional data using a probabilistic ordering that is sequentially inferred from data.
We demonstrate experimentally that our method can learn meaningful autoregressive orderings in image and graph generation.
arXiv Detail & Related papers (2025-03-07T23:24:24Z)
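The LO-ARM idea above, inferring the generation order from data rather than fixing it, can be illustrated with a toy sampler. This is a minimal sketch under assumptions: `order_logits_fn` and `value_logits_fn` are hypothetical stand-ins for the learned ordering policy and conditional model (uniform dummies in the usage below), and nothing about the paper's actual architecture is implied.

```python
import jax
import jax.numpy as jnp

def sample_with_inferred_order(key, order_logits_fn, value_logits_fn, dim):
    # At every step the ordering policy scores the positions that are still
    # empty, one position is sampled, and its value is then drawn from the
    # conditional model given the partial sample.
    x = jnp.zeros((dim,), dtype=jnp.int32)
    filled = jnp.zeros((dim,), dtype=bool)
    for _ in range(dim):
        key, k_pos, k_val = jax.random.split(key, 3)
        pos_logits = jnp.where(filled, -jnp.inf, order_logits_fn(x, filled))
        pos = jax.random.categorical(k_pos, pos_logits)
        val = jax.random.categorical(k_val, value_logits_fn(x, filled, pos))
        x = x.at[pos].set(val)
        filled = filled.at[pos].set(True)
    return x

key = jax.random.PRNGKey(0)
dim, vocab = 8, 4
sample = sample_with_inferred_order(
    key,
    lambda x, filled: jnp.zeros(dim),          # dummy uniform ordering policy
    lambda x, filled, pos: jnp.zeros(vocab),   # dummy uniform conditional
    dim)
```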
- Distributional Diffusion Models with Scoring Rules [83.38210785728994]
Diffusion models generate high-quality synthetic data, but doing so requires many discretization steps.
We propose to accomplish sample generation by learning the posterior distribution of clean data samples.
arXiv Detail & Related papers (2025-02-04T16:59:03Z)
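A standard way to train a model of a posterior distribution with a scoring rule is the energy score, a strictly proper rule whose Monte-Carlo estimate needs only samples. The sketch below shows that generic estimator; it is an assumption-level example of a scoring-rule loss, not necessarily the objective used in the paper.

```python
import jax.numpy as jnp

def energy_score(samples, y):
    # Monte-Carlo energy score of samples X_1..X_n ~ P against a target y:
    #   ES(P, y) = E||X - y|| - 0.5 * E||X - X'||   (lower is better).
    # Minimizing it in expectation drives P toward the true posterior.
    fit = jnp.mean(jnp.linalg.norm(samples - y, axis=-1))
    spread = jnp.mean(jnp.linalg.norm(
        samples[:, None, :] - samples[None, :, :], axis=-1))
    return fit - 0.5 * spread

samples = jnp.ones((8, 3)) * 0.5        # toy posterior samples
print(energy_score(samples, jnp.zeros(3)))
```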
- An Efficient Diffusion-based Non-Autoregressive Solver for Traveling Salesman Problem [21.948190231334088]
We propose DEITSP, a diffusion model with efficient iterations tailored for Traveling Salesman Problems.
We introduce a one-step diffusion model that integrates the controlled discrete noise addition process with self-consistency enhancement.
We also design a dual-modality graph transformer to bolster the extraction and fusion of features from node and edge modalities.
arXiv Detail & Related papers (2025-01-23T15:47:04Z)
- Think While You Generate: Discrete Diffusion with Planned Denoising [10.797958380377509]
We introduce Discrete Diffusion with Planned Denoising (DDPD), a novel framework that separates the generation process into two models: a planner and a denoiser.
DDPD outperforms traditional denoiser-only mask diffusion methods, achieving superior results on language modeling benchmarks.
Notably, in language modeling, DDPD significantly reduces the performance gap between diffusion-based and autoregressive methods in terms of generative perplexity.
arXiv Detail & Related papers (2024-10-08T18:03:34Z)
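The planner/denoiser split reads naturally as code: the planner decides which position to fix next, the denoiser fills it in. Below is a toy single step, a sketch under assumptions: `planner_fn` and `denoiser_fn` are hypothetical stand-ins for the two learned models, and simple mask corruption plays the role of the noise process.

```python
import jax
import jax.numpy as jnp

MASK = 0  # token id reserved for "still noisy"

def planned_denoising_step(key, tokens, planner_fn, denoiser_fn):
    # Planner: score which of the still-masked positions to denoise next.
    # Denoiser: predict a clean token for the chosen position.
    k_pos, k_tok = jax.random.split(key)
    pos_logits = jnp.where(tokens == MASK, planner_fn(tokens), -jnp.inf)
    pos = jax.random.categorical(k_pos, pos_logits)
    tok = jax.random.categorical(k_tok, denoiser_fn(tokens, pos))
    return tokens.at[pos].set(tok)

key = jax.random.PRNGKey(0)
tokens = jnp.array([MASK, 5, MASK, 2])
vocab = 8
tokens = planned_denoising_step(
    key, tokens,
    lambda t: jnp.zeros(t.shape[0]),           # dummy uniform planner
    lambda t, p: jnp.zeros(vocab))             # dummy uniform denoiser
```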
- Derivative-Free Guidance in Continuous and Discrete Diffusion Models with Soft Value-Based Decoding [84.3224556294803]
Diffusion models excel at capturing the natural design spaces of images, molecules, DNA, RNA, and protein sequences.
We aim to optimize downstream reward functions while preserving the naturalness of these design spaces.
Our algorithm integrates soft value functions, which look ahead to how intermediate noisy states lead to high rewards in the future.
arXiv Detail & Related papers (2024-08-15T16:47:59Z)
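A common derivative-free way to exploit such look-ahead value functions is to reweight candidate intermediate states by exp(v(x)/alpha) and resample, so no reward gradients are ever needed. The sketch below shows that generic pattern with a hypothetical `value_fn`; it illustrates soft value-weighted resampling, not the paper's exact decoding algorithm.

```python
import jax
import jax.numpy as jnp

def value_guided_resample(key, candidates, value_fn, alpha=1.0):
    # Weight each candidate noisy state by exp(v(x)/alpha), where v(x)
    # estimates the downstream reward, then resample with replacement.
    values = jax.vmap(value_fn)(candidates)
    idx = jax.random.categorical(key, values / alpha,
                                 shape=(candidates.shape[0],))
    return candidates[idx]

key = jax.random.PRNGKey(0)
cands = jax.random.normal(key, (16, 4))               # toy candidate states
kept = value_guided_resample(key, cands, lambda x: -jnp.sum(x ** 2))
```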
- Gaussian Mixture Solvers for Diffusion Models [84.83349474361204]
We introduce Gaussian Mixture Solvers (GMS), a novel class of SDE-based solvers for diffusion models.
Our solver outperforms numerous SDE-based solvers in terms of sample quality in image generation and stroke-based synthesis.
arXiv Detail & Related papers (2023-11-02T02:05:38Z)
- Using Intermediate Forward Iterates for Intermediate Generator Optimization [14.987013151525368]
Intermediate Generator Optimization (IGO) can be incorporated into any standard autoencoder pipeline for the generative task.
We show applications of IGO on two dense predictive tasks: image extrapolation and point cloud denoising.
arXiv Detail & Related papers (2023-02-05T08:46:15Z)
- DORE: Document Ordered Relation Extraction based on Generative Framework [56.537386636819626]
This paper investigates the root cause of the underwhelming performance of the existing generative DocRE models.
We propose to generate a symbolic and ordered sequence from the relation matrix, which is deterministic and easier for the model to learn.
Experimental results on four datasets show that our proposed method can improve the performance of the generative DocRE models.
arXiv Detail & Related papers (2022-10-28T11:18:10Z)
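The DORE entry above hinges on replacing free-form generation with a deterministic, ordered linearization of the relation matrix. The sketch below shows one natural such linearization (row-major traversal emitting symbolic triples); the tokens and ordering are illustrative assumptions, not the paper's exact vocabulary.

```python
def relation_matrix_to_sequence(rel, no_relation=0):
    # Linearize an entity-pair relation matrix into a symbolic sequence of
    # (head, relation, tail) triples in fixed row-major order, so the
    # target sequence is deterministic and easy for a seq2seq model to learn.
    seq = []
    n = len(rel)
    for i in range(n):
        for j in range(n):
            if i != j and rel[i][j] != no_relation:
                seq.extend([f"<e{i}>", f"<r{rel[i][j]}>", f"<e{j}>"])
    return seq

rel = [[0, 3, 0],
       [0, 0, 1],
       [0, 0, 0]]
print(relation_matrix_to_sequence(rel))
# ['<e0>', '<r3>', '<e1>', '<e1>', '<r1>', '<e2>']
```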
- Highly Parallel Autoregressive Entity Linking with Discriminative Correction [51.947280241185]
We propose a very efficient approach that parallelizes autoregressive linking across all potential mentions.
Our model is >70 times faster and more accurate than the previous generative method.
arXiv Detail & Related papers (2021-09-08T17:28:26Z)
- Gotta Go Fast When Generating Data with Score-Based Models [25.6996532735215]
Current score-based models generate data very slowly due to the sheer number of score network evaluations required by numerical SDE solvers.
We devise an SDE solver with adaptive step sizes tailored to score-based generative models piece by piece.
Our solver requires only two score function evaluations, rarely rejects samples, and leads to high-quality samples.
arXiv Detail & Related papers (2021-05-28T19:48:51Z)
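Adaptive step sizes from two drift evaluations are typically realized with an embedded Euler/Heun pair: both proposals share the same evaluations and their difference estimates the local error. The sketch below shows that generic controller for the deterministic part only; it is in the spirit of the paper, not its exact solver, which also handles the stochastic term.

```python
import jax.numpy as jnp

def adaptive_step(x, t, h, f, tol=1e-2, safety=0.9):
    # Embedded Euler/Heun pair: two drift evaluations, two proposals.
    k1 = f(x, t)
    x_euler = x + h * k1
    k2 = f(x_euler, t + h)
    x_heun = x + 0.5 * h * (k1 + k2)
    # Normalized local-error estimate; accept when within tolerance, and
    # rescale h with the usual order-matched exponent either way.
    err = jnp.linalg.norm(x_heun - x_euler) / (tol * (1.0 + jnp.linalg.norm(x)))
    accept = bool(err <= 1.0)
    h_next = safety * h * float(jnp.clip(err, 1e-8, None)) ** -0.5
    return (x_heun if accept else x), (t + h if accept else t), h_next

x, t, h = jnp.ones((4,)), 1.0, -0.05
x, t, h = adaptive_step(x, t, h, lambda x, t: -x)     # toy drift
```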
- Score-Based Generative Modeling through Stochastic Differential Equations [114.39209003111723]
We present a differential equation that transforms a complex data distribution to a known prior distribution by injecting noise.
A corresponding reverse-time SDE transforms the prior distribution back into the data distribution by slowly removing the noise.
By leveraging advances in score-based generative modeling, we can accurately estimate these scores with neural networks.
We demonstrate high fidelity generation of 1024 x 1024 images for the first time from a score-based generative model.
arXiv Detail & Related papers (2020-11-26T19:39:10Z)
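The reverse-time SDE this paper introduces, dx = [f(x, t) - g(t)^2 * score(x, t)] dt + g(t) dw_bar, can be simulated with a plain Euler-Maruyama step. Below is a minimal sketch where `score_fn`, `f_fn`, and `g_fn` are placeholder callables for the learned score and a chosen noise schedule.

```python
import jax
import jax.numpy as jnp

def reverse_sde_step(key, x, t, dt, score_fn, f_fn, g_fn):
    # One Euler-Maruyama step of the reverse-time SDE, run backward
    # in time (dt < 0): the score term steers noise back toward data.
    rev_drift = f_fn(x, t) - g_fn(t) ** 2 * score_fn(x, t)
    noise = jax.random.normal(key, x.shape)
    return x + rev_drift * dt + g_fn(t) * jnp.sqrt(-dt) * noise

key = jax.random.PRNGKey(0)
x = jax.random.normal(key, (4,))                      # start from the prior
x = reverse_sde_step(key, x, t=1.0, dt=-0.01,
                     score_fn=lambda x, t: -x,             # toy score
                     f_fn=lambda x, t: jnp.zeros_like(x),  # toy drift
                     g_fn=lambda t: jnp.sqrt(2.0 * t))     # toy schedule
```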
- Communication-Efficient Distributed Stochastic AUC Maximization with Deep Neural Networks [50.42141893913188]
We study distributed stochastic AUC maximization at large scale with a deep neural network as the predictive model.
Our algorithm requires far fewer communication rounds while retaining theoretical convergence guarantees.
Experiments on several datasets demonstrate the effectiveness of our algorithm and confirm our theory.
arXiv Detail & Related papers (2020-05-05T18:08:23Z)
- A game-theoretic approach for Generative Adversarial Networks [2.995087247817663]
Generative adversarial networks (GANs) are a class of generative models, known for producing accurate samples.
The main bottleneck in their implementation is that the neural networks are very hard to train.
We propose a relaxed forward-backward algorithm for GANs.
We prove that when the pseudogradient mapping of the game is monotone, we obtain convergence to an exact solution or to a neighbourhood of it.
arXiv Detail & Related papers (2020-03-30T17:14:41Z)
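To see what monotonicity of the pseudogradient buys, consider the bilinear toy game min_x max_y x*y, whose pseudogradient F(x, y) = (y, -x) is monotone and makes plain simultaneous gradient descent-ascent cycle. The sketch below uses the extragradient method, a close relative of the relaxed forward-backward scheme (named plainly: this is extragradient, not the paper's exact algorithm), which does converge here.

```python
import jax.numpy as jnp

def pseudogradient(z):
    # Bilinear game min_x max_y x*y: F(x, y) = (y, -x), a monotone operator.
    x, y = z
    return jnp.array([y, -x])

def extragradient_step(z, gamma=0.1):
    # Probe half a step ahead, then update using the pseudogradient
    # evaluated at the probe point; this breaks the cycling of plain
    # gradient descent-ascent on monotone games.
    z_half = z - gamma * pseudogradient(z)
    return z - gamma * pseudogradient(z_half)

z = jnp.array([1.0, 1.0])
for _ in range(1000):
    z = extragradient_step(z)
print(z)   # converges toward the saddle point (0, 0)
```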
This list is automatically generated from the titles and abstracts of the papers in this site.