New algorithms for sampling and diffusion models
- URL: http://arxiv.org/abs/2406.09665v2
- Date: Wed, 10 Jul 2024 22:57:08 GMT
- Title: New algorithms for sampling and diffusion models
- Authors: Xicheng Zhang
- Abstract summary: We introduce a novel sampling method for known distributions and a new algorithm for diffusion generative models with unknown distributions.
Our approach is inspired by the concept of the reverse diffusion process, widely adopted in diffusion generative models.
- Score: 0.0
- License: http://creativecommons.org/publicdomain/zero/1.0/
- Abstract: Drawing from the theory of stochastic differential equations, we introduce a novel sampling method for known distributions and a new algorithm for diffusion generative models with unknown distributions. Our approach is inspired by the concept of the reverse diffusion process, widely adopted in diffusion generative models. Additionally, we derive the explicit convergence rate based on the smooth ODE flow. For diffusion generative models and sampling, we establish a dimension-free particle approximation convergence result. Numerical experiments demonstrate the effectiveness of our method. Notably, unlike the traditional Langevin method, our sampling method does not require any regularity assumptions about the density function of the target distribution. Furthermore, we apply our method to optimization problems.
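The reverse-diffusion idea behind the abstract can be illustrated with a toy sketch (this is an illustrative example, not the paper's algorithm; all distributions, parameters, and function names below are assumptions). For a known 1-D Gaussian mixture target, the score of the diffused marginals under an Ornstein-Uhlenbeck forward process is available in closed form, so the reverse-time SDE can be simulated directly without any learned network:

```python
# Illustrative sketch: sample a known 1-D Gaussian mixture by simulating
# the time-reversed Ornstein-Uhlenbeck diffusion dX = -X dt + sqrt(2) dW,
# whose diffused score is available in closed form for this target.
import numpy as np

rng = np.random.default_rng(0)

# Target: mixture of N(-2, 0.5^2) and N(3, 1.0^2) with weights 0.3 / 0.7.
w  = np.array([0.3, 0.7])
mu = np.array([-2.0, 3.0])
sd = np.array([0.5, 1.0])

def score_t(x, t):
    """Score d/dx log p_t(x) of the mixture diffused by the OU process
    (stationary law N(0, 1)); the diffused marginal stays a mixture."""
    m  = np.exp(-t) * mu                                 # diffused means
    s2 = np.exp(-2 * t) * sd**2 + 1.0 - np.exp(-2 * t)   # diffused variances
    x = x[:, None]
    comp = w * np.exp(-0.5 * (x - m) ** 2 / s2) / np.sqrt(2 * np.pi * s2)
    grad = comp * (-(x - m) / s2)
    return grad.sum(axis=1) / comp.sum(axis=1)

def reverse_diffusion_sample(n, T=5.0, steps=500):
    """Euler-Maruyama on the reverse-time SDE
    dY = [Y + 2 * score_{T-s}(Y)] ds + sqrt(2) dW,
    started from the (near-)stationary law N(0, 1)."""
    dt = T / steps
    y = rng.standard_normal(n)
    for k in range(steps):
        t = T - k * dt                     # remaining forward time
        drift = y + 2.0 * score_t(y, t)
        y = y + drift * dt + np.sqrt(2 * dt) * rng.standard_normal(n)
    return y

samples = reverse_diffusion_sample(20_000)
print(samples.mean(), samples.var())   # target mixture: mean 1.5, variance ~ 6.0
```

Note that, as the abstract emphasizes, no regularity of the target density is used beyond evaluating its diffused score; for unknown targets the closed-form `score_t` would be replaced by a learned score network.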
Related papers
- G2D2: Gradient-guided Discrete Diffusion for image inverse problem solving [55.185588994883226]
This paper presents a novel method for addressing linear inverse problems by leveraging image-generation models based on discrete diffusion as priors.
To the best of our knowledge, this is the first approach to use discrete diffusion model-based priors for solving image inverse problems.
arXiv Detail & Related papers (2024-10-09T06:18:25Z)
- Amortized Posterior Sampling with Diffusion Prior Distillation [55.03585818289934]
We propose a variational inference approach to sample from the posterior distribution for solving inverse problems.
We show that our method is applicable to standard signals in Euclidean space, as well as signals on manifolds.
arXiv Detail & Related papers (2024-07-25T09:53:12Z)
- Improving Probabilistic Diffusion Models With Optimal Covariance Matching [27.2761325416843]
We introduce a novel method for learning the diagonal covariances.
We show how our method can substantially enhance the sampling efficiency, recall rate, and likelihood of both diffusion models and latent diffusion models.
arXiv Detail & Related papers (2024-06-16T05:47:12Z)
- Improved off-policy training of diffusion samplers [93.66433483772055]
We study the problem of training diffusion models to sample from a distribution with an unnormalized density or energy function.
We benchmark several diffusion-structured inference methods, including simulation-based variational approaches and off-policy methods.
Our results shed light on the relative advantages of existing algorithms while bringing into question some claims from past work.
arXiv Detail & Related papers (2024-02-07T18:51:49Z)
- Fast Sampling via Discrete Non-Markov Diffusion Models [49.598085130313514]
We propose a discrete non-Markov diffusion model, which admits an accelerated reverse sampling for discrete data generation.
Our method significantly reduces the number of function evaluations (i.e., calls to the neural network), making the sampling process much faster.
arXiv Detail & Related papers (2023-12-14T18:14:11Z)
- Fast Diffusion EM: a diffusion model for blind inverse problems with application to deconvolution [0.0]
Current methods assume the degradation to be known and provide impressive results in terms of restoration and diversity.
In this work, we leverage the efficiency of those models to jointly estimate the restored image and unknown parameters of the kernel model.
Our method alternates between approximating the expected log-likelihood of the problem using samples drawn from a diffusion model and a step to estimate unknown model parameters.
arXiv Detail & Related papers (2023-09-01T06:47:13Z)
- A Geometric Perspective on Diffusion Models [57.27857591493788]
We inspect the ODE-based sampling of a popular variance-exploding SDE.
We establish a theoretical relationship between the optimal ODE-based sampling and the classic mean-shift (mode-seeking) algorithm.
arXiv Detail & Related papers (2023-05-31T15:33:16Z)
- Fast Sampling of Diffusion Models via Operator Learning [74.37531458470086]
We use neural operators, an efficient approach to solving probability-flow differential equations, to accelerate the sampling process of diffusion models.
Unlike other fast sampling methods, which are sequential in nature, ours is the first parallel decoding method.
We show our method achieves state-of-the-art FID of 3.78 for CIFAR-10 and 7.83 for ImageNet-64 in the one-model-evaluation setting.
arXiv Detail & Related papers (2022-11-24T07:30:27Z)
- An optimal control perspective on diffusion-based generative modeling [9.806130366152194]
We establish a connection between optimal control and generative models based on stochastic differential equations (SDEs).
In particular, we derive a Hamilton-Jacobi-Bellman equation that governs the evolution of the log-densities of the underlying SDE marginals.
We develop a novel diffusion-based method for sampling from unnormalized densities.
arXiv Detail & Related papers (2022-11-02T17:59:09Z)
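The Hamilton-Jacobi-Bellman connection mentioned in the last entry admits a short standard sketch (the notation below is a common convention, not taken from that paper): write the Fokker-Planck equation for the marginal density $p_t$ of the SDE $dX_t = f(X_t, t)\,dt + \sigma\,dW_t$ and apply the Hopf-Cole substitution $V_t = -\log p_t$.

```latex
% Fokker--Planck equation for the marginal density p_t:
\partial_t p_t = -\nabla \cdot (f\, p_t) + \tfrac{\sigma^2}{2}\,\Delta p_t .
% Substituting V_t = -\log p_t, using
% \nabla p_t = -p_t \nabla V_t  and  \Delta p_t = p_t \left( |\nabla V_t|^2 - \Delta V_t \right),
% yields
\partial_t V_t + f \cdot \nabla V_t + \tfrac{\sigma^2}{2}\,|\nabla V_t|^2
  - \tfrac{\sigma^2}{2}\,\Delta V_t - \nabla \cdot f = 0 ,
```

a Hamilton-Jacobi-Bellman equation with a viscous term, governing the evolution of the negative log-densities of the underlying SDE marginals.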
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the information it contains and is not responsible for any consequences of its use.