Example-Based Sampling with Diffusion Models
- URL: http://arxiv.org/abs/2302.05116v1
- Date: Fri, 10 Feb 2023 08:35:17 GMT
- Title: Example-Based Sampling with Diffusion Models
- Authors: Bastien Doignies and Nicolas Bonneel and David Coeurjolly and Julie Digne and Loïs Paulin and Jean-Claude Iehl and Victor Ostromoukhov
- Abstract summary: The success of diffusion models for image generation suggests they could be appropriate for learning how to generate point sets from examples.
We propose a generic way to produce 2-d point sets imitating existing samplers from observed point sets using a diffusion model.
We demonstrate how the differentiability of our approach can be used to optimize point sets to enforce properties.
- Score: 7.943023838493658
- License: http://creativecommons.org/licenses/by-sa/4.0/
- Abstract: Much effort has been put into developing samplers with specific properties,
such as producing blue noise, low-discrepancy, lattice or Poisson disk samples.
These samplers can be slow if they rely on optimization processes, may depend on
a wide range of numerical methods, and are not always differentiable. The success
of recent diffusion models for image generation suggests that these models
could be appropriate for learning how to generate point sets from examples.
However, their convolutional nature makes these methods impractical for dealing
with scattered data such as point sets. We propose a generic way to produce 2-d
point sets imitating existing samplers from observed point sets using a
diffusion model. We address the problem of convolutional layers by leveraging
neighborhood information from an optimal transport matching to a uniform grid,
that allows us to benefit from fast convolutions on grids, and to support the
example-based learning of non-uniform sampling patterns. We demonstrate how the
differentiability of our approach can be used to optimize point sets to enforce
properties.
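The core trick described in the abstract, matching a scattered point set to a uniform grid so that ordinary grid convolutions apply, can be illustrated with a minimal sketch. This is not the authors' code: the function name `points_to_grid` is hypothetical, and SciPy's `linear_sum_assignment` stands in for the paper's optimal transport matching (both compute a minimum-cost bijection between points and grid cells).

```python
# Hypothetical sketch: assign n points in [0,1]^2 to the cells of a
# sqrt(n) x sqrt(n) uniform grid by minimizing total squared distance
# (a discrete optimal-transport matching), yielding an image-like tensor
# that a convolutional network could consume.
import numpy as np
from scipy.optimize import linear_sum_assignment

def points_to_grid(points: np.ndarray) -> np.ndarray:
    n = len(points)
    g = int(round(np.sqrt(n)))
    assert g * g == n, "sketch assumes a square number of points"
    # Centers of the uniform grid cells, shape (n, 2).
    centers = (np.stack(np.meshgrid(np.arange(g), np.arange(g)), -1)
                 .reshape(-1, 2) + 0.5) / g
    # Squared-distance cost between every point and every cell center.
    cost = ((points[:, None, :] - centers[None, :, :]) ** 2).sum(-1)
    rows, cols = linear_sum_assignment(cost)  # optimal bijection
    # Store each point's coordinates in its assigned cell.
    grid = np.zeros((g, g, 2))
    grid[cols // g, cols % g] = points[rows]
    return grid

rng = np.random.default_rng(0)
pts = rng.random((16, 2))
grid = points_to_grid(pts)  # (4, 4, 2) image-like tensor of point coordinates
```

Because the assignment is a bijection, no point is lost or duplicated; nearby points land in nearby cells, which is what lets grid convolutions exploit neighborhood information from the original scattered set.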
Related papers
- Stable generative modeling using Schrödinger bridges [0.22499166814992438]
We propose a generative model combining Schrödinger bridges and Langevin dynamics.
Our framework can be naturally extended to generate conditional samples and to Bayesian inference problems.
arXiv Detail & Related papers (2024-01-09T06:15:45Z)
- Touring sampling with pushforward maps [3.5897534810405403]
This paper takes a theoretical stance to review and organize many sampling approaches in the generative modeling setting.
It might prove useful to overcome some of the current challenges in sampling with diffusion models.
arXiv Detail & Related papers (2023-11-23T08:23:43Z)
- A Geometric Perspective on Diffusion Models [57.27857591493788]
We inspect the ODE-based sampling of a popular variance-exploding SDE.
We establish a theoretical relationship between the optimal ODE-based sampling and the classic mean-shift (mode-seeking) algorithm.
arXiv Detail & Related papers (2023-05-31T15:33:16Z)
- Reflected Diffusion Models [93.26107023470979]
We present Reflected Diffusion Models, which reverse a reflected differential equation evolving on the support of the data.
Our approach learns the score function through a generalized score matching loss and extends key components of standard diffusion models.
arXiv Detail & Related papers (2023-04-10T17:54:38Z)
- Bi-Noising Diffusion: Towards Conditional Diffusion Models with Generative Restoration Priors [64.24948495708337]
We introduce a new method that brings predicted samples to the training data manifold using a pretrained unconditional diffusion model.
We perform comprehensive experiments to demonstrate the effectiveness of our approach on super-resolution, colorization, turbulence removal, and image-deraining tasks.
arXiv Detail & Related papers (2022-12-14T17:26:35Z)
- Score-based Continuous-time Discrete Diffusion Models [102.65769839899315]
We extend diffusion models to discrete variables by introducing a Markov jump process where the reverse process denoises via a continuous-time Markov chain.
We show that an unbiased estimator can be obtained by simply matching the conditional marginal distributions.
We demonstrate the effectiveness of the proposed method on a set of synthetic and real-world music and image benchmarks.
arXiv Detail & Related papers (2022-11-30T05:33:29Z)
- Sampling from Arbitrary Functions via PSD Models [55.41644538483948]
We take a two-step approach by first modeling the probability distribution and then sampling from that model.
We show that these models can approximate a large class of densities concisely using few evaluations, and present a simple algorithm to effectively sample from these models.
arXiv Detail & Related papers (2021-10-20T12:25:22Z)
- Parallelised Diffeomorphic Sampling-based Motion Planning [30.310891362316863]
We propose Parallelised Diffeomorphic Sampling-based Motion Planning (PDMP).
PDMP transforms sampling distributions of sampling-based motion planners, in a manner akin to normalising flows.
PDMP can leverage gradient information of costs to inject specifications, in a manner similar to optimisation-based motion planning methods.
arXiv Detail & Related papers (2021-08-26T13:15:11Z)
- Oops I Took A Gradient: Scalable Sampling for Discrete Distributions [53.3142984019796]
We show that this approach outperforms generic samplers in a number of difficult settings.
We also demonstrate the use of our improved sampler for training deep energy-based models on high dimensional discrete data.
arXiv Detail & Related papers (2021-02-08T20:08:50Z)
- The Bures Metric for Generative Adversarial Networks [10.69910379275607]
Generative Adversarial Networks (GANs) are performant generative methods yielding high-quality samples.
We propose to match the real batch diversity to the fake batch diversity.
We observe that diversity matching reduces mode collapse substantially and has a positive effect on the sample quality.
arXiv Detail & Related papers (2020-06-16T12:04:41Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.