Example-Based Sampling with Diffusion Models
- URL: http://arxiv.org/abs/2302.05116v1
- Date: Fri, 10 Feb 2023 08:35:17 GMT
- Title: Example-Based Sampling with Diffusion Models
- Authors: Bastien Doignies, Nicolas Bonneel, David Coeurjolly, Julie Digne, Loïs Paulin, Jean-Claude Iehl and Victor Ostromoukhov
- Abstract summary: The success of diffusion models for image generation suggests that they could be appropriate for learning how to generate point sets from examples.
We propose a generic way to produce 2-d point sets imitating existing samplers from observed point sets using a diffusion model.
We demonstrate how the differentiability of our approach can be used to optimize point sets to enforce properties.
- Score: 7.943023838493658
- License: http://creativecommons.org/licenses/by-sa/4.0/
- Abstract: Much effort has been put into developing samplers with specific properties,
such as producing blue noise, low-discrepancy, lattice or Poisson disk samples.
These samplers can be slow if they rely on optimization processes, may rely on
a wide range of numerical methods, are not always differentiable. The success
of recent diffusion models for image generation suggests that these models
could be appropriate for learning how to generate point sets from examples.
However, their convolutional nature makes these methods impractical for dealing
with scattered data such as point sets. We propose a generic way to produce 2-d
point sets imitating existing samplers from observed point sets using a
diffusion model. We address the problem of convolutional layers by leveraging
neighborhood information from an optimal transport matching to a uniform grid,
which allows us to benefit from fast convolutions on grids and to support the
example-based learning of non-uniform sampling patterns. We demonstrate how the
differentiability of our approach can be used to optimize point sets to enforce
properties.
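As a concrete illustration of the grid-matching idea, here is a minimal sketch that assigns a scattered 2-d point set to a uniform grid by solving an assignment problem on squared distances, after which ordinary grid convolutions apply. The function name match_to_grid and the use of scipy's linear_sum_assignment as the transport solver are our assumptions; the diffusion network and noise schedule are omitted.

    # Sketch: match N = n*n scattered 2-d points to an n x n uniform grid
    # via an optimal-transport (assignment) problem on squared distances.
    import numpy as np
    from scipy.optimize import linear_sum_assignment

    def match_to_grid(points, n):
        gx, gy = np.meshgrid((np.arange(n) + 0.5) / n, (np.arange(n) + 0.5) / n)
        grid = np.stack([gx.ravel(), gy.ravel()], axis=1)        # (n*n, 2) cell centers
        cost = ((points[:, None, :] - grid[None, :, :]) ** 2).sum(-1)
        rows, cols = linear_sum_assignment(cost)                 # OT matching
        image = np.zeros((n, n, 2))
        image[cols // n, cols % n] = points[rows]                # store coordinates per cell
        return image                                             # grid "image" of the point set

    rng = np.random.default_rng(0)
    pts = rng.random((16 * 16, 2))         # toy scattered point set in [0,1]^2
    img = match_to_grid(pts, 16)           # now convolvable on a 16 x 16 grid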
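The differentiability claim can likewise be pictured as plain gradient descent on a point-set loss. The pairwise repulsion below (assuming PyTorch) is an illustrative stand-in for whatever differentiable property one wants to enforce, not the criterion used in the paper.

    # Sketch: optimize point positions by descending a differentiable loss.
    import torch

    n = 256
    pts = torch.rand(n, 2, requires_grad=True)       # point set being optimized
    opt = torch.optim.Adam([pts], lr=1e-2)
    off_diag = ~torch.eye(n, dtype=torch.bool)       # exclude self-pairs

    for _ in range(200):
        diff = pts[:, None, :] - pts[None, :, :]
        d2 = (diff ** 2).sum(-1)                     # squared pairwise distances
        loss = (1.0 / (d2[off_diag] + 1e-6)).mean()  # repulsion: spread points apart
        opt.zero_grad()
        loss.backward()
        opt.step()
        with torch.no_grad():
            pts.clamp_(0.0, 1.0)                     # keep samples in the unit square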
Related papers
- Accelerated Diffusion Models via Speculative Sampling [89.43940130493233]
Speculative sampling is a popular technique for accelerating inference in Large Language Models.
We extend speculative sampling to diffusion models, which generate samples via continuous, vector-valued Markov chains.
We propose various drafting strategies, including a simple and effective approach that does not require training a draft model.
arXiv Detail & Related papers (2025-01-09T16:50:16Z)
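For intuition, the sketch below shows the classic discrete accept/reject step of speculative sampling; the paper adapts this idea to continuous, vector-valued diffusion chains, which the toy distributions p and q here do not capture.

    # Sketch: accept a draft sample x ~ q as if drawn from target p,
    # else resample from the normalized residual max(p - q, 0).
    import numpy as np

    def speculative_accept(x, p, q, rng):
        if rng.random() < min(1.0, p[x] / q[x]):
            return x                              # draft accepted
        residual = np.maximum(p - q, 0.0)
        return rng.choice(len(p), p=residual / residual.sum())

    rng = np.random.default_rng(0)
    p = np.array([0.5, 0.3, 0.2])                 # target distribution
    q = np.array([0.2, 0.4, 0.4])                 # cheap draft distribution
    x = speculative_accept(rng.choice(3, p=q), p, q, rng)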
- Diffusing Differentiable Representations [60.72992910766525]
We introduce a novel, training-free method for sampling differentiable representations (diffreps) using pretrained diffusion models.
We identify an implicit constraint on the samples induced by the diffrep and demonstrate that addressing this constraint significantly improves the consistency and detail of the generated objects.
arXiv Detail & Related papers (2024-12-09T20:42:58Z)
- Stable generative modeling using Schrödinger bridges [0.22499166814992438]
We propose a generative model combining Schrödinger bridges and Langevin dynamics.
Our framework can be naturally extended to generate conditional samples and to Bayesian inference problems.
arXiv Detail & Related papers (2024-01-09T06:15:45Z)
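A minimal sketch of the Langevin ingredient, assuming a score function (the gradient of the log-density) is available; the Schrödinger-bridge construction and any learned components are omitted.

    # Sketch: unadjusted Langevin dynamics, x <- x + eps * score(x) + sqrt(2 eps) * noise.
    import numpy as np

    def langevin(score, x0, step=1e-2, n_steps=1000, rng=None):
        rng = rng or np.random.default_rng(0)
        x = np.array(x0, dtype=float)
        for _ in range(n_steps):
            x = x + step * score(x) + np.sqrt(2 * step) * rng.standard_normal(x.shape)
        return x

    sample = langevin(lambda x: -x, x0=np.zeros(2))  # a standard Gaussian has score -x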
- Touring sampling with pushforward maps [3.5897534810405403]
This paper takes a theoretical stance to review and organize many sampling approaches in the generative modeling setting.
It might prove useful to overcome some of the current challenges in sampling with diffusion models.
arXiv Detail & Related papers (2023-11-23T08:23:43Z)
- A Geometric Perspective on Diffusion Models [57.27857591493788]
We inspect the ODE-based sampling of a popular variance-exploding SDE.
We establish a theoretical relationship between the optimal ODE-based sampling and the classic mean-shift (mode-seeking) algorithm.
arXiv Detail & Related papers (2023-05-31T15:33:16Z)
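The mean-shift side of this correspondence is easy to sketch; the Gaussian kernel, bandwidth, and toy data below are our choices for illustration.

    # Sketch: classic mean-shift, iterating toward a mode of a kernel density.
    import numpy as np

    def mean_shift(x, data, bandwidth=0.5, n_iter=50):
        for _ in range(n_iter):
            w = np.exp(-((data - x) ** 2).sum(axis=1) / (2 * bandwidth ** 2))
            x = (w[:, None] * data).sum(axis=0) / w.sum()   # kernel-weighted mean
        return x

    rng = np.random.default_rng(0)
    data = rng.standard_normal((500, 2))                    # toy point cloud
    mode = mean_shift(np.array([2.0, 2.0]), data)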
- Reflected Diffusion Models [93.26107023470979]
We present Reflected Diffusion Models, which reverse a reflected stochastic differential equation evolving on the support of the data.
Our approach learns the score function through a generalized score matching loss and extends key components of standard diffusion models.
arXiv Detail & Related papers (2023-04-10T17:54:38Z)
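A rough sketch of one reflected update on the unit box, using a generic fold-back reflection rather than the paper's exact discretization or learned score.

    # Sketch: take an unconstrained step, then fold escaped coordinates back into [0, 1].
    import numpy as np

    def reflect(x):
        x = np.abs(x)                  # reflect at 0
        return 1.0 - np.abs(1.0 - x)   # reflect at 1

    def reflected_step(x, score, step, rng):
        noise = rng.standard_normal(x.shape)
        return reflect(x + step * score(x) + np.sqrt(2 * step) * noise)

    rng = np.random.default_rng(0)
    x = reflected_step(np.full(2, 0.5), lambda x: np.zeros_like(x), 1e-2, rng)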
- Sampling from Arbitrary Functions via PSD Models [55.41644538483948]
We take a two-step approach by first modeling the probability distribution and then sampling from that model.
We show that these models can approximate a large class of densities concisely using few evaluations, and present a simple algorithm to effectively sample from these models.
arXiv Detail & Related papers (2021-10-20T12:25:22Z)
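The two-step recipe can be sketched with a Gaussian KDE standing in for the PSD models the paper actually studies.

    # Sketch: step 1, fit a density model; step 2, sample from the fitted model.
    import numpy as np
    from scipy.stats import gaussian_kde

    rng = np.random.default_rng(0)
    data = np.concatenate([rng.normal(-2, 0.5, 500), rng.normal(2, 0.5, 500)])
    kde = gaussian_kde(data)                   # step 1: model the density
    new_samples = kde.resample(1000, seed=0)   # step 2: sample from the model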
- Parallelised Diffeomorphic Sampling-based Motion Planning [30.310891362316863]
We propose Parallelised Diffeomorphic Sampling-based Motion Planning (PDMP).
PDMP transforms sampling distributions of sampling-based motion planners, in a manner akin to normalising flows.
PDMP is able to leverage gradient information of costs to inject specifications, in a manner similar to optimisation-based motion planning methods.
arXiv Detail & Related papers (2021-08-26T13:15:11Z)
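The pushforward mechanism can be sketched as warping base samples through a smooth, invertible map; the radial obstacle-avoiding map below is made up for illustration and is not PDMP's actual transform.

    # Sketch: push base samples away from a hypothetical obstacle center.
    import numpy as np

    def pushforward(z, center, strength=0.5, radius=1.0):
        d = z - center
        r = np.linalg.norm(d, axis=1, keepdims=True) + 1e-9
        return z + strength * np.exp(-r / radius) * d / r   # invertible for small strength

    rng = np.random.default_rng(0)
    z = rng.standard_normal((1000, 2))                      # base sampler output
    warped = pushforward(z, center=np.array([0.0, 0.0]))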
- Oops I Took A Gradient: Scalable Sampling for Discrete Distributions [53.3142984019796]
We propose a sampler that uses gradients of the likelihood with respect to discrete inputs to guide proposals, and show that it outperforms generic samplers in a number of difficult settings.
We also demonstrate the use of our improved sampler for training deep energy-based models on high dimensional discrete data.
arXiv Detail & Related papers (2021-02-08T20:08:50Z)
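A sketch in the spirit of this gradient-informed sampler for binary vectors: bias flip proposals by the first-order estimate of the energy change per bit (the Metropolis-Hastings correction is omitted for brevity, and the toy energy is ours).

    # Sketch: d_i = -(2 x_i - 1) * grad_i f(x) estimates the gain of flipping bit i;
    # propose flips with probability softmax(d / 2).
    import numpy as np

    def propose_flip(x, grad_f, rng):
        d = -(2 * x - 1) * grad_f(x)
        p = np.exp(d / 2)
        i = rng.choice(len(x), p=p / p.sum())   # favor promising flips
        y = x.copy()
        y[i] = 1 - y[i]
        return y

    rng = np.random.default_rng(0)
    w = rng.standard_normal(16)                 # toy energy f(x) = w . x, grad = w
    x = rng.integers(0, 2, 16).astype(float)
    x = propose_flip(x, lambda x: w, rng)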