Touring sampling with pushforward maps
- URL: http://arxiv.org/abs/2311.13845v2
- Date: Tue, 20 Feb 2024 18:17:40 GMT
- Title: Touring sampling with pushforward maps
- Authors: Vivien Cabannes, Charles Arnal
- Abstract summary: This paper takes a theoretical stance to review and organize many sampling approaches in the generative modeling setting.
By revealing links between existing methods, it may help overcome some of the current challenges in sampling with diffusion models.
- Score: 3.5897534810405403
- License: http://creativecommons.org/licenses/by-sa/4.0/
- Abstract: The number of sampling methods can be daunting for a practitioner looking
to apply powerful machine learning methods to their specific problem. This paper
takes a theoretical stance to review and organize many sampling approaches in
the "generative modeling" setting, where one wants to generate new data that
are similar to some training examples. By revealing links between existing
methods, this review may help overcome some of the current challenges in
sampling with diffusion models, such as long inference times due to diffusion
simulation, or the lack of diversity in generated samples.
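The "pushforward" framing is simple to state in code: a sampler is a map applied to draws from an easy base distribution, and the generated samples follow the pushforward law. A minimal sketch (illustrative only, not taken from the paper) using an affine map, for which the pushforward of a standard Gaussian is known in closed form:

```python
import numpy as np

rng = np.random.default_rng(0)

# Pushforward sampling: draw z from an easy base distribution,
# then apply a map T; the outputs follow the law of T(z).
A = np.array([[2.0, 0.0], [1.0, 0.5]])   # illustrative affine map
b = np.array([1.0, -1.0])

def T(z):
    # Affine pushforward: N(0, I) maps to N(b, A A^T).
    return z @ A.T + b

z = rng.standard_normal((10_000, 2))     # base samples z ~ N(0, I)
x = T(z)                                  # pushforward samples

print("empirical mean:", x.mean(axis=0))  # close to b
print("empirical cov:\n", np.cov(x.T))    # close to A @ A.T
```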
Related papers
- Self-Guided Generation of Minority Samples Using Diffusion Models [57.319845580050924]
We present a novel approach for generating minority samples that live on low-density regions of a data manifold.
Our framework is built upon diffusion models, leveraging the principle of guided sampling.
Experiments on benchmark real datasets demonstrate that our approach can greatly improve the capability of creating realistic low-likelihood minority instances.
arXiv Detail & Related papers (2024-07-16T10:03:29Z)
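Guided sampling, which the entry above builds on, modifies the score used at each sampling step with an extra guidance term. A minimal sketch of the generic idea on a 1-d toy, where guidance pushes samples away from the density peak toward low-density ("minority") regions; the guidance function and scale below are illustrative, not the paper's self-guidance mechanism:

```python
import numpy as np

rng = np.random.default_rng(1)

def data_score(x):
    return -x                      # score of a N(0, 1) "data" distribution

def guidance(x):
    return x / (1.0 + x**2)        # illustrative repulsion from the mode

scale, step, n_steps = 4.0, 0.01, 2000
x = rng.standard_normal(5000)
for _ in range(n_steps):
    s = data_score(x) + scale * guidance(x)   # guided score
    x = x + step * s + np.sqrt(2 * step) * rng.standard_normal(x.shape)

# For plain N(0, 1), E|x| is about 0.80; guidance shifts mass to the tails.
print("mean |x| with guidance:", np.abs(x).mean())
```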
- Provable Statistical Rates for Consistency Diffusion Models [87.28777947976573]
Despite the state-of-the-art performance, diffusion models are known for their slow sample generation due to the extensive number of steps involved.
This paper contributes towards the first statistical theory for consistency models, formulating their training as a distribution discrepancy minimization problem.
arXiv Detail & Related papers (2024-06-23T20:34:18Z)
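A consistency model replaces the long reverse simulation with a single map from noise to data along the probability-flow ODE. For a 1-d Gaussian toy under a variance-exploding process this ideal map is available in closed form, which makes the one-step sampling pattern easy to sketch (a trained network would stand in for `consistency_fn`):

```python
import numpy as np

rng = np.random.default_rng(2)

# For data x0 ~ N(0, 1) under the forward process x_t = x0 + sigma_t * eps,
# the probability-flow ODE trajectory through x_t originates at
# x_t / sqrt(1 + sigma_t**2); this is the ideal consistency function.
def consistency_fn(x_t, sigma_t):
    return x_t / np.sqrt(1.0 + sigma_t**2)

sigma_max = 80.0
x_T = np.sqrt(1.0 + sigma_max**2) * rng.standard_normal(10_000)
x_0 = consistency_fn(x_T, sigma_max)     # a single "network" call

print("std of one-step samples:", x_0.std())   # close to 1.0
```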
- GUIDE: Guidance-based Incremental Learning with Diffusion Models [3.046689922445082]
We introduce GUIDE, a novel continual learning approach that directs diffusion models to rehearse samples at risk of being forgotten.
Our experimental results show that GUIDE significantly reduces catastrophic forgetting, outperforming conventional random sampling approaches and surpassing recent state-of-the-art methods in continual learning with generative replay.
arXiv Detail & Related papers (2024-03-06T18:47:32Z)
- Improved off-policy training of diffusion samplers [93.66433483772055]
We study the problem of training diffusion models to sample from a distribution with an unnormalized density or energy function.
We benchmark several diffusion-structured inference methods, including simulation-based variational approaches and off-policy methods.
Our results shed light on the relative advantages of existing algorithms while bringing into question some claims from past work.
arXiv Detail & Related papers (2024-02-07T18:51:49Z) - Fast Sampling via Discrete Non-Markov Diffusion Models [49.598085130313514]
- Fast Sampling via Discrete Non-Markov Diffusion Models [49.598085130313514]
We propose a discrete non-Markov diffusion model, which admits an accelerated reverse sampling for discrete data generation.
Our method significantly reduces the number of function evaluations (i.e., calls to the neural network), making the sampling process much faster.
arXiv Detail & Related papers (2023-12-14T18:14:11Z) - A Geometric Perspective on Diffusion Models [57.27857591493788]
- A Geometric Perspective on Diffusion Models [57.27857591493788]
We inspect the ODE-based sampling of a popular variance-exploding SDE.
We establish a theoretical relationship between the optimal ODE-based sampling and the classic mean-shift (mode-seeking) algorithm.
arXiv Detail & Related papers (2023-05-31T15:33:16Z) - Efficient Multimodal Sampling via Tempered Distribution Flow [11.36635610546803]
- Efficient Multimodal Sampling via Tempered Distribution Flow [11.36635610546803]
We develop a new type of transport-based sampling method called TemperFlow.
Various experiments demonstrate the superior performance of this novel sampler compared to traditional methods.
We show its applications in modern deep learning tasks such as image generation.
arXiv Detail & Related papers (2023-04-08T06:40:06Z) - Reduce, Reuse, Recycle: Compositional Generation with Energy-Based Diffusion Models and MCMC [102.64648158034568]
- Reduce, Reuse, Recycle: Compositional Generation with Energy-Based Diffusion Models and MCMC [102.64648158034568]
Diffusion models have quickly become the prevailing approach to generative modeling in many domains.
We propose an energy-based parameterization of diffusion models which enables the use of new compositional operators.
We find these samplers lead to notable improvements in compositional generation across a wide set of problems.
arXiv Detail & Related papers (2023-02-22T18:48:46Z) - Example-Based Sampling with Diffusion Models [7.943023838493658]
- Example-Based Sampling with Diffusion Models [7.943023838493658]
Diffusion models for image generation could be appropriate for learning how to generate point sets from examples.
We propose a generic way to produce 2-d point sets imitating existing samplers from observed point sets using a diffusion model.
We demonstrate how the differentiability of our approach can be used to optimize point sets to enforce properties.
arXiv Detail & Related papers (2023-02-10T08:35:17Z) - Autoregressive Denoising Diffusion Models for Multivariate Probabilistic
- Autoregressive Denoising Diffusion Models for Multivariate Probabilistic Time Series Forecasting [4.1573460459258245]
We use diffusion probabilistic models, a class of latent variable models closely connected to score matching and energy-based methods.
Our model learns gradients by optimizing a variational bound on the data likelihood and at inference time converts white noise into a sample of the distribution of interest.
arXiv Detail & Related papers (2021-01-28T15:46:10Z)
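The inference procedure described above, converting white noise into a sample by iterative denoising, is DDPM-style ancestral sampling. A minimal sketch in which an analytic stub replaces the paper's learned, context-conditioned noise predictor:

```python
import numpy as np

rng = np.random.default_rng(9)

T = 200
beta = np.linspace(1e-4, 0.02, T)
alpha = 1.0 - beta
abar = np.cumprod(alpha)

def eps_model(x_t, t, context=None):
    # Placeholder for a trained network; the paper's model would also
    # condition on the past time-series window. For a N(0, 1) toy target
    # the optimal prediction is E[eps | x_t] = sqrt(1 - abar_t) * x_t.
    return np.sqrt(1.0 - abar[t]) * x_t

x = rng.standard_normal(10_000)                  # start from white noise x_T
for t in range(T - 1, -1, -1):
    e = eps_model(x, t)
    mean = (x - beta[t] / np.sqrt(1.0 - abar[t]) * e) / np.sqrt(alpha[t])
    noise = rng.standard_normal(x.shape) if t > 0 else 0.0
    x = mean + np.sqrt(beta[t]) * noise          # ancestral denoising step

print("sample std:", x.std())                    # close to 1.0
```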
This list is automatically generated from the titles and abstracts of the papers on this site.
The site does not guarantee the accuracy of the listed information and is not responsible for any consequences of its use.