Energy based diffusion generator for efficient sampling of Boltzmann
distributions
- URL: http://arxiv.org/abs/2401.02080v1
- Date: Thu, 4 Jan 2024 06:03:46 GMT
- Title: Energy based diffusion generator for efficient sampling of Boltzmann
distributions
- Authors: Yan Wang, Ling Guo, Hao Wu, Tao Zhou
- Abstract summary: We introduce a novel sampler called the energy based diffusion generator for generating samples from arbitrary target distributions.
The sampling model employs a structure similar to a variational autoencoder, utilizing a decoder to transform latent variables into random variables approximating the target distribution.
- Score: 12.951437957863275
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: We introduce a novel sampler called the energy based diffusion generator for
generating samples from arbitrary target distributions. The sampling model
employs a structure similar to a variational autoencoder, utilizing a decoder
to transform latent variables from a simple distribution into random variables
approximating the target distribution, and we design an encoder based on the
diffusion model. Leveraging the powerful modeling capacity of the diffusion
model for complex distributions, we can obtain an accurate variational estimate
of the Kullback-Leibler divergence between the distributions of the generated
samples and the target. Moreover, we propose a decoder based on generalized
Hamiltonian dynamics to further enhance sampling performance. Through empirical
evaluation, we demonstrate the effectiveness of our method across various
complex distribution functions, showcasing its superiority compared to existing
methods.
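The core idea of the abstract — train a decoder that pushes latents from a simple prior toward a Boltzmann target by minimizing a variational estimate of the reverse KL divergence — can be sketched in a toy setting. Note the assumptions: the paper pairs a neural decoder with a diffusion-model encoder, whereas this sketch uses a one-dimensional affine decoder whose entropy is tractable in closed form, so no encoder is needed; the target energy and all gradients are likewise hypothetical simplifications for illustration only.

```python
import numpy as np

rng = np.random.default_rng(0)

def energy(x):
    # Target Boltzmann density p(x) ∝ exp(-U(x)) with U(x) = (x - 2)^2 / 2,
    # i.e. a unit-variance Gaussian centred at 2 (a stand-in target).
    return 0.5 * (x - 2.0) ** 2

mu, log_sigma = 0.0, 0.0  # parameters of the affine decoder x = mu + sigma * z
lr = 0.05

for step in range(2000):
    z = rng.standard_normal(512)      # latents from a simple prior
    sigma = np.exp(log_sigma)
    x = mu + sigma * z                # decoder: push latents to sample space
    # Reverse KL(q || p) up to an additive constant:
    #   E_q[U(x)] - H[q],  where H[q] = log(sigma) + const for the affine map.
    # Gradients follow from the reparameterisation x = mu + sigma * z:
    dU = x - 2.0                      # dU/dx
    grad_mu = np.mean(dU)                             # d KL / d mu
    grad_log_sigma = np.mean(dU * sigma * z) - 1.0    # d KL / d log_sigma
    mu -= lr * grad_mu
    log_sigma -= lr * grad_log_sigma

print(mu, np.exp(log_sigma))  # both should approach the target's mean 2 and std 1
```

Because the affine decoder's entropy is known analytically, the KL estimate here is exact up to Monte Carlo noise; the paper's contribution is precisely to obtain such an estimate for expressive neural decoders, where the entropy term is intractable, by using a diffusion-model encoder.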
Related papers
- New algorithms for sampling and diffusion models [0.0]
We introduce a novel sampling method for known distributions and a new algorithm for diffusion generative models with unknown distributions.
Our approach is inspired by the concept of the reverse diffusion process, widely adopted in diffusion generative models.
arXiv Detail & Related papers (2024-06-14T02:30:04Z)
- Weak Generative Sampler to Efficiently Sample Invariant Distribution of Stochastic Differential Equation [8.67581853745823]
Current deep learning-based methods solve the stationary Fokker--Planck equation to determine the invariant probability density function in the form of deep neural networks.
We introduce a framework that employs a weak generative sampler (WGS) to directly generate independent and identically distributed (iid) samples.
Our proposed loss function is based on the weak form of the Fokker--Planck equation, integrating normalizing flows to characterize the invariant distribution.
arXiv Detail & Related papers (2024-05-29T16:41:42Z)
- On the Trajectory Regularity of ODE-based Diffusion Sampling [79.17334230868693]
Diffusion-based generative models use differential equations to establish a smooth connection between a complex data distribution and a tractable prior distribution.
In this paper, we identify several intriguing trajectory properties in the ODE-based sampling process of diffusion models.
arXiv Detail & Related papers (2024-05-18T15:59:41Z)
- Convergence Analysis of Flow Matching in Latent Space with Transformers [7.069772598731282]
We present theoretical convergence guarantees for ODE-based generative models, specifically flow matching.
We use a pre-trained autoencoder network to map high-dimensional original inputs to a low-dimensional latent space, where a transformer network is trained to predict the velocity field of the transformation from a standard normal distribution to the target latent distribution.
arXiv Detail & Related papers (2024-04-03T07:50:53Z)
- Theoretical Insights for Diffusion Guidance: A Case Study for Gaussian Mixture Models [59.331993845831946]
Diffusion models benefit from instilling task-specific information into the score function to steer sample generation towards desired properties.
This paper provides the first theoretical study towards understanding the influence of guidance on diffusion models in the context of Gaussian mixture models.
arXiv Detail & Related papers (2024-03-03T23:15:48Z)
- Iterated Denoising Energy Matching for Sampling from Boltzmann Densities [109.23137009609519]
Iterated Denoising Energy Matching (iDEM) alternates between (I) sampling regions of high model density with a diffusion-based sampler and (II) using these samples in a matching objective.
We show that the proposed approach achieves state-of-the-art performance on all metrics and trains 2-5× faster.
arXiv Detail & Related papers (2024-02-09T01:11:23Z)
- Distributed Markov Chain Monte Carlo Sampling based on the Alternating Direction Method of Multipliers [143.6249073384419]
In this paper, we propose a distributed sampling scheme based on the alternating direction method of multipliers.
We provide both theoretical guarantees of our algorithm's convergence and experimental evidence of its superiority to the state-of-the-art.
In simulation, we deploy our algorithm on linear and logistic regression tasks and illustrate its fast convergence compared to existing gradient-based methods.
arXiv Detail & Related papers (2024-01-29T02:08:40Z)
- Generative Diffusion From An Action Principle [0.0]
We show that score matching can be derived from an action principle, like the ones commonly used in physics.
We use this insight to demonstrate the connection between different classes of diffusion models.
arXiv Detail & Related papers (2023-10-06T18:00:00Z)
- Unsupervised Learning of Sampling Distributions for Particle Filters [80.6716888175925]
We put forward four methods for learning sampling distributions from observed measurements.
Experiments demonstrate that learned sampling distributions exhibit better performance than designed, minimum-degeneracy sampling distributions.
arXiv Detail & Related papers (2023-02-02T15:50:21Z)
- An Energy-Based Prior for Generative Saliency [62.79775297611203]
We propose a novel generative saliency prediction framework that adopts an informative energy-based model as a prior distribution.
With the generative saliency model, we can obtain a pixel-wise uncertainty map from an image, indicating model confidence in the saliency prediction.
Experimental results show that our generative saliency model with an energy-based prior can achieve not only accurate saliency predictions but also reliable uncertainty maps consistent with human perception.
arXiv Detail & Related papers (2022-04-19T10:51:00Z)
- Relative Entropy Gradient Sampler for Unnormalized Distributions [14.060615420986796]
The relative entropy gradient sampler (REGS) is a particle method for sampling from unnormalized distributions.
It seeks a sequence of simple nonlinear transforms that iteratively push initial samples from a reference distribution into samples from an unnormalized target distribution.
arXiv Detail & Related papers (2021-10-06T14:10:38Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the generated summaries (including all information) and is not responsible for any consequences of their use.