Relative Entropy Gradient Sampler for Unnormalized Distributions
- URL: http://arxiv.org/abs/2110.02787v1
- Date: Wed, 6 Oct 2021 14:10:38 GMT
- Title: Relative Entropy Gradient Sampler for Unnormalized Distributions
- Authors: Xingdong Feng, Yuan Gao, Jian Huang, Yuling Jiao, Xu Liu
- Abstract summary: Relative entropy gradient sampler (REGS) for sampling from unnormalized distributions.
REGS is a particle method that seeks a sequence of simple nonlinear transforms that iteratively push samples from a reference distribution toward samples from an unnormalized target distribution.
- Score: 14.060615420986796
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: We propose a relative entropy gradient sampler (REGS) for sampling from
unnormalized distributions. REGS is a particle method that seeks a sequence of
simple nonlinear transforms that iteratively push samples from a reference
distribution toward samples from an unnormalized target distribution. To
determine the nonlinear transform at each iteration, we
consider the Wasserstein gradient flow of relative entropy. This gradient flow
determines a path of probability distributions that interpolates the reference
distribution and the target distribution. The flow is characterized by an ODE
system whose velocity fields depend on the ratio between the density of the
evolving particles and the unnormalized target density. To sample with REGS, we
need to estimate this density ratio and simulate the ODE system through
particle evolution. We propose a novel nonparametric approach to estimating the
logarithmic density ratio using neural networks. Extensive simulation studies
on challenging multimodal 1D and 2D mixture distributions and Bayesian logistic
regression on real datasets demonstrate that REGS outperforms the
state-of-the-art sampling methods included in the comparison.
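To make the sampler's mechanics concrete, here is a minimal 1D sketch under stated assumptions: the velocity field of the KL Wasserstein gradient flow, v(x) = grad log pi(x) - grad log rho_t(x), is discretized with forward Euler, and a Gaussian kernel density estimate of the particle score stands in for the paper's neural log-density-ratio estimator. The two-component mixture target, bandwidth, step size, and particle count are illustrative choices, not values from the paper.

```python
import numpy as np

def log_target(x):
    # Hypothetical unnormalized target: equal-weight 1D Gaussian mixture at -2 and +2.
    return np.logaddexp(-0.5 * (x + 2.0) ** 2, -0.5 * (x - 2.0) ** 2)

def grad_log_target(x, eps=1e-4):
    # Central finite difference; autodiff would replace this in practice.
    return (log_target(x + eps) - log_target(x - eps)) / (2.0 * eps)

def kde_score(particles, x, bandwidth=0.3):
    # Gradient of the log of a Gaussian kernel density estimate of the particles,
    # standing in for the paper's neural estimate of grad log rho_t.
    diffs = x[:, None] - particles[None, :]          # pairwise differences, shape (n, n)
    k = np.exp(-0.5 * (diffs / bandwidth) ** 2)      # Gaussian kernel values
    dk = -(diffs / bandwidth ** 2) * k               # d/dx of each kernel value
    return dk.sum(axis=1) / k.sum(axis=1)

def regs_like_sampler(n=500, steps=200, eta=0.05, seed=0):
    rng = np.random.default_rng(seed)
    x = rng.normal(0.0, 1.0, size=n)                 # reference distribution: N(0, 1)
    for _ in range(steps):
        # Velocity field of the Wasserstein gradient flow of KL(rho_t || pi):
        # v(x) = -grad log(rho_t(x) / pi(x)) = grad log pi(x) - grad log rho_t(x).
        v = grad_log_target(x) - kde_score(x, x)
        x = x + eta * v                              # forward-Euler step of the ODE system
    return x

samples = regs_like_sampler()                        # particles approximately follow pi
```

Note that only grad log pi appears in the update, so the unknown normalizing constant of the target cancels; this is what makes the scheme applicable to unnormalized densities.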
Related papers
- Theory on Score-Mismatched Diffusion Models and Zero-Shot Conditional Samplers [49.97755400231656]
We present the first performance guarantees with explicit dimensional dependence for general score-mismatched diffusion samplers.
We show that score mismatches result in a distributional bias between the target and sampling distributions that is proportional to the accumulated mismatch between the target and training distributions.
This result can be directly applied to zero-shot conditional samplers for any conditional model, irrespective of measurement noise.
arXiv Detail & Related papers (2024-10-17T16:42:12Z)
- On the Trajectory Regularity of ODE-based Diffusion Sampling [79.17334230868693]
Diffusion-based generative models use differential equations to establish a smooth connection between a complex data distribution and a tractable prior distribution.
In this paper, we identify several intriguing trajectory properties in the ODE-based sampling process of diffusion models.
arXiv Detail & Related papers (2024-05-18T15:59:41Z)
- Particle Denoising Diffusion Sampler [32.310922004771776]
Particle Denoising Diffusion Sampler (PDDS) provides consistent estimates under mild assumptions.
We demonstrate PDDS on multimodal and high-dimensional sampling tasks.
arXiv Detail & Related papers (2024-02-09T11:01:35Z)
- Sampling in Unit Time with Kernel Fisher-Rao Flow [0.0]
We introduce a new mean-field ODE and corresponding interacting particle systems (IPS) for sampling from an unnormalized target density.
The IPS are gradient-free, available in closed form, and only require the ability to sample from a reference density and compute the (unnormalized) target-to-reference density ratio.
arXiv Detail & Related papers (2024-01-08T13:43:56Z)
- Adaptive Annealed Importance Sampling with Constant Rate Progress [68.8204255655161]
Annealed Importance Sampling (AIS) synthesizes weighted samples from an intractable distribution.
We propose the Constant Rate AIS algorithm and its efficient implementation for $\alpha$-divergences; a sketch of vanilla AIS follows at the end of this list.
arXiv Detail & Related papers (2023-06-27T08:15:28Z)
- A Geometric Perspective on Diffusion Models [57.27857591493788]
We inspect the ODE-based sampling of a popular variance-exploding SDE.
We establish a theoretical relationship between the optimal ODE-based sampling and the classic mean-shift (mode-seeking) algorithm.
arXiv Detail & Related papers (2023-05-31T15:33:16Z)
- Denoising Diffusion Samplers [41.796349001299156]
Denoising diffusion models are a popular class of generative models providing state-of-the-art results in many domains.
We explore a similar idea to sample approximately from unnormalized probability density functions and estimate their normalizing constants.
While score matching is not applicable in this context, we can leverage many of the ideas introduced in generative modeling for Monte Carlo sampling.
arXiv Detail & Related papers (2023-02-27T14:37:16Z)
- Unsupervised Learning of Sampling Distributions for Particle Filters [80.6716888175925]
We put forward four methods for learning sampling distributions from observed measurements.
Experiments demonstrate that learned sampling distributions exhibit better performance than designed, minimum-degeneracy sampling distributions.
arXiv Detail & Related papers (2023-02-02T15:50:21Z)
- Unrolling Particles: Unsupervised Learning of Sampling Distributions [102.72972137287728]
Particle filtering is used to compute good nonlinear estimates of complex systems.
We show in simulations that the resulting particle filter yields good estimates in a wide range of scenarios.
arXiv Detail & Related papers (2021-10-06T16:58:34Z)
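As context for the annealed importance sampling entry above, the following is a minimal sketch of vanilla AIS, not the paper's Constant Rate variant. It assumes a standard-normal base, a linear temperature schedule, and one random-walk Metropolis step per temperature; the target mixture and all tuning values are illustrative choices.

```python
import numpy as np

def log_base(x):
    # Tractable base distribution: standard normal, up to a constant.
    return -0.5 * x ** 2

def log_target(x):
    # Hypothetical unnormalized target: 1D Gaussian mixture at -2 and +2.
    return np.logaddexp(-0.5 * (x + 2.0) ** 2, -0.5 * (x - 2.0) ** 2)

def ais(n=1000, n_temps=50, mh_step=0.5, seed=0):
    rng = np.random.default_rng(seed)
    betas = np.linspace(0.0, 1.0, n_temps + 1)     # linear annealing schedule
    x = rng.normal(size=n)                         # exact samples from the base
    log_w = np.zeros(n)                            # importance weights in log space
    for b0, b1 in zip(betas[:-1], betas[1:]):
        # Weight update: log-ratio of consecutive annealed densities at the current point.
        log_w += (b1 - b0) * (log_target(x) - log_base(x))
        # One random-walk Metropolis step targeting the annealed density at b1.
        def log_anneal(z):
            return (1.0 - b1) * log_base(z) + b1 * log_target(z)
        prop = x + mh_step * rng.normal(size=n)
        accept = np.log(rng.uniform(size=n)) < log_anneal(prop) - log_anneal(x)
        x = np.where(accept, prop, x)
    return x, log_w

samples, log_w = ais()
# Self-normalized weights give estimates under the target, e.g. E[x]:
w = np.exp(log_w - log_w.max())
print((w * samples).sum() / w.sum())
```

The annealed densities, proportional to base^(1-beta) * target^beta, bridge the base and the target; the self-normalized weights correct for the mismatch left by the finite number of MCMC steps at each temperature.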
This list is automatically generated from the titles and abstracts of the papers on this site.