Score-Based Diffusion meets Annealed Importance Sampling
- URL: http://arxiv.org/abs/2208.07698v2
- Date: Wed, 17 Aug 2022 11:39:34 GMT
- Title: Score-Based Diffusion meets Annealed Importance Sampling
- Authors: Arnaud Doucet, Will Grathwohl, Alexander G. de G. Matthews, Heiko Strathmann
- Abstract summary: Annealed Importance Sampling remains one of the most effective methods for marginal likelihood estimation.
We leverage recent progress in score-based generative modeling to approximate the optimal extended target distribution for AIS proposals.
- Score: 89.92133671626327
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: More than twenty years after its introduction, Annealed Importance Sampling
(AIS) remains one of the most effective methods for marginal likelihood
estimation. It relies on a sequence of distributions interpolating between a
tractable initial distribution and the target distribution of interest which we
simulate from approximately using a non-homogeneous Markov chain. To obtain an
importance sampling estimate of the marginal likelihood, AIS introduces an
extended target distribution to reweight the Markov chain proposal. While much
effort has been devoted to improving the proposal distribution used by AIS, by
changing the intermediate distributions and corresponding Markov kernels, an
underappreciated issue is that AIS uses a convenient but suboptimal extended
target distribution. This can hinder its performance. We here leverage recent
progress in score-based generative modeling (SGM) to approximate the optimal
extended target distribution for AIS proposals corresponding to the
discretization of Langevin and Hamiltonian dynamics. We demonstrate these
novel, differentiable, AIS procedures on a number of synthetic benchmark
distributions and variational auto-encoders.
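To make the mechanics above concrete, here is a minimal sketch of vanilla AIS in Python, assuming a linear geometric annealing path between a standard Gaussian and a toy bimodal target, with unadjusted Langevin (ULA) transitions; all names and the example target are illustrative choices, not the paper's score-based method. Because ULA does not leave each intermediate distribution exactly invariant, the standard AIS weights below are biased, which is precisely the suboptimal-extended-target issue the abstract describes.

```python
import numpy as np

def numerical_grad(f, x, eps=1e-4):
    # crude central-difference gradient of a log density, coordinate by coordinate
    g = np.zeros_like(x)
    for i in range(x.shape[-1]):
        e = np.zeros(x.shape[-1])
        e[i] = eps
        g[..., i] = (f(x + e) - f(x - e)) / (2.0 * eps)
    return g

def log_initial(x):
    # log density (up to a constant) of the tractable initial distribution
    return -0.5 * np.sum(x ** 2, axis=-1)

def log_target(x):
    # unnormalized log density of a toy bimodal target (illustrative only)
    a = -0.5 * np.sum((x - 2.0) ** 2, axis=-1)
    b = -0.5 * np.sum((x + 2.0) ** 2, axis=-1)
    return np.logaddexp(a, b)

def ais_log_weights(n_particles=1000, dim=2, n_steps=200, step_size=0.05, seed=0):
    rng = np.random.default_rng(seed)
    betas = np.linspace(0.0, 1.0, n_steps + 1)   # linear annealing schedule
    x = rng.standard_normal((n_particles, dim))  # exact samples from pi_0
    logw = np.zeros(n_particles)
    for k in range(1, n_steps + 1):
        def log_pi(y, b=betas[k]):
            # geometric path: log pi_b = (1 - b) log pi_0 + b log pi_1
            return (1.0 - b) * log_initial(y) + b * log_target(y)
        # standard AIS weight increment: log pi_k(x) - log pi_{k-1}(x)
        prev = (1.0 - betas[k - 1]) * log_initial(x) + betas[k - 1] * log_target(x)
        logw += log_pi(x) - prev
        # one ULA step targeting pi_k; ULA does not leave pi_k exactly
        # invariant, so these weights are biased (the issue the paper targets)
        x = (x + step_size * numerical_grad(log_pi, x)
             + np.sqrt(2.0 * step_size) * rng.standard_normal(x.shape))
    return logw

logw = ais_log_weights()
# estimate of the log ratio of normalizing constants: log-mean-exp of weights
log_ratio = np.logaddexp.reduce(logw) - np.log(len(logw))
print(f"AIS log normalizing-constant ratio estimate: {log_ratio:.3f}")
```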
Related papers
- Theory on Score-Mismatched Diffusion Models and Zero-Shot Conditional Samplers [49.97755400231656]
We present the first performance guarantee with explicit dimensional dependence for general score-mismatched diffusion samplers.
We show that score mismatches result in a distributional bias between the target and sampling distributions, proportional to the accumulated mismatch between the target and training distributions.
This result can be directly applied to zero-shot conditional samplers for any conditional model, irrespective of measurement noise.
arXiv Detail & Related papers (2024-10-17T16:42:12Z)
- Differentiable Annealed Importance Sampling Minimizes The Symmetrized Kullback-Leibler Divergence Between Initial and Target Distribution [10.067421338825545]
We show that DAIS minimizes the symmetrized Kullback-Leibler divergence between the initial and target distribution.
DAIS can be seen as a form of variational inference (VI) as its initial distribution is a parametric fit to an intractable target distribution.
arXiv Detail & Related papers (2024-05-23T17:55:09Z)
- Adaptive Annealed Importance Sampling with Constant Rate Progress [68.8204255655161]
Annealed Importance Sampling (AIS) synthesizes weighted samples from an intractable distribution.
We propose the Constant Rate AIS algorithm and its efficient implementation for $\alpha$-divergences.
arXiv Detail & Related papers (2023-06-27T08:15:28Z)
- Resampling Gradients Vanish in Differentiable Sequential Monte Carlo Samplers [8.122270502556374]
Annealed Importance Sampling (AIS) moves particles along a Markov chain from a tractable initial distribution to an intractable target distribution.
The recently proposed Differentiable AIS (DAIS) enables efficient optimization of the transition kernels and intermediate distributions of AIS.
We propose to extend DAIS by a resampling step inspired by Sequential Monte Carlo; a generic sketch of such a resampling step is given after this list.
arXiv Detail & Related papers (2023-04-27T17:54:57Z)
- Aligning Language Models with Preferences through f-divergence Minimization [4.952674870169772]
f-DPG allows the use of any f-divergence to approximate any target distribution that can be evaluated.
We show that Jensen-Shannon divergence strikes a good balance between these objectives, and frequently outperforms forward KL divergence by a wide margin.
arXiv Detail & Related papers (2023-02-16T10:59:39Z)
- Optimization of Annealed Importance Sampling Hyperparameters [77.34726150561087]
Annealed Importance Sampling (AIS) is a popular algorithm used to estimate the intractable marginal likelihood of deep generative models.
We present a parametric AIS process with flexible intermediary distributions and optimize the bridging distributions to use fewer sampling steps.
We assess the performance of our optimized AIS for marginal likelihood estimation of deep generative models and compare it to other estimators.
arXiv Detail & Related papers (2022-09-27T07:58:25Z)
- Personalized Trajectory Prediction via Distribution Discrimination [78.69458579657189]
Trajectory prediction is confronted with the dilemma of capturing the multi-modal nature of future dynamics.
We present a distribution discrimination (DisDis) method to predict personalized motion patterns.
Our method can be integrated with existing multi-modal predictive models as a plug-and-play module.
arXiv Detail & Related papers (2021-07-29T17:42:12Z)
- Variational Refinement for Importance Sampling Using the Forward Kullback-Leibler Divergence [77.06203118175335]
Variational Inference (VI) is a popular alternative to exact sampling in Bayesian inference.
Importance sampling (IS) is often used to fine-tune and de-bias the estimates of approximate Bayesian inference procedures.
We propose a novel combination of optimization and sampling techniques for approximate Bayesian inference.
arXiv Detail & Related papers (2021-06-30T11:00:24Z)
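As a companion to the Sequential Monte Carlo resampling entry above, the following is a minimal, generic multinomial resampler in numpy; it is an illustrative sketch of the standard SMC resampling step, not the cited paper's implementation. Particles with large weights are duplicated, those with small weights are discarded, and the weights are reset to uniform.

```python
import numpy as np

def multinomial_resample(particles, log_weights, rng=None):
    """Resample particles in proportion to their importance weights."""
    if rng is None:
        rng = np.random.default_rng()
    # normalize the weights in a numerically stable way
    w = np.exp(log_weights - np.max(log_weights))
    w /= w.sum()
    idx = rng.choice(len(particles), size=len(particles), p=w)
    # after resampling, every surviving particle carries equal weight
    return particles[idx], np.zeros(len(particles))

# usage: equalize a degenerate particle population
rng = np.random.default_rng(1)
x = rng.standard_normal((5, 2))
logw = np.array([0.0, -8.0, -8.0, 1.0, -8.0])  # two particles dominate
x_new, logw_new = multinomial_resample(x, logw, rng)
```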