A connection between Tempering and Entropic Mirror Descent
- URL: http://arxiv.org/abs/2310.11914v3
- Date: Sun, 16 Jun 2024 10:17:34 GMT
- Title: A connection between Tempering and Entropic Mirror Descent
- Authors: Nicolas Chopin, Francesca R. Crucinio, Anna Korba
- Abstract summary: We establish that tempering SMC corresponds to entropic mirror descent applied to the reverse Kullback-Leibler divergence.
We derive adaptive tempering rules that improve over alternative benchmarks in the literature.
- Score: 8.775514582692795
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: This paper explores the connections between tempering (for Sequential Monte Carlo; SMC) and entropic mirror descent to sample from a target probability distribution whose unnormalized density is known. We establish that tempering SMC corresponds to entropic mirror descent applied to the reverse Kullback-Leibler (KL) divergence and obtain convergence rates for the tempering iterates. Our result motivates the tempering iterates from an optimization point of view, showing that tempering can be seen as a descent scheme of the KL divergence with respect to the Fisher-Rao geometry, in contrast to Langevin dynamics, which performs descent of the KL with respect to the Wasserstein-2 geometry. We exploit the connection between tempering and mirror descent iterates to justify common practices in SMC and derive adaptive tempering rules that improve over alternative benchmarks in the literature.
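A common adaptive rule of the kind the abstract alludes to picks each new inverse temperature so that the effective sample size (ESS) of the incremental weights hits a preset fraction of the particle count. A minimal sketch, with illustrative names, assuming `log_like` stores $\log \pi(x_i) - \log \pi_0(x_i)$ for each particle under the geometric tempering path:

```python
import numpy as np
from scipy.optimize import brentq

def next_beta(log_like, beta_prev, target_frac=0.5):
    """Pick the next inverse temperature so that the ESS of the
    incremental importance weights equals a fixed fraction of N."""
    def ess_gap(beta):
        lw = (beta - beta_prev) * log_like          # incremental log-weights
        w = np.exp(lw - lw.max())
        ess = w.sum() ** 2 / (w ** 2).sum()         # effective sample size
        return ess - target_frac * len(log_like)
    # If even beta = 1 keeps the ESS above target, jump straight to 1.
    if ess_gap(1.0) >= 0:
        return 1.0
    return brentq(ess_gap, beta_prev, 1.0)          # root-find on the ESS gap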
Related papers
- Policy Gradients for Optimal Parallel Tempering MCMC [0.276240219662896]
Parallel tempering is a meta-algorithm for Markov Chain Monte Carlo that uses multiple chains to sample from tempered versions of the target distribution.
We present an adaptive temperature selection algorithm that dynamically adjusts temperatures during sampling using a policy gradient approach.
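For reference, the standard between-chain swap move that parallel tempering builds on can be sketched as follows; this uses a fixed temperature ladder for illustration, not the paper's learned policy-gradient schedule:

```python
import numpy as np

def swap_step(states, log_target, betas, rng):
    """One round of standard parallel-tempering swaps between adjacent
    temperatures; pi_i(x) propto exp(betas[i] * log_target(x))."""
    for i in range(len(betas) - 1):
        lp_i, lp_j = log_target(states[i]), log_target(states[i + 1])
        # Metropolis acceptance ratio for exchanging chains i and i+1
        log_acc = (betas[i] - betas[i + 1]) * (lp_j - lp_i)
        if np.log(rng.uniform()) < log_acc:
            states[i], states[i + 1] = states[i + 1], states[i]
    return states
```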
arXiv Detail & Related papers (2024-09-03T03:12:45Z) - Sequential Monte Carlo for Inclusive KL Minimization in Amortized Variational Inference [3.126959812401426]
We propose SMC-Wake, a procedure for fitting an amortized variational approximation that uses sequential Monte Carlo samplers to estimate the gradient of the inclusive KL divergence.
In experiments with both simulated and real datasets, SMC-Wake fits variational distributions that approximate the posterior more accurately than existing methods.
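The identity at the heart of this idea is $\nabla_\phi KL(p \| q_\phi) = -E_p[\nabla_\phi \log q_\phi(x)]$, which can be estimated with self-normalized weights from any sampler targeting $p$. A bare-bones sketch with illustrative names; the actual SMC-Wake estimator is more elaborate:

```python
import numpy as np

def inclusive_kl_grad(samples, log_w, score_fn):
    """Self-normalized estimate of the gradient of the inclusive KL.
    `samples` and log-weights `log_w` come from a sampler targeting p;
    `score_fn(x)` returns grad_phi log q_phi(x)."""
    w = np.exp(log_w - log_w.max())
    w /= w.sum()                                   # self-normalized weights
    grads = np.stack([score_fn(x) for x in samples])
    return -(w[:, None] * grads).sum(axis=0)
```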
arXiv Detail & Related papers (2024-03-15T18:13:48Z) - Adaptive Annealed Importance Sampling with Constant Rate Progress [68.8204255655161]
Annealed Importance Sampling (AIS) synthesizes weighted samples from an intractable distribution.
We propose the Constant Rate AIS algorithm and its efficient implementation for $\alpha$-divergences.
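For context, the log-weight of a single vanilla AIS trajectory along the geometric path $\pi_k \propto p_0^{1-\beta_k} p^{\beta_k}$ can be sketched as below; the constant-rate variant proposed in the paper adapts the spacing of the temperatures, which are fixed inputs here:

```python
import numpy as np

def ais_log_weight(x0, betas, log_p0, log_p, transition, rng):
    """Log importance weight of one vanilla AIS trajectory. `transition`
    must be a user-supplied kernel leaving pi_beta invariant."""
    x, log_w = x0, 0.0
    for b_prev, b in zip(betas[:-1], betas[1:]):
        # incremental weight: ratio of consecutive annealed densities at x
        log_w += (b - b_prev) * (log_p(x) - log_p0(x))
        x = transition(x, b, rng)
    return log_w
```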
arXiv Detail & Related papers (2023-06-27T08:15:28Z) - Differentiating Metropolis-Hastings to Optimize Intractable Densities [51.16801956665228]
We develop an algorithm for automatic differentiation of Metropolis-Hastings samplers.
We apply gradient-based optimization to objectives expressed as expectations over intractable target densities.
arXiv Detail & Related papers (2023-06-13T17:56:02Z) - Provable Phase Retrieval with Mirror Descent [1.1662472705038338]
We consider the problem of phase retrieval, which consists of recovering an $n$-dimensional real vector from the magnitude of its $m$ linear measurements.
For Gaussian random measurements, we show that when the number of measurements $m$ is large enough, then with high probability, for almost all initializers, the algorithm recovers the original vector up to a global sign.
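A minimal sketch of a mirror descent step for this problem, assuming the quartic Bregman kernel $\psi(x) = \frac{1}{4}\|x\|^4 + \frac{1}{2}\|x\|^2$ used in this line of work; the step size and iteration count are illustrative:

```python
import numpy as np

def mirror_descent_pr(A, y, x0, step=0.1, iters=500):
    """Mirror descent on f(x) = (1/4m) sum((a_i'x)^2 - y_i)^2 with
    psi(x) = ||x||^4/4 + ||x||^2/2, so grad psi(x) = (||x||^2 + 1) x."""
    m = len(y)
    x = x0.copy()
    for _ in range(iters):
        r = A @ x
        grad = A.T @ ((r ** 2 - y) * r) / m        # grad f(x)
        z = (x @ x + 1.0) * x - step * grad        # z = grad psi(x) - step*grad f(x)
        s = np.linalg.norm(z)
        # invert grad psi: x = t*z/s with t the unique real root of t^3 + t = s
        roots = np.roots([1.0, 0.0, 1.0, -s])
        t = roots[np.argmin(np.abs(roots.imag))].real
        x = (t / s) * z if s > 0 else z
    return x
```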
arXiv Detail & Related papers (2022-10-17T16:40:02Z) - Implicit Bias of Gradient Descent on Reparametrized Models: On
Equivalence to Mirror Descent [64.26008239544085]
Gradient flow with any commuting parametrization is equivalent to continuous mirror descent with a related Legendre function.
Continuous mirror descent with any Legendre function can be viewed as gradient flow with a related commuting parametrization.
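A classical one-line instance of this correspondence, for concreteness: take the commuting parametrization $w = u \odot u$ and run gradient flow $\dot{u} = -\nabla_u L(u \odot u)$. Then $\dot{w}_i = 2 u_i \dot{u}_i = -4 w_i \partial_i L(w)$, which is exactly continuous mirror descent $\frac{d}{dt} \nabla \psi(w) = -\nabla L(w)$ with the Legendre function $\psi(w) = \frac{1}{4} \sum_i (w_i \log w_i - w_i)$, since $\nabla \psi(w) = \frac{1}{4} \log w$.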
arXiv Detail & Related papers (2022-07-08T17:47:11Z) - Mirror Descent with Relative Smoothness in Measure Spaces, with
application to Sinkhorn and EM [11.007661197604065]
This paper studies the convergence of the mirror descent algorithm in an infinite-dimensional setting.
Applying our result to joint distributions and the Kullback-Leibler divergence, we show that Sinkhorn's primal iterations for optimal transport correspond to a mirror descent.
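The Sinkhorn iterations referred to here, in their standard matrix-scaling form (a sketch with illustrative names):

```python
import numpy as np

def sinkhorn(C, a, b, eps=0.1, iters=200):
    """Sinkhorn's primal iterations for entropic optimal transport.
    C is the cost matrix, a and b the marginals, eps the regularization."""
    K = np.exp(-C / eps)                 # Gibbs kernel
    u = np.ones_like(a)
    for _ in range(iters):
        v = b / (K.T @ u)                # match column marginal
        u = a / (K @ v)                  # match row marginal
    return u[:, None] * K * v[None, :]   # transport plan diag(u) K diag(v)
```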
arXiv Detail & Related papers (2022-06-17T16:19:47Z) - Variational Refinement for Importance Sampling Using the Forward
Kullback-Leibler Divergence [77.06203118175335]
Variational Inference (VI) is a popular alternative to exact sampling in Bayesian inference.
Importance sampling (IS) is often used to fine-tune and de-bias the estimates of approximate Bayesian inference procedures.
We propose a novel combination of optimization and sampling techniques for approximate Bayesian inference.
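The de-biasing step referred to here is typically self-normalized importance sampling; a minimal sketch, assuming vectorized `f`, `log_p_tilde` (unnormalized target), and `log_q` (proposal):

```python
import numpy as np

def snis_estimate(f, samples, log_p_tilde, log_q):
    """Self-normalized importance sampling: estimate E_p[f(x)] using
    samples from a variational proposal q."""
    log_w = log_p_tilde(samples) - log_q(samples)
    w = np.exp(log_w - log_w.max())
    w /= w.sum()
    return np.sum(w * f(samples))
```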
arXiv Detail & Related papers (2021-06-30T11:00:24Z) - Learning High-Precision Bounding Box for Rotated Object Detection via
Kullback-Leibler Divergence [100.6913091147422]
Existing rotated object detectors are mostly inherited from the horizontal detection paradigm.
In this paper, we change the design of the rotation regression loss from an induction paradigm to a deduction methodology.
arXiv Detail & Related papers (2021-06-03T14:29:19Z) - Observation of Hermitian and Non-Hermitian Diabolic Points and
Exceptional Rings in Parity-Time symmetric ZRC and RLC Dimers [62.997667081978825]
We show how non-Hermitian degeneracy points appear in the spectrum and how they are protected against Hermitian perturbations.
This work opens an avenue for investigations of topological electrical circuits for robust transport of information at room temperature.
arXiv Detail & Related papers (2020-04-17T15:51:49Z)