Efficient Approximate Posterior Sampling with Annealed Langevin Monte Carlo
- URL: http://arxiv.org/abs/2508.07631v2
- Date: Mon, 13 Oct 2025 03:05:04 GMT
- Title: Efficient Approximate Posterior Sampling with Annealed Langevin Monte Carlo
- Authors: Advait Parulekar, Litu Rout, Karthikeyan Shanmugam, Sanjay Shakkottai
- Abstract summary: We study the problem of posterior sampling in the context of score-based generative models. We show that one can tractably sample from a distribution that is simultaneously close to the posterior of a noised prior in KL divergence and the true posterior in Fisher divergence.
- Score: 36.938523047101555
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: We study the problem of posterior sampling in the context of score-based generative models. We have a trained score network for a prior $p(x)$, a measurement model $p(y|x)$, and are tasked with sampling from the posterior $p(x|y)$. Prior work has shown this to be intractable in KL (in the worst case) under well-accepted computational hardness assumptions. Despite this, popular algorithms for tasks such as image super-resolution, stylization, and reconstruction enjoy empirical success. Rather than establishing distributional assumptions or restricted settings under which exact posterior sampling is tractable, we view this as a more general "tilting" problem of biasing a distribution towards a measurement. Under minimal assumptions, we show that one can tractably sample from a distribution that is simultaneously close to the posterior of a noised prior in KL divergence and the true posterior in Fisher divergence. Intuitively, this combination ensures that the resulting sample is consistent with both the measurement and the prior. To the best of our knowledge these are the first formal results for (approximate) posterior sampling in polynomial time.
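The abstract's recipe, tilting a score-based prior toward a measurement and sampling with annealed Langevin dynamics, can be illustrated with a minimal sketch. This is not the paper's algorithm: `score_prior`, `grad_log_lik`, and the NCSN-style step schedule below are assumptions chosen for a toy setting where the noised prior score is available in closed form.

```python
import numpy as np

def annealed_langevin_posterior(score_prior, grad_log_lik, x0, sigmas,
                                steps=100, eps=1e-3, rng=None):
    """Annealed Langevin sampler for a measurement-tilted score-based prior.

    score_prior(x, sigma): score of the sigma-noised prior (e.g. a trained network).
    grad_log_lik(x): gradient of log p(y | x), the measurement "tilt".
    sigmas: decreasing noise levels; step sizes follow the usual
            eps * (sigma / sigmas[-1])**2 annealing schedule.
    """
    rng = np.random.default_rng(rng)
    x = np.array(x0, dtype=float)
    for sigma in sigmas:
        step = eps * (sigma / sigmas[-1]) ** 2
        for _ in range(steps):
            # drift = score of the tilted distribution p_sigma(x) * p(y | x)
            drift = score_prior(x, sigma) + grad_log_lik(x)
            x = x + step * drift + np.sqrt(2.0 * step) * rng.standard_normal(x.shape)
    return x
```

With a standard Gaussian prior the sigma-noised score is closed-form, `-x / (1 + sigma**2)`, so the sketch can be sanity-checked against the analytic Gaussian posterior.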
Related papers
- Provable Diffusion Posterior Sampling for Bayesian Inversion [13.807494493914335]
This paper proposes a novel diffusion-based posterior sampling method within a plug-and-play framework. To approximate the posterior score, we develop a Monte Carlo estimator in which particles are generated using Langevin dynamics. On the theoretical side, we provide non-asymptotic error bounds, showing that the method converges even for complex multi-modal target posteriors.
arXiv Detail & Related papers (2025-12-08T20:34:05Z) - Enhancing Diffusion Posterior Sampling for Inverse Problems by Integrating Crafted Measurements [46.03835001280626]
Current posterior sampling-based methods incorporate the measurement into posterior sampling to infer the distribution of the target data. We show that high-frequency information can be prematurely introduced during the early stages, which could induce larger posterior estimate errors. We propose a novel diffusion posterior sampling method, DPS-CM, which incorporates a Crafted Measurement (i.e., a noisy measurement crafted by a reverse denoising process) to form the posterior estimate.
arXiv Detail & Related papers (2024-11-15T00:06:57Z) - Online Posterior Sampling with a Diffusion Prior [20.24212000441531]
Posterior sampling in contextual bandits with a Gaussian prior can be implemented exactly or approximately using the Laplace approximation.
In this work, we propose approximate posterior sampling algorithms for contextual bandits with a diffusion model prior.
arXiv Detail & Related papers (2024-10-04T20:47:16Z) - Bayesian evidence estimation from posterior samples with normalizing flows [0.0]
We propose a novel method to estimate the Bayesian evidence (and its numerical uncertainty) from a set of samples drawn from the unnormalized posterior distribution. We validate it on distributions whose evidence is known analytically, up to 15 parameter space dimensions, and compare with two state-of-the-art techniques. floZ has wide applicability, e.g., to estimate evidence from variational inference, Markov Chain Monte Carlo samples, or any other method that delivers samples and their likelihood from the unnormalized posterior density.
arXiv Detail & Related papers (2024-04-18T16:16:02Z) - Diffusion Posterior Sampling is Computationally Intractable [9.747854308906506]
Posterior sampling is useful for tasks such as inpainting, super-resolution, and MRI reconstruction. We show that posterior sampling is computationally intractable under the most basic assumption in cryptography. We also show that the exponential-time rejection sampling algorithm is essentially optimal under the stronger plausible assumption that there are one-way functions that take exponential time to invert.
arXiv Detail & Related papers (2024-02-20T05:28:13Z) - Posterior samples of source galaxies in strong gravitational lenses with score-based priors [107.52670032376555]
We use a score-based model to encode the prior for the inference of undistorted images of background galaxies.
We show how the balance between the likelihood and the prior meet our expectations in an experiment with out-of-distribution data.
arXiv Detail & Related papers (2022-11-07T19:00:42Z) - Langevin Monte Carlo for Contextual Bandits [72.00524614312002]
Langevin Monte Carlo Thompson Sampling (LMC-TS) is proposed to directly sample from the posterior distribution in contextual bandits.
We prove that the proposed algorithm achieves the same sublinear regret bound as the best Thompson sampling algorithms for a special case of contextual bandits.
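As a rough illustration of the idea behind LMC-TS (not the paper's exact algorithm), the sketch below runs unadjusted Langevin dynamics on the log posterior of a hypothetical linear-Gaussian reward model and returns one approximate posterior sample, as Thompson sampling requires. The model, names, and step sizes are all assumptions for illustration.

```python
import numpy as np

def lmc_ts_step(theta, X, r, step=1e-3, n_langevin=2000, prior_prec=1.0, rng=None):
    """One Thompson-sampling round via Langevin Monte Carlo.

    Assumed model: rewards r ~ N(X @ theta, 1) with prior theta ~ N(0, 1/prior_prec).
    Runs unadjusted Langevin dynamics on the log posterior and returns the
    final state as an approximate posterior sample of theta.
    """
    rng = np.random.default_rng(rng)
    theta = np.array(theta, dtype=float)
    for _ in range(n_langevin):
        # gradient of log posterior: Gaussian likelihood term + Gaussian prior term
        grad = X.T @ (r - X @ theta) - prior_prec * theta
        theta = theta + step * grad + np.sqrt(2.0 * step) * rng.standard_normal(theta.shape)
    return theta
```

In a bandit loop, the sampled `theta` would then be used greedily: pick the arm whose feature vector maximizes the sampled expected reward.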
arXiv Detail & Related papers (2022-06-22T17:58:23Z) - Efficient Bayesian Sampling Using Normalizing Flows to Assist Markov Chain Monte Carlo Methods [13.649384403827359]
Normalizing flows can generate complex target distributions and show promise in many applications in Bayesian statistics.
Since no data set from the target posterior distribution is available beforehand, the flow is typically trained using the reverse Kullback-Leibler (KL) divergence that only requires samples from a base distribution.
Here we explore a distinct training strategy, using the direct KL divergence as loss, in which samples from the posterior are generated by (i) assisting a local MCMC algorithm on the posterior with a normalizing flow to accelerate its mixing rate and (ii) using the data generated this way to train the flow.
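The two-part loop described above, (i) flow-assisted MCMC on the posterior and (ii) refitting the flow on the collected samples with the direct (forward) KL, i.e. maximum likelihood, can be sketched in miniature. Here the "flow" is just an affine map of a standard normal and the target a hypothetical 1-D Gaussian posterior; both are assumptions for illustration, not the paper's setup.

```python
import numpy as np

def log_target(x):
    # hypothetical 1-D posterior N(3, 0.5^2), known only up to a constant
    return -0.5 * ((x - 3.0) / 0.5) ** 2

def flow_assisted_mcmc(n_rounds=5, n_steps=2000, rng=None):
    """Alternate (i) independence Metropolis-Hastings using the flow as the
    proposal and (ii) refitting the flow (an affine map, so maximum likelihood
    reduces to matching mean and scale) on the samples collected so far."""
    rng = np.random.default_rng(rng)
    mu, s = 0.0, 2.0          # initial flow: base N(0,1) mapped to N(mu, s^2)
    x = 0.0
    for _ in range(n_rounds):
        samples = []
        log_q = lambda v: -0.5 * ((v - mu) / s) ** 2 - np.log(s)  # proposal log-density
        for _ in range(n_steps):
            prop = mu + s * rng.standard_normal()
            # independence-MH acceptance: target ratio times reverse proposal ratio
            log_a = log_target(prop) - log_target(x) + log_q(x) - log_q(prop)
            if np.log(rng.random()) < log_a:
                x = prop
            samples.append(x)
        samples = np.asarray(samples)
        # (ii) forward-KL / maximum-likelihood refit of the affine flow
        mu, s = samples.mean(), samples.std() + 1e-6
    return mu, s
```

Each round improves the proposal, which raises the MH acceptance rate, which in turn yields better training data for the next refit; this feedback is the point of the strategy.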
arXiv Detail & Related papers (2021-07-16T16:40:36Z) - Instance-Optimal Compressed Sensing via Posterior Sampling [101.43899352984774]
We show, for Gaussian measurements and any prior distribution on the signal, that the posterior sampling estimator achieves near-optimal recovery guarantees.
We implement the posterior sampling estimator for deep generative priors using Langevin dynamics, and empirically find that it produces accurate estimates with more diversity than MAP.
arXiv Detail & Related papers (2021-06-21T22:51:56Z) - Efficiently Sampling Functions from Gaussian Process Posteriors [76.94808614373609]
We propose an easy-to-use and general-purpose approach for fast posterior sampling.
We demonstrate how decoupled sample paths accurately represent Gaussian process posteriors at a fraction of the usual cost.
arXiv Detail & Related papers (2020-02-21T14:03:16Z) - Distributionally Robust Bayesian Quadrature Optimization [60.383252534861136]
We study BQO under distributional uncertainty in which the underlying probability distribution is unknown except for a limited set of its i.i.d. samples.
A standard BQO approach maximizes the Monte Carlo estimate of the true expected objective given the fixed sample set.
We propose a novel posterior-sampling-based algorithm, distributionally robust BQO (DRBQO), for this purpose.
arXiv Detail & Related papers (2020-01-19T12:00:33Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.