Provable Diffusion Posterior Sampling for Bayesian Inversion
- URL: http://arxiv.org/abs/2512.08022v1
- Date: Mon, 08 Dec 2025 20:34:05 GMT
- Authors: Jinyuan Chang, Chenguang Duan, Yuling Jiao, Ruoxuan Li, Jerry Zhijian Yang, Cheng Yuan
- Abstract summary: This paper proposes a novel diffusion-based posterior sampling method within a plug-and-play framework. To approximate the posterior score, we develop a Monte Carlo estimator in which particles are generated using Langevin dynamics. On the theoretical side, we provide non-asymptotic error bounds, showing that the method converges even for complex, multi-modal target posteriors.
- Score: 13.807494493914335
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: This paper proposes a novel diffusion-based posterior sampling method within a plug-and-play (PnP) framework. Our approach constructs a probability transport from an easy-to-sample terminal distribution to the target posterior, using a warm-start strategy to initialize the particles. To approximate the posterior score, we develop a Monte Carlo estimator in which particles are generated using Langevin dynamics, avoiding the heuristic approximations commonly used in prior work. The score governing the Langevin dynamics is learned from data, enabling the model to capture rich structural features of the underlying prior distribution. On the theoretical side, we provide non-asymptotic error bounds, showing that the method converges even for complex, multi-modal target posterior distributions. These bounds explicitly quantify the errors arising from posterior score estimation, the warm-start initialization, and the posterior sampling procedure. Our analysis further clarifies how the prior score-matching error and the condition number of the Bayesian inverse problem influence overall performance. Finally, we present numerical experiments demonstrating the effectiveness of the proposed method across a range of inverse problems.
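The recipe in the abstract (a score-driven Langevin sampler whose particles are warm-started, converging to the target posterior) can be illustrated on a toy one-dimensional linear-Gaussian inverse problem where the posterior is available in closed form. This is only a sketch, not the paper's method: the forward operator, noise level, step size, and particle count below are arbitrary illustrative choices, and analytic scores stand in for the learned prior score.

```python
import numpy as np

# Toy Bayesian inverse problem: y = A*x + noise, with a Gaussian prior on x.
# Langevin dynamics targets the posterior p(x|y) ∝ p(y|x) p(x); its drift is
# the posterior score  ∇ log p(x|y) = ∇ log p(y|x) + ∇ log p(x).
rng = np.random.default_rng(0)

A, sigma_noise = 2.0, 0.5           # forward operator and observation noise
prior_mu, prior_var = 0.0, 1.0      # N(0, 1) prior
y = 1.8                             # observed datum

def posterior_score(x):
    lik = (y - A * x) * A / sigma_noise**2   # ∇ log p(y|x)
    pri = -(x - prior_mu) / prior_var        # ∇ log p(x)
    return lik + pri

# Unadjusted Langevin algorithm: x ← x + η ∇ log p(x|y) + sqrt(2η) ξ,
# run over an ensemble of particles warm-started from the prior.
eta, n_steps, n_particles = 1e-3, 5000, 2000
x = rng.standard_normal(n_particles)
for _ in range(n_steps):
    x = x + eta * posterior_score(x) \
          + np.sqrt(2 * eta) * rng.standard_normal(n_particles)

# Closed-form Gaussian posterior, for comparison with the particle ensemble
post_var = 1.0 / (A**2 / sigma_noise**2 + 1.0 / prior_var)
post_mu = post_var * (A * y / sigma_noise**2 + prior_mu / prior_var)
print(post_mu, x.mean())
```

Because the target here is Gaussian, the particle mean and variance can be checked against the analytic posterior; for the multi-modal posteriors covered by the paper's bounds, only the score function changes, not the sampler.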
Related papers
- Supervised Guidance Training for Infinite-Dimensional Diffusion Models [47.65586147952857]
In inverse problems, the aim is to sample from a posterior distribution over functions obtained by conditioning a prior. We prove that the models can be conditioned using an infinite-dimensional extension of Doob's $h$-transform. We propose a simulation-free score-matching objective (called Supervised Guidance Training) enabling efficient and stable posterior sampling.
arXiv Detail & Related papers (2026-01-28T16:39:39Z)
- Solving Inverse Problems via Diffusion-Based Priors: An Approximation-Free Ensemble Sampling Approach [19.860268382547357]
Current DM-based posterior sampling methods rely on approximations to the generative process. We propose an ensemble-based algorithm that performs posterior sampling without the use of approximations. Our algorithm is motivated by existing works that combine DM-based methods with the sequential Monte Carlo method.
arXiv Detail & Related papers (2025-06-04T14:09:25Z)
- A Mixture-Based Framework for Guiding Diffusion Models [19.83064246586143]
Denoising diffusion models have driven significant progress in the field of Bayesian inverse problems. Recent approaches use pre-trained diffusion models as priors to solve a wide range of such problems. This work proposes a novel mixture approximation of these intermediate distributions.
arXiv Detail & Related papers (2025-02-05T16:26:06Z)
- Amortized Posterior Sampling with Diffusion Prior Distillation [55.03585818289934]
Amortized Posterior Sampling is a novel variational inference approach for efficient posterior sampling in inverse problems. Our method trains a conditional flow model to minimize the divergence between the variational distribution and the posterior distribution implicitly defined by the diffusion model. Unlike existing methods, our approach is unsupervised, requires no paired training data, and is applicable to both Euclidean and non-Euclidean domains.
arXiv Detail & Related papers (2024-07-25T09:53:12Z)
- Bayesian Circular Regression with von Mises Quasi-Processes [57.88921637944379]
In this work we explore a family of expressive and interpretable distributions over circle-valued random functions. For posterior inference, we introduce a new Stratonovich-like augmentation that lends itself to fast Gibbs sampling. We present experiments applying this model to the prediction of wind directions and the percentage of the running gait cycle as a function of joint angles.
arXiv Detail & Related papers (2024-06-19T01:57:21Z)
- Amortizing intractable inference in diffusion models for vision, language, and control [89.65631572949702]
This paper studies amortized sampling of the posterior over data, $\mathbf{x} \sim p^{\mathrm{post}}(\mathbf{x}) \propto p(\mathbf{x})\,r(\mathbf{x})$, in a model that consists of a diffusion generative model prior $p(\mathbf{x})$ and a black-box constraint or function $r(\mathbf{x})$. We prove the correctness of a data-free learning objective, relative trajectory balance, for training a diffusion model that samples from this posterior.
arXiv Detail & Related papers (2024-05-31T16:18:46Z)
- An Unconditional Representation of the Conditional Score in Infinite-Dimensional Linear Inverse Problems [5.340736751238338]
We propose an unconditional representation of the conditional score function tailored to linear inverse problems. We show that the conditional score can be derived exactly from a trained (unconditional) score using affine transformations. Our approach is formulated in infinite-dimensional function spaces, making it inherently discretization-invariant.
arXiv Detail & Related papers (2024-05-24T15:33:27Z)
- Divide-and-Conquer Posterior Sampling for Denoising Diffusion Priors [21.0128625037708]
We present an innovative framework, divide-and-conquer posterior sampling. It reduces the approximation error associated with current techniques without the need for retraining. We demonstrate the versatility and effectiveness of our approach for a wide range of Bayesian inverse problems.
arXiv Detail & Related papers (2024-03-18T01:47:24Z)
- Improving Diffusion Models for Inverse Problems Using Optimal Posterior Covariance [52.093434664236014]
Recent diffusion models provide a promising zero-shot solution to noisy linear inverse problems without retraining for specific inverse problems. Inspired by this finding, we propose to improve recent methods by using a more principled covariance determined by maximum likelihood estimation.
arXiv Detail & Related papers (2024-02-03T13:35:39Z)
- Instance-Optimal Compressed Sensing via Posterior Sampling [101.43899352984774]
We show, for Gaussian measurements and any prior distribution on the signal, that the posterior sampling estimator achieves near-optimal recovery guarantees. We implement the posterior sampling estimator for deep generative priors using Langevin dynamics, and empirically find that it produces accurate estimates with more diversity than MAP.
arXiv Detail & Related papers (2021-06-21T22:51:56Z)
- Parameterizing uncertainty by deep invertible networks, an application to reservoir characterization [0.9176056742068814]
Uncertainty quantification for full-waveform inversion provides a probabilistic characterization of the ill-conditioning of the problem. We propose an approach characterized by training a deep network that "pushes forward" Gaussian random inputs into the model space as if they were sampled from the actual posterior distribution.
arXiv Detail & Related papers (2020-04-16T18:37:56Z)
- Efficiently Sampling Functions from Gaussian Process Posteriors [76.94808614373609]
We propose an easy-to-use and general-purpose approach for fast posterior sampling. We demonstrate how decoupled sample paths accurately represent Gaussian process posteriors at a fraction of the usual cost.
arXiv Detail & Related papers (2020-02-21T14:03:16Z)
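The "decoupled sample paths" idea in the last entry is usually written via Matheron's rule: a posterior sample equals a prior sample plus an interpolant of the data residual, $f_{\mathrm{post}}(x_*) = f_{\mathrm{prior}}(x_*) + K(x_*, X)\,K(X, X)^{-1}\,(y - f_{\mathrm{prior}}(X))$. The NumPy sketch below demonstrates this identity directly (without the paper's efficient feature-based prior approximation); the RBF kernel, training points, and jitter are illustrative choices only.

```python
import numpy as np

# Matheron's rule (pathwise GP sampling): correct a joint prior draw by an
# interpolant of the residual at the training inputs to get an exact
# posterior sample, with no per-sample posterior Cholesky factorization.
rng = np.random.default_rng(1)

def rbf(a, b, ls=0.5):
    return np.exp(-0.5 * (a[:, None] - b[None, :]) ** 2 / ls**2)

X = np.array([-1.0, 0.0, 1.0])        # training inputs
y = np.sin(3 * X)                      # (near-)noiseless observations
Xs = np.linspace(-2.0, 2.0, 50)        # test grid

Kxx = rbf(X, X) + 1e-6 * np.eye(len(X))
Ksx = rbf(Xs, X)

# Joint prior draws over [X, Xs]; each column is one independent sample path
Z = np.concatenate([X, Xs])
L = np.linalg.cholesky(rbf(Z, Z) + 1e-6 * np.eye(len(Z)))
S = 2000
f = L @ rng.standard_normal((len(Z), S))
f_train, f_test = f[: len(X)], f[len(X):]

# Pathwise update: each column becomes one exact posterior sample
post_samples = f_test + Ksx @ np.linalg.solve(Kxx, y[:, None] - f_train)

post_mean = Ksx @ np.linalg.solve(Kxx, y)   # analytic posterior mean
print(np.abs(post_samples.mean(axis=1) - post_mean).max())
```

Averaging the pathwise samples recovers the analytic posterior mean, which is the sanity check printed above; the cost advantage in the paper comes from replacing the exact joint prior draw with cheap feature-based approximate prior paths.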
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.