Fast and Robust Likelihood-Guided Diffusion Posterior Sampling with Amortized Variational Inference
- URL: http://arxiv.org/abs/2602.07102v1
- Date: Fri, 06 Feb 2026 16:24:35 GMT
- Title: Fast and Robust Likelihood-Guided Diffusion Posterior Sampling with Amortized Variational Inference
- Authors: Léon Zheng, Thomas Hirtz, Yazid Janati, Eric Moulines,
- Abstract summary: We introduce an amortization strategy for diffusion posterior sampling that preserves explicit likelihood guidance. This strategy improves the trade-off between efficiency and flexibility in diffusion-based inverse problems.
- Score: 22.234558509800426
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Zero-shot diffusion posterior sampling offers a flexible framework for inverse problems by accommodating arbitrary degradation operators at test time, but incurs high computational cost due to repeated likelihood-guided updates. In contrast, previous amortized diffusion approaches enable fast inference by replacing likelihood-based sampling with implicit inference models, but at the expense of robustness to unseen degradations. We introduce an amortization strategy for diffusion posterior sampling that preserves explicit likelihood guidance by amortizing the inner optimization problems arising in variational diffusion posterior sampling. This accelerates inference for in-distribution degradations while maintaining robustness to previously unseen operators, thereby improving the trade-off between efficiency and flexibility in diffusion-based inverse problems.
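The "repeated likelihood-guided updates" the abstract refers to are, in DPS-style solvers, gradient steps on the measurement residual taken through the denoiser's posterior-mean estimate. Below is a minimal numerical sketch under toy assumptions: a linear degradation operator `A`, and a simple shrinkage function standing in for a trained diffusion denoiser. All names and the surrogate denoiser are illustrative, not the paper's actual model.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy linear inverse problem: y = A @ x_true (noiseless for simplicity).
d, m = 8, 4
A = rng.standard_normal((m, d))
x_true = rng.standard_normal(d)
y = A @ x_true

def denoise(x_t, t):
    """Stand-in for a diffusion denoiser's estimate E[x0 | x_t].

    A real solver would call a trained network here; simple shrinkage
    keeps the sketch self-contained and differentiable in closed form.
    """
    return x_t / (1.0 + t)

def likelihood_guided_step(x_t, t, step_size=0.01):
    """One DPS-style update: nudge x_t to reduce ||A x0_hat - y||^2.

    Because the surrogate denoiser is linear, the chain rule through it
    is just a 1/(1+t) factor; with a network it would require autodiff.
    """
    x0_hat = denoise(x_t, t)
    residual = A @ x0_hat - y
    grad = (A.T @ residual) / (1.0 + t)  # chain rule through denoise
    return x_t - step_size * grad

# Repeating this guidance step shrinks the measurement residual; a
# zero-shot solver pays this per-sample cost, which is what the paper's
# amortization is meant to avoid for in-distribution degradations.
x = rng.standard_normal(d)
r0 = float(np.linalg.norm(A @ denoise(x, 0.0) - y))
for _ in range(500):
    x = likelihood_guided_step(x, t=0.0)
r1 = float(np.linalg.norm(A @ denoise(x, 0.0) - y))
assert r1 < r0  # residual decreases under repeated guidance
```

Because `A` enters only through matrix products, the same loop accepts an arbitrary operator at test time, which is the flexibility the zero-shot setting provides and the amortized variant aims to keep.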
Related papers
- U-DAVI: Uncertainty-Aware Diffusion-Prior-Based Amortized Variational Inference for Image Reconstruction [10.273906387994902]
Ill-posed imaging inverse problems remain challenging due to the ambiguity in mapping degraded observations to clean images. Amortized variational inference frameworks address this inefficiency by learning a direct mapping from measurements to posteriors. We extend the amortized framework by injecting spatially adaptive perturbations into measurements during training, guided by uncertainty estimates, to emphasize learning in the most uncertain regions.
arXiv Detail & Related papers (2026-02-12T08:32:11Z)
- On Stability and Robustness of Diffusion Posterior Sampling for Bayesian Inverse Problems [42.76879947185353]
Diffusion-based solvers rely on a presumed likelihood for the observations in BIPs to guide the generation process. We bridge this gap by characterizing the posterior approximation error and proving the stability of the diffusion-based solvers. We propose a simple yet effective solution, robust diffusion posterior sampling, which is provably robust and compatible with existing gradient-based posterior samplers.
arXiv Detail & Related papers (2026-02-02T12:47:15Z)
- DAPS++: Rethinking Diffusion Inverse Problems with Decoupled Posterior Annealing [5.215481191227242]
We introduce DAPS++, which allows the likelihood term to guide inference more directly while maintaining numerical stability. DAPS++ achieves high computational efficiency and robust reconstruction performance across diverse image restoration tasks.
arXiv Detail & Related papers (2025-11-21T08:28:36Z)
- Test-Time Anchoring for Discrete Diffusion Posterior Sampling [38.507644561076894]
Posterior sampling is a challenging problem for pretrained discrete diffusion foundation models. We introduce Anchored Posterior Sampling (APS) for masked diffusion foundation models. Our approach achieves state-of-the-art performance among discrete diffusion samplers across linear and nonlinear inverse problems.
arXiv Detail & Related papers (2025-10-02T17:58:37Z)
- EquiReg: Equivariance Regularized Diffusion for Inverse Problems [67.01847869495558]
We propose EquiReg diffusion, a framework for regularizing posterior sampling in diffusion-based inverse problem solvers. When applied to a variety of solvers, EquiReg outperforms state-of-the-art diffusion models in both linear and nonlinear image restoration tasks.
arXiv Detail & Related papers (2025-05-29T01:25:43Z)
- Amortized Posterior Sampling with Diffusion Prior Distillation [55.03585818289934]
Amortized Posterior Sampling is a novel variational inference approach for efficient posterior sampling in inverse problems. Our method trains a conditional flow model to minimize the divergence between the variational distribution and the posterior distribution implicitly defined by the diffusion model. Unlike existing methods, our approach is unsupervised, requires no paired training data, and is applicable to both Euclidean and non-Euclidean domains.
arXiv Detail & Related papers (2024-07-25T09:53:12Z)
- Diffusion Prior-Based Amortized Variational Inference for Noisy Inverse Problems [12.482127049881026]
We propose a novel approach to solve inverse problems with a diffusion prior from an amortized variational inference perspective.
Our amortized inference learns a function that directly maps measurements to the implicit posterior distributions of corresponding clean data, enabling a single-step posterior sampling even for unseen measurements.
arXiv Detail & Related papers (2024-07-23T02:14:18Z)
- Improving Diffusion Models for Inverse Problems Using Optimal Posterior Covariance [52.093434664236014]
Recent diffusion models provide a promising zero-shot solution to noisy linear inverse problems without retraining for specific inverse problems.
Building on this observation, we propose to improve recent methods by using a more principled covariance determined by maximum likelihood estimation.
arXiv Detail & Related papers (2024-02-03T13:35:39Z)
- Regularization by Texts for Latent Diffusion Inverse Solvers [55.97917698941313]
We introduce a novel latent diffusion inverse solver, regularization by text (TReg), inspired by the human ability to resolve visual ambiguities through perceptual biases. Our experimental results demonstrate that TReg effectively mitigates ambiguity in inverse problems, improving both accuracy and efficiency.
arXiv Detail & Related papers (2023-11-27T09:40:14Z)
- Diffusion Posterior Sampling for General Noisy Inverse Problems [50.873313752797124]
We extend diffusion solvers to handle noisy (non)linear inverse problems via approximation of the posterior sampling.
Our method demonstrates that diffusion models can incorporate various measurement noise statistics.
arXiv Detail & Related papers (2022-09-29T11:12:27Z)
- Truncated Diffusion Probabilistic Models and Diffusion-based Adversarial Auto-Encoders [137.1060633388405]
Diffusion-based generative models learn how to generate the data by inferring a reverse diffusion chain.
We propose a faster and cheaper approach that truncates the forward process, adding noise only partway rather than until the data become pure random noise.
We show that the proposed model can be cast as an adversarial auto-encoder empowered by both the diffusion process and a learnable implicit prior.
arXiv Detail & Related papers (2022-02-19T20:18:49Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.