DAPS++: Rethinking Diffusion Inverse Problems with Decoupled Posterior Annealing
- URL: http://arxiv.org/abs/2511.17038v1
- Date: Fri, 21 Nov 2025 08:28:36 GMT
- Title: DAPS++: Rethinking Diffusion Inverse Problems with Decoupled Posterior Annealing
- Authors: Hao Chen, Renzheng Zhang, Scott S. Howard
- Abstract summary: We introduce **DAPS++**, which allows the likelihood term to guide inference more directly while maintaining numerical stability. **DAPS++** achieves high computational efficiency and robust reconstruction performance across diverse image restoration tasks.
- Score: 5.215481191227242
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: From a Bayesian perspective, score-based diffusion solves inverse problems through joint inference, embedding the likelihood with the prior to guide the sampling process. However, this formulation fails to explain its practical behavior: the prior offers limited guidance, while reconstruction is largely driven by the measurement-consistency term, leading to an inference process that is effectively decoupled from the diffusion dynamics. To clarify this structure, we reinterpret the role of diffusion in inverse problem solving as an initialization stage within an expectation-maximization (EM)-style framework, where the diffusion stage and the data-driven refinement are fully decoupled. We introduce **DAPS++**, which allows the likelihood term to guide inference more directly while maintaining numerical stability and providing insight into why unified diffusion trajectories remain effective in practice. By requiring fewer function evaluations (NFEs) and measurement-optimization steps, **DAPS++** achieves high computational efficiency and robust reconstruction performance across diverse image restoration tasks.
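The decoupled structure the abstract describes (a prior-driven initialization stage, followed by refinement driven almost entirely by the measurement-consistency term) can be illustrated on a toy linear inverse problem. The sketch below is only a schematic of that two-stage idea, not the paper's algorithm: the measurement operator, the crude initializer standing in for the diffusion stage, and the annealed anchor weight are all placeholder choices.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy linear inverse problem: y = A x* + noise, with A undersampling x*.
n, m = 32, 16
A = rng.standard_normal((m, n)) / np.sqrt(m)
x_true = rng.standard_normal(n)
y = A @ x_true + 0.01 * rng.standard_normal(m)

# Stage 1 (stand-in for the diffusion/prior stage): a crude initializer.
x_init = A.T @ y

# Stage 2 (decoupled refinement): gradient descent on the likelihood term,
# lightly anchored to the initialization with an annealed weight for stability.
x = x_init.copy()
steps = 200
for k in range(steps):
    lam = 0.1 * (1.0 - k / steps)            # annealed anchor weight
    grad = A.T @ (A @ x - y) + lam * (x - x_init)
    x -= 0.1 * grad

resid_init = np.linalg.norm(A @ x_init - y)
resid_final = np.linalg.norm(A @ x - y)
print(resid_init, resid_final)
```

The point of the sketch is the separation of concerns: the initializer is never revisited during refinement, and the likelihood gradient alone drives the reconstruction toward measurement consistency.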
Related papers
- Fast and Robust Likelihood-Guided Diffusion Posterior Sampling with Amortized Variational Inference [22.234558509800426]
We introduce an amortization strategy for diffusion posterior sampling that preserves explicit likelihood guidance. This strategy improves the trade-off between efficiency and flexibility in diffusion-based inverse problems.
arXiv Detail & Related papers (2026-02-06T16:24:35Z) - Align & Invert: Solving Inverse Problems with Diffusion and Flow-based Models via Representational Alignment [13.028121107802127]
In inverse problems, pretrained generative models are employed as priors. We propose applying representation alignment (REPA) between diffusion or flow-based models and a pretrained self-supervised visual encoder. We show that aligning model representations with approximate target features can substantially enhance reconstruction fidelity and perceptual realism.
arXiv Detail & Related papers (2025-11-21T00:37:04Z) - Diffusion Models: A Mathematical Introduction [3.8673630752805437]
We present a self-contained derivation of diffusion-based generative models. We construct denoising diffusion probabilistic models from first principles. Readers can both follow the theory and implement the corresponding algorithms in practice.
arXiv Detail & Related papers (2025-11-13T16:20:52Z) - TAG: Tangential Amplifying Guidance for Hallucination-Resistant Diffusion Sampling [53.61290359948953]
Tangential Amplifying Guidance (TAG) operates solely on trajectory signals without modifying the underlying diffusion model. We formalize this guidance process by leveraging a first-order Taylor expansion. TAG is a plug-and-play, architecture-agnostic module that improves diffusion sampling fidelity with minimal computational addition.
arXiv Detail & Related papers (2025-10-06T06:53:29Z) - Diffusion Bridge Variational Inference for Deep Gaussian Processes [31.082191748525137]
Diffusion Bridge Variational Inference (DBVI) is a principled extension of Denoising Diffusion Variational Inference (DDVI). DBVI initiates the reverse diffusion from a learnable, data-dependent initial distribution. It consistently outperforms DDVI and other variational baselines in predictive accuracy, convergence speed, and posterior quality.
arXiv Detail & Related papers (2025-09-23T14:36:47Z) - Diffusion Models for Solving Inverse Problems via Posterior Sampling with Piecewise Guidance [52.705112811734566]
A novel diffusion-based framework is introduced for solving inverse problems using a piecewise guidance scheme. The proposed method is problem-agnostic and readily adaptable to a variety of inverse problems. The framework achieves a reduction in inference time of 25% for inpainting with both random and center masks, and 23% and 24% for 4x and 8x super-resolution tasks.
arXiv Detail & Related papers (2025-07-22T19:35:14Z) - Improving Decoupled Posterior Sampling for Inverse Problems using Data Consistency Constraint [13.285652967956652]
We propose Guided Decoupled Posterior Sampling (GDPS) to solve inverse problems. We extend our method to latent diffusion models via Tweedie's formula. GDPS achieves state-of-the-art performance, improving accuracy over existing methods.
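Tweedie's formula, mentioned in the GDPS summary above, relates the posterior mean of a clean signal to the score of the noisy marginal: E[x0 | xt] = xt + sigma^2 * d/dx log p(xt). A minimal numerical check, using a one-dimensional Gaussian prior where the score is available in closed form (the prior, noise level, and sample count are illustrative choices, not tied to any particular paper):

```python
import numpy as np

rng = np.random.default_rng(1)
sigma = 0.8

# Clean samples x0 ~ N(0, 1); noisy observations x_t = x0 + sigma * eps.
x0 = rng.standard_normal(100_000)
xt = x0 + sigma * rng.standard_normal(100_000)

# For this Gaussian prior, p(x_t) = N(0, 1 + sigma^2), so the score is
# d/dx log p(x_t) = -x_t / (1 + sigma^2).
score = -xt / (1 + sigma**2)

# Tweedie's formula: E[x0 | x_t] = x_t + sigma^2 * score(x_t).
x0_hat = xt + sigma**2 * score

# Monte Carlo check: regressing x0 on x_t recovers the same shrinkage
# slope 1 / (1 + sigma^2) that Tweedie's formula predicts.
slope = np.polyfit(xt, x0, 1)[0]
print(slope, 1 / (1 + sigma**2))
```

The denoised estimate `x0_hat` is the posterior mean, so its mean squared error against `x0` is strictly smaller than that of the raw noisy observation `xt`.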
arXiv Detail & Related papers (2024-12-01T03:57:21Z) - Efficient Diffusion as Low Light Enhancer [63.789138528062225]
Reflectance-Aware Trajectory Refinement (RATR) is a simple yet effective module to refine the teacher trajectory using the reflectance component of images.
**ReDDiT** (Reflectance-aware Diffusion with Distilled Trajectory) is an efficient and flexible distillation framework tailored for Low-Light Image Enhancement (LLIE).
arXiv Detail & Related papers (2024-10-16T08:07:18Z) - Diffusion State-Guided Projected Gradient for Inverse Problems [82.24625224110099]
We propose Diffusion State-Guided Projected Gradient (DiffStateGrad) for inverse problems. DiffStateGrad projects the measurement gradient onto a subspace that is a low-rank approximation of an intermediate state of the diffusion process. We highlight that DiffStateGrad improves the robustness of diffusion models with respect to the choice of measurement-guidance step size and noise.
arXiv Detail & Related papers (2024-10-04T14:26:54Z) - Amortized Posterior Sampling with Diffusion Prior Distillation [55.03585818289934]
Amortized Posterior Sampling is a novel variational inference approach for efficient posterior sampling in inverse problems. Our method trains a conditional flow model to minimize the divergence between the variational distribution and the posterior distribution implicitly defined by the diffusion model. Unlike existing methods, our approach is unsupervised, requires no paired training data, and is applicable to both Euclidean and non-Euclidean domains.
arXiv Detail & Related papers (2024-07-25T09:53:12Z) - Efficient Text-driven Motion Generation via Latent Consistency Training [21.348658259929053]
We propose a motion latent consistency training framework (MLCT) to solve nonlinear reverse diffusion trajectories. By combining these enhancements, we achieve stable consistency training in non-pixel modality and latent representation spaces.
arXiv Detail & Related papers (2024-05-05T02:11:57Z) - Regularization by Texts for Latent Diffusion Inverse Solvers [55.97917698941313]
We introduce a novel latent diffusion inverse solver, regularization by text (TReg), inspired by the human ability to resolve visual ambiguities through perceptual biases. Our experimental results demonstrate that TReg effectively mitigates ambiguity in inverse problems, improving both accuracy and efficiency.
arXiv Detail & Related papers (2023-11-27T09:40:14Z) - A Variational Perspective on Solving Inverse Problems with Diffusion Models [101.831766524264]
Inverse tasks can be formulated as inferring a posterior distribution over data.
This is however challenging in diffusion models since the nonlinear and iterative nature of the diffusion process renders the posterior intractable.
We propose a variational approach that by design seeks to approximate the true posterior distribution.
arXiv Detail & Related papers (2023-05-07T23:00:47Z) - Diffusion Posterior Sampling for General Noisy Inverse Problems [50.873313752797124]
We extend diffusion solvers to handle noisy (non)linear inverse problems via approximation of the posterior sampling.
Our method demonstrates that diffusion models can incorporate various measurement noise statistics.
arXiv Detail & Related papers (2022-09-29T11:12:27Z)
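The DPS entry above describes guiding a diffusion sampler with the gradient of a measurement-consistency term evaluated at the denoised estimate. A minimal sketch of one such guidance step, with a hypothetical linear shrinkage denoiser standing in for a trained diffusion model (the operator, noise level, and step size `zeta` are all illustrative, not the paper's settings):

```python
import numpy as np

rng = np.random.default_rng(2)

# Toy linear measurement: y = A x* + noise.
n, m = 16, 8
A = rng.standard_normal((m, n)) / np.sqrt(m)
x_true = rng.standard_normal(n)
y = A @ x_true + 0.01 * rng.standard_normal(m)

def denoise(x_t, sigma_t):
    """Hypothetical denoiser: linear shrinkage toward zero (unit Gaussian prior)."""
    return x_t / (1.0 + sigma_t**2)

# One DPS-style guidance step: move x_t against the gradient of
# ||y - A x0_hat(x_t)||^2, where x0_hat is the denoised estimate.
# For this linear denoiser the chain rule gives the gradient in closed form.
sigma_t, zeta = 0.5, 0.2
x_t = x_true + sigma_t * rng.standard_normal(n)
grad = (A.T @ (A @ denoise(x_t, sigma_t) - y)) / (1.0 + sigma_t**2)
x_t_guided = x_t - zeta * grad

before = np.linalg.norm(A @ denoise(x_t, sigma_t) - y)
after = np.linalg.norm(A @ denoise(x_t_guided, sigma_t) - y)
print(before, after)
```

In an actual DPS implementation the gradient is obtained by backpropagating through the denoiser network rather than by hand, but the structure of the step is the same: measure consistency at the denoised estimate, then correct the noisy iterate.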
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences.