Consistency Models as Plug-and-Play Priors for Inverse Problems
- URL: http://arxiv.org/abs/2509.22736v1
- Date: Thu, 25 Sep 2025 20:27:56 GMT
- Title: Consistency Models as Plug-and-Play Priors for Inverse Problems
- Authors: Merve Gülle, Junno Yun, Yaşar Utku Alçalar, Mehmet Akçakaya
- Abstract summary: Diffusion inverse problem solvers aim to sample from the posterior of data given the measurements, using a combination of the unconditional score function and an approximation of the posterior related to the forward process. We evaluate our approach on a variety of inverse problems, including inpainting, super-resolution, Gaussian deblurring, and magnetic resonance imaging (MRI) reconstruction. To the best of our knowledge, this is the first CM trained for MRI datasets.
- License: http://creativecommons.org/licenses/by-nc-nd/4.0/
- Abstract: Diffusion models have found extensive use in solving numerous inverse problems. Such diffusion inverse problem solvers aim to sample from the posterior distribution of data given the measurements, using a combination of the unconditional score function and an approximation of the posterior related to the forward process. Recently, consistency models (CMs) have been proposed to directly predict the final output from any point on the diffusion ODE trajectory, enabling high-quality sampling in just a few NFEs. CMs have also been utilized for inverse problems, but existing CM-based solvers either require additional task-specific training or utilize data fidelity operations with slow convergence, not amenable to large-scale problems. In this work, we reinterpret CMs as proximal operators of a prior, enabling their integration into plug-and-play (PnP) frameworks. We propose a solver based on PnP-ADMM, which enables us to leverage the fast convergence of conjugate gradient method. We further accelerate this with noise injection and momentum, dubbed PnP-CM, and show it maintains the convergence properties of the baseline PnP-ADMM. We evaluate our approach on a variety of inverse problems, including inpainting, super-resolution, Gaussian deblurring, and magnetic resonance imaging (MRI) reconstruction. To the best of our knowledge, this is the first CM trained for MRI datasets. Our results show that PnP-CM achieves high-quality reconstructions in as few as 4 NFEs, and can produce meaningful results in 2 steps, highlighting its effectiveness in real-world inverse problems while outperforming comparable CM-based approaches.
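The PnP-ADMM structure described in the abstract (denoiser as proximal operator of the prior, conjugate gradient for the data-fidelity subproblem) can be sketched as below. This is a minimal illustration, not the paper's implementation: a Gaussian blur stands in for the trained consistency model, the forward operator is a toy inpainting mask, and all names and parameters (`rho`, the iteration counts) are illustrative.

```python
import numpy as np
from scipy.sparse.linalg import LinearOperator, cg
from scipy.ndimage import gaussian_filter

def pnp_admm(y, mask, denoise, rho=1.0, iters=4):
    """PnP-ADMM sketch for inpainting y = mask * x (illustrative)."""
    n = y.size
    x, z, u = y.copy(), y.copy(), np.zeros_like(y)
    # x-subproblem: (A^T A + rho I) x = A^T y + rho (z - u), solved by CG.
    # For a 0/1 inpainting mask, A^T A is diagonal.
    A_op = LinearOperator((n, n), matvec=lambda v: mask.ravel() * v + rho * v,
                          dtype=float)
    for _ in range(iters):
        rhs = (mask * y + rho * (z - u)).ravel()
        x, _ = cg(A_op, rhs, x0=x.ravel())
        x = x.reshape(y.shape)
        z = denoise(x + u)   # prior step: denoiser acts as proximal operator
        u = u + x - z        # dual (scaled Lagrange multiplier) update
    return z

rng = np.random.default_rng(0)
truth = gaussian_filter(rng.standard_normal((32, 32)), 3.0)  # smooth toy image
mask = (rng.random((32, 32)) < 0.5).astype(float)            # keep ~50% pixels
y = mask * truth
recon = pnp_admm(y, mask, denoise=lambda v: gaussian_filter(v, 1.0))
print(recon.shape)  # (32, 32)
```

The paper's accelerated variant (PnP-CM) additionally injects noise before the denoising step and adds momentum to the updates; the skeleton above only shows the baseline alternation.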
Related papers
- Plug-and-Play Diffusion Meets ADMM: Dual-Variable Coupling for Robust Medical Image Reconstruction [45.25461515976432]
Plug-and-Play diffusion prior (DP) frameworks have emerged as a powerful paradigm for imaging reconstruction. We present a novel approach to resolving the bias-hallucination trade-off, achieving state-of-the-art results with significantly accelerated convergence.
arXiv Detail & Related papers (2026-02-26T16:58:43Z) - Diffusion Models for Solving Inverse Problems via Posterior Sampling with Piecewise Guidance [52.705112811734566]
A novel diffusion-based framework is introduced for solving inverse problems using a piecewise guidance scheme. The proposed method is problem-agnostic and readily adaptable to a variety of inverse problems. The framework achieves a reduction in inference time of 25% for inpainting with both random and center masks, and 23% and 24% for 4× and 8× super-resolution tasks.
arXiv Detail & Related papers (2025-07-22T19:35:14Z) - Solving Inverse Problems via Diffusion-Based Priors: An Approximation-Free Ensemble Sampling Approach [19.860268382547357]
Current DM-based posterior sampling methods rely on approximations to the generative process. We propose an ensemble-based algorithm that performs posterior sampling without the use of approximations. Our algorithm is motivated by existing works that combine DM-based methods with the sequential Monte Carlo method.
arXiv Detail & Related papers (2025-06-04T14:09:25Z) - Integrating Intermediate Layer Optimization and Projected Gradient Descent for Solving Inverse Problems with Diffusion Models [19.445391508424667]
Inverse problems (IPs) involve reconstructing signals from noisy observations. DMs have emerged as a powerful framework for solving IPs, achieving remarkable reconstruction performance. Existing DM-based methods frequently encounter issues such as heavy computational demands and suboptimal convergence. We propose two novel methods, DMILO and DMILO-PGD, to address these challenges.
arXiv Detail & Related papers (2025-05-27T06:49:02Z) - Guided Diffusion Sampling on Function Spaces with Applications to PDEs [111.87523128566781]
We propose a general framework for conditional sampling in PDE-based inverse problems. This is accomplished by a function-space diffusion model and plug-and-play guidance for conditioning. Our method achieves an average 32% accuracy improvement over state-of-the-art fixed-resolution diffusion baselines.
arXiv Detail & Related papers (2025-05-22T17:58:12Z) - Think Twice Before You Act: Improving Inverse Problem Solving With MCMC [40.5682961122897]
We propose Diffusion Posterior MCMC (DPMC) to solve inverse problems with pretrained diffusion models.
Our algorithm outperforms DPS with fewer evaluations across nearly all tasks, and is competitive among existing approaches.
arXiv Detail & Related papers (2024-09-13T06:10:54Z) - Amortized Posterior Sampling with Diffusion Prior Distillation [55.03585818289934]
Amortized Posterior Sampling is a novel variational inference approach for efficient posterior sampling in inverse problems. Our method trains a conditional flow model to minimize the divergence between the variational distribution and the posterior distribution implicitly defined by the diffusion model. Unlike existing methods, our approach is unsupervised, requires no paired training data, and is applicable to both Euclidean and non-Euclidean domains.
arXiv Detail & Related papers (2024-07-25T09:53:12Z) - CoSIGN: Few-Step Guidance of ConSIstency Model to Solve General INverse Problems [3.3969056208620128]
We propose to push the boundary of inference steps to 1-2 NFEs while still maintaining high reconstruction quality.
Our method achieves new state-of-the-art in diffusion-based inverse problem solving.
arXiv Detail & Related papers (2024-07-17T15:57:50Z) - BlindDiff: Empowering Degradation Modelling in Diffusion Models for Blind Image Super-Resolution [52.47005445345593]
BlindDiff is a DM-based blind SR method to tackle the blind degradation settings in SISR.
BlindDiff seamlessly integrates the MAP-based optimization into DMs.
Experiments on both synthetic and real-world datasets show that BlindDiff achieves the state-of-the-art performance.
arXiv Detail & Related papers (2024-03-15T11:21:34Z) - Ambient Diffusion Posterior Sampling: Solving Inverse Problems with Diffusion Models Trained on Corrupted Data [54.09959775518994]
We provide a framework for solving inverse problems with diffusion models learned from linearly corrupted data. We train diffusion models for MRI with access only to subsampled multi-coil measurements at acceleration factors R = 2, 4, 6, 8. For MRI reconstruction in high acceleration regimes, we observe that A-DPS models trained on subsampled data are better suited to solving inverse problems than models trained on fully sampled data.
arXiv Detail & Related papers (2024-03-13T17:28:20Z) - vSHARP: variable Splitting Half-quadratic Admm algorithm for Reconstruction of inverse-Problems [7.043932618116216]
vSHARP (variable Splitting Half-quadratic ADMM algorithm for Reconstruction of inverse Problems) is a novel Deep Learning (DL)-based method for solving ill-posed inverse problems arising in Medical Imaging (MI).
For data consistency, vSHARP unrolls a differentiable gradient descent process in the image domain, while a DL-based denoiser, such as a U-Net architecture, is applied to enhance image quality.
Our comparative analysis with state-of-the-art methods demonstrates the superior performance of vSHARP in these applications.
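One iteration of an unrolled scheme of the kind vSHARP describes (a few gradient-descent steps on the data-consistency term, followed by a learned refinement) can be sketched as follows. This is an illustrative skeleton only: a Gaussian smoother stands in for the trained U-Net denoiser, the forward operator is a toy undersampled-Fourier model, and the step size and block count are arbitrary choices, not the paper's.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def unrolled_block(x, y, forward, adjoint, denoise, step=0.5, gd_steps=3):
    # Data consistency: gradient descent on (1/2)||A x - y||^2,
    # whose gradient is A^T (A x - y).
    for _ in range(gd_steps):
        x = x - step * adjoint(forward(x) - y)
    # Refinement: a denoiser stands in for the learned U-Net.
    return denoise(x)

mask = np.zeros((16, 16))
mask[::2] = 1.0                                       # toy k-space undersampling
forward = lambda x: mask * np.fft.fft2(x, norm="ortho")
adjoint = lambda k: np.fft.ifft2(mask * k, norm="ortho").real
truth = gaussian_filter(np.random.default_rng(1).standard_normal((16, 16)), 2.0)
y = forward(truth)
x = adjoint(y)                                        # zero-filled initialization
for _ in range(4):                                    # cascade of 4 unrolled blocks
    x = unrolled_block(x, y, forward, adjoint,
                       denoise=lambda v: gaussian_filter(v, 0.8))
print(x.shape)  # (16, 16)
```

In the actual method the blocks are trained end to end, so the denoiser weights (and typically the step sizes) are learned rather than fixed as here.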
arXiv Detail & Related papers (2023-09-18T17:26:22Z) - Moreau Envelope ADMM for Decentralized Weakly Convex Optimization [55.2289666758254]
This paper proposes a proximal variant of the alternating direction method of multipliers (ADMM) for distributed optimization.
The results of our numerical experiments indicate that our method is faster and more robust than widely-used approaches.
arXiv Detail & Related papers (2023-08-31T14:16:30Z) - Diffusion Posterior Sampling for General Noisy Inverse Problems [50.873313752797124]
We extend diffusion solvers to handle noisy (non)linear inverse problems via approximation of the posterior sampling.
Our method demonstrates that diffusion models can incorporate various measurement noise statistics.
arXiv Detail & Related papers (2022-09-29T11:12:27Z)
This list is automatically generated from the titles and abstracts of the papers in this site.