Integrating Reweighted Least Squares with Plug-and-Play Diffusion Priors for Noisy Image Restoration
- URL: http://arxiv.org/abs/2511.06823v1
- Date: Mon, 10 Nov 2025 08:11:20 GMT
- Title: Integrating Reweighted Least Squares with Plug-and-Play Diffusion Priors for Noisy Image Restoration
- Authors: Ji Li, Chao Wang
- Abstract summary: We propose a plug-and-play image restoration framework based on generative diffusion priors for robust removal of general noise types, including impulse noise. Experimental results on benchmark datasets demonstrate that the proposed method effectively removes non-Gaussian impulse noise and achieves superior restoration performance.
- Score: 6.402777145722335
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Existing plug-and-play image restoration methods typically employ off-the-shelf Gaussian denoisers as proximal operators within classical optimization frameworks based on variable splitting. Recently, denoisers induced by generative priors have been successfully integrated into regularized optimization methods for image restoration under Gaussian noise. However, their application to non-Gaussian noise--such as impulse noise--remains largely unexplored. In this paper, we propose a plug-and-play image restoration framework based on generative diffusion priors for robust removal of general noise types, including impulse noise. Within the maximum a posteriori (MAP) estimation framework, the data fidelity term is adapted to the specific noise model. Departing from the conventional least-squares loss used for Gaussian noise, we introduce a generalized Gaussian scale mixture-based loss, which approximates a wide range of noise distributions and leads to an $\ell_q$-norm ($0<q\leq2$) fidelity term. This optimization problem is addressed using an iteratively reweighted least squares (IRLS) approach, wherein the proximal step involving the generative prior is efficiently performed via a diffusion-based denoiser. Experimental results on benchmark datasets demonstrate that the proposed method effectively removes non-Gaussian impulse noise and achieves superior restoration performance.
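The abstract's core loop — an ℓq-norm fidelity term handled by iteratively reweighted least squares, with the prior applied as a proximal/denoising step — can be sketched as follows. This is a minimal illustration, not the paper's implementation: the forward operator is identity (pure denoising), and a simple moving-average smoother stands in for the diffusion-based denoiser; the function name `irls_restore` and all parameter values are hypothetical.

```python
import numpy as np

def irls_restore(y, q=1.0, n_iters=30, lam=0.5, eps=1e-4):
    """Sketch of IRLS for an l_q fidelity term (0 < q <= 2).

    The l_q loss sum_i |x_i - y_i|^q is majorized by a weighted
    least-squares surrogate with weights w_i = max(|x_i - y_i|, eps)^(q-2),
    so each iteration solves a quadratic data step and then applies a
    denoiser as the proximal step for the prior.
    """
    x = y.copy()
    for _ in range(n_iters):
        r = x - y
        w = np.maximum(np.abs(r), eps) ** (q - 2.0)  # IRLS reweighting
        # Weighted least-squares data step (identity forward operator):
        # argmin_z w*(z - y)^2 + lam*(z - x)^2 has this closed form.
        z = (w * y + lam * x) / (w + lam)
        # Proximal/denoising step: a toy smoother standing in for the
        # diffusion-prior denoiser used in the paper.
        x = np.convolve(z, np.ones(5) / 5, mode="same")
    return x

# Toy experiment: impulse (salt-and-pepper-like) noise on a 1D signal.
rng = np.random.default_rng(0)
clean = np.sin(np.linspace(0, 4 * np.pi, 200))
noisy = clean.copy()
idx = rng.choice(200, size=20, replace=False)
noisy[idx] = rng.choice([-2.0, 2.0], size=20)
restored = irls_restore(noisy, q=1.0)
```

With q=1 the weights shrink toward zero at large residuals, so the impulse-corrupted samples are effectively ignored by the data step and filled in by the denoiser — the mechanism that makes the ℓq fidelity robust to non-Gaussian noise.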
Related papers
- DenoiseGS: Gaussian Reconstruction Model for Burst Denoising [29.507111067242118]
We propose DenoiseGS, the first framework to leverage the efficiency of 3D Gaussian Splatting for burst denoising. Our approach addresses two key challenges in applying a feedforward Gaussian reconstruction model to noisy inputs. Experiments demonstrate that DenoiseGS significantly exceeds state-of-the-art NeRF-based methods on both burst denoising and novel view synthesis.
arXiv Detail & Related papers (2025-11-28T07:29:54Z) - Image Restoration via Primal Dual Hybrid Gradient and Flow Generative Model [6.402777145722335]
Regularized optimization has been a classical approach to solving imaging inverse problems, where the regularization term enforces desirable properties of the unknown image. Recently, the integration of flow-matching generative priors into image restoration has garnered significant attention, owing to their powerful prior-modeling capabilities. In this work, we incorporate such generative priors into a Plug-and-Play framework based on variable splitting, where the proximal operator associated with the regularizer is replaced by a time-dependent denoiser derived from the generative model.
arXiv Detail & Related papers (2025-11-10T06:26:36Z) - Mitigating the Noise Shift for Denoising Generative Models via Noise Awareness Guidance [54.88271057438763]
Noise Awareness Guidance (NAG) is a correction method that explicitly steers sampling trajectories to remain consistent with the pre-defined noise schedule. NAG consistently mitigates noise shift and substantially improves the generation quality of mainstream diffusion models.
arXiv Detail & Related papers (2025-10-14T13:31:34Z) - Score-Based Turbo Message Passing for Plug-and-Play Compressive Image Recovery [24.60447255507278]
Off-the-shelf image denoisers mostly rely on generic or hand-crafted priors for denoising. We devise a message passing framework that integrates a score-based minimum mean squared error (MMSE) denoiser for compressive image recovery.
arXiv Detail & Related papers (2025-03-28T04:30:58Z) - Arbitrary-steps Image Super-resolution via Diffusion Inversion [68.78628844966019]
This study presents a new image super-resolution (SR) technique based on diffusion inversion, aiming at harnessing the rich image priors encapsulated in large pre-trained diffusion models to improve SR performance. We design a Partial noise Prediction strategy to construct an intermediate state of the diffusion model, which serves as the starting sampling point. Once trained, this noise predictor can be used to initialize the sampling process partially along the diffusion trajectory, generating the desirable high-resolution result.
arXiv Detail & Related papers (2024-12-12T07:24:13Z) - Beyond Image Prior: Embedding Noise Prior into Conditional Denoising Transformer [17.430622649002427]
Existing learning-based denoising methods typically train models to generalize the image prior from large-scale datasets. We propose a new perspective on the denoising challenge by highlighting the distinct separation between noise and image priors. We introduce a Locally Noise Prior Estimation algorithm, which accurately estimates the noise prior directly from a single raw noisy image.
arXiv Detail & Related papers (2024-07-12T08:43:11Z) - Diffusion Gaussian Mixture Audio Denoise [23.760755498636943]
We propose a DiffGMM model, a denoising model based on the diffusion and Gaussian mixture models.
Given a noisy audio signal, we first apply a 1D-U-Net to extract features and train linear layers to estimate parameters for the Gaussian mixture model.
The estimated noise is continuously subtracted from the noisy signal to output clean audio signals.
arXiv Detail & Related papers (2024-06-13T14:18:10Z) - DiffusionAD: Norm-guided One-step Denoising Diffusion for Anomaly Detection [80.20339155618612]
DiffusionAD is a novel anomaly detection pipeline comprising a reconstruction sub-network and a segmentation sub-network. A rapid one-step denoising paradigm achieves hundreds of times acceleration while preserving comparable reconstruction quality. Considering the diversity in the manifestation of anomalies, we propose a norm-guided paradigm to integrate the benefits of multiple noise scales.
arXiv Detail & Related papers (2023-03-15T16:14:06Z) - Diffusion Posterior Sampling for General Noisy Inverse Problems [50.873313752797124]
We extend diffusion solvers to handle noisy (non)linear inverse problems via approximation of the posterior sampling.
Our method demonstrates that diffusion models can incorporate various measurement noise statistics.
arXiv Detail & Related papers (2022-09-29T11:12:27Z) - Plug-And-Play Learned Gaussian-mixture Approximate Message Passing [71.74028918819046]
We propose a plug-and-play compressed sensing (CS) recovery algorithm suitable for any i.i.d. source prior.
Our algorithm builds upon Borgerding's learned AMP (LAMP), yet significantly improves it by adopting a universal denoising function within the algorithm.
Numerical evaluation shows that the L-GM-AMP algorithm achieves state-of-the-art performance without any knowledge of the source prior.
arXiv Detail & Related papers (2020-11-18T16:40:45Z) - Shape Matters: Understanding the Implicit Bias of the Noise Covariance [76.54300276636982]
Noise in gradient descent provides a crucial implicit regularization effect for training over-parameterized models.
We show that parameter-dependent noise -- induced by mini-batches or label perturbation -- is far more effective than Gaussian noise.
Our analysis reveals that parameter-dependent noise introduces a bias towards local minima with smaller noise variance, whereas spherical Gaussian noise does not.
arXiv Detail & Related papers (2020-06-15T18:31:02Z)
This list is automatically generated from the titles and abstracts of the papers in this site.