Preconditioned Plug-and-Play ADMM with Locally Adjustable Denoiser for
Image Restoration
- URL: http://arxiv.org/abs/2110.00493v1
- Date: Fri, 1 Oct 2021 15:46:35 GMT
- Title: Preconditioned Plug-and-Play ADMM with Locally Adjustable Denoiser for
Image Restoration
- Authors: Mikael Le Pendu and Christine Guillemot
- Abstract summary: We extend the concept of plug-and-play optimization to use denoisers that can be parameterized for non-constant noise variance.
We show that our pixel-wise adjustable denoiser, along with a suitable preconditioning strategy, can further improve the plug-and-play ADMM approach for several applications.
- Score: 54.23646128082018
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Plug-and-Play optimization recently emerged as a powerful technique for
solving inverse problems by plugging a denoiser into a classical optimization
algorithm. The denoiser accounts for the regularization and therefore
implicitly determines the prior knowledge on the data, hence replacing typical
handcrafted priors. In this paper, we extend the concept of plug-and-play
optimization to use denoisers that can be parameterized for non-constant noise
variance. To that end, we introduce a preconditioning of the ADMM algorithm,
which mathematically justifies the use of such an adjustable denoiser. We
additionally propose a procedure for training a convolutional neural network
for high quality non-blind image denoising that also allows for pixel-wise
control of the noise standard deviation. We show that our pixel-wise adjustable
denoiser, along with a suitable preconditioning strategy, can further improve
the plug-and-play ADMM approach for several applications, including image
completion, interpolation, demosaicing and Poisson denoising.
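As a rough sketch of how such a preconditioning interacts with a pixel-wise adjustable denoiser, the Python snippet below assumes a diagonal degradation operator (a binary mask, as in image completion or interpolation) and a generic denoiser(v, sigma_map) callable that accepts a per-pixel noise standard deviation map; the variable names and the particular choice of preconditioner are illustrative and not necessarily the authors' exact scheme.

```python
import numpy as np

def pnp_admm_precond(y, mask, denoiser, sigma_map, lam=1.0, n_iter=50):
    """Sketch of preconditioned Plug-and-Play ADMM for a masking operator
    A = diag(mask), e.g. image completion or interpolation."""
    # Diagonal (per-pixel) penalty acting as the preconditioner.
    p = lam / np.maximum(sigma_map ** 2, 1e-8)

    x, z, u = y.copy(), y.copy(), np.zeros_like(y)
    for _ in range(n_iter):
        # x-update: closed form since both the data term and the penalty are diagonal.
        x = (mask * y + p * (z - u)) / (mask + p)
        # z-update: plug-and-play denoising with a pixel-wise noise-std map
        # (sigma_map equals sqrt(lam / p) by construction).
        z = denoiser(x + u, sigma_map)
        # Scaled dual update.
        u = u + x - z
    return x
```

With a scalar penalty in place of the per-pixel map p, this reduces to standard PnP-ADMM and the denoiser is called with a single constant noise level; the diagonal penalty is what motivates feeding the denoiser a spatially varying standard deviation map.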
Related papers
- Beyond Image Prior: Embedding Noise Prior into Conditional Denoising Transformer [17.430622649002427]
Existing learning-based denoising methods typically train models to generalize the image prior from large-scale datasets.
We propose a new perspective on the denoising challenge by highlighting the distinct separation between noise and image priors.
We introduce a Locally Noise Prior Estimation algorithm, which accurately estimates the noise prior directly from a single raw noisy image.
arXiv Detail & Related papers (2024-07-12T08:43:11Z)
- Score Priors Guided Deep Variational Inference for Unsupervised Real-World Single Image Denoising [14.486289176696438]
We propose a score priors-guided deep variational inference, namely ScoreDVI, for practical real-world denoising.
We exploit a non-i.i.d. Gaussian mixture model and a variational noise posterior to model the real-world noise.
Our method outperforms other single image-based real-world denoising methods and achieves comparable performance to dataset-based unsupervised methods.
arXiv Detail & Related papers (2023-08-09T03:26:58Z)
- Advancing Unsupervised Low-light Image Enhancement: Noise Estimation, Illumination Interpolation, and Self-Regulation [55.07472635587852]
Low-Light Image Enhancement (LLIE) techniques have made notable advancements in preserving image details and enhancing contrast.
These approaches encounter persistent challenges in efficiently mitigating dynamic noise and accommodating diverse low-light scenarios.
We first propose a method for estimating the noise level in low light images in a quick and accurate way.
We then devise a Learnable Illumination Interpolator (LII) to satisfy general constraints between illumination and input.
arXiv Detail & Related papers (2023-05-17T13:56:48Z)
- Learning Sparsity-Promoting Regularizers using Bilevel Optimization [9.18465987536469]
We present a method for supervised learning of sparsity-promoting regularizers for denoising signals and images.
Experiments with structured 1D signals and natural images show that the proposed method can learn an operator that outperforms well-known regularizers.
arXiv Detail & Related papers (2022-07-18T20:50:02Z)
- Learned Gradient of a Regularizer for Plug-and-Play Gradient Descent [37.41458921829744]
The Plug-and-Play framework allows integrating advanced image denoising priors into optimization algorithms.
Plug-and-Play (PnP) and Regularization by Denoising (RED) algorithms are two examples of methods that made a breakthrough in image restoration.
We show that it is possible to train a denoiser along with a network that corresponds to the gradient of its regularizer.
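To make the gradient-of-a-regularizer idea concrete, here is a hedged toy sketch (not the paper's actual architecture or training procedure): Plug-and-Play gradient descent where the regularizer gradient comes from a small untrained network and is obtained by automatic differentiation of a scalar surrogate, whereas the paper trains a dedicated network that outputs this gradient jointly with the denoiser.

```python
import torch

# Illustrative stand-in for a learned regularizer g (untrained, toy-sized).
g = torch.nn.Sequential(
    torch.nn.Conv2d(1, 16, 3, padding=1), torch.nn.ELU(),
    torch.nn.Conv2d(16, 1, 3, padding=1),
)

def grad_g(x):
    """Gradient of the scalar regularizer sum(g(x)) w.r.t. an N x 1 x H x W image x."""
    x = x.detach().requires_grad_(True)
    (gx,) = torch.autograd.grad(g(x).sum(), x)
    return gx

def pnp_gradient_descent(y, A, At, step=0.1, lam=0.05, n_iter=100):
    """Gradient descent on 0.5 * ||A x - y||^2 + lam * g(x); only the gradient
    of g is needed, which is what the learned network replaces.
    A and At are the forward operator and its adjoint (plain callables)."""
    x = At(y)
    for _ in range(n_iter):
        x = x - step * (At(A(x) - y) + lam * grad_g(x))
    return x
```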
arXiv Detail & Related papers (2022-04-29T08:33:33Z)
- On Measuring and Controlling the Spectral Bias of the Deep Image Prior [63.88575598930554]
The deep image prior has demonstrated that untrained networks can address inverse imaging problems remarkably well.
It requires an oracle to determine when to stop the optimization as the performance degrades after reaching a peak.
We study the deep image prior from a spectral bias perspective to address these problems.
arXiv Detail & Related papers (2021-07-02T15:10:42Z)
- Plug-And-Play Learned Gaussian-mixture Approximate Message Passing [71.74028918819046]
We propose a plug-and-play compressed sensing (CS) recovery algorithm suitable for any i.i.d. source prior.
Our algorithm builds upon Borgerding's learned AMP (LAMP), yet significantly improves it by adopting a universal denoising function within the algorithm.
Numerical evaluation shows that the L-GM-AMP algorithm achieves state-of-the-art performance without any knowledge of the source prior.
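As a generic illustration of plugging a denoiser into an AMP loop (a D-AMP-style sketch with an assumed denoiser(r, sigma) interface, not the learned L-GM-AMP algorithm itself):

```python
import numpy as np

def amp_with_denoiser(y, A, denoiser, n_iter=20, seed=0):
    """AMP-style compressed-sensing recovery of x from y = A @ x + noise,
    using an arbitrary plug-in denoiser; illustrative sketch only."""
    rng = np.random.default_rng(seed)
    M, N = A.shape
    x = np.zeros(N)
    z = y.astype(float).copy()
    for _ in range(n_iter):
        sigma = np.linalg.norm(z) / np.sqrt(M)   # effective noise level estimate
        r = x + A.T @ z                          # pseudo-data handed to the denoiser
        x_new = denoiser(r, sigma)
        # Monte Carlo estimate of the denoiser divergence (Onsager correction).
        eps = sigma / 100 + 1e-12
        probe = rng.standard_normal(N)
        div = probe @ (denoiser(r + eps * probe, sigma) - x_new) / eps
        z = y - A @ x_new + (z / M) * div        # corrected residual
        x = x_new
    return x
```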
arXiv Detail & Related papers (2020-11-18T16:40:45Z)
- TFPnP: Tuning-free Plug-and-Play Proximal Algorithm with Applications to Inverse Imaging Problems [22.239477171296056]
Plug-and-Play (PnP) is a non-convex optimization framework that combines proximal algorithms, such as ADMM, with advanced denoising priors.
We discuss several practical considerations for different denoisers, which together with our learned strategies achieve state-of-the-art results.
arXiv Detail & Related papers (2020-11-18T14:19:30Z)
- A Flexible Framework for Designing Trainable Priors with Adaptive Smoothing and Game Encoding [57.1077544780653]
We introduce a general framework for designing and training neural network layers whose forward passes can be interpreted as solving non-smooth convex optimization problems.
We focus on convex games, solved by local agents represented by the nodes of a graph and interacting through regularization functions.
This approach is appealing for solving imaging problems, as it allows the use of classical image priors within deep models that are trainable end to end.
arXiv Detail & Related papers (2020-06-26T08:34:54Z)
- Fully Unsupervised Diversity Denoising with Convolutional Variational Autoencoders [81.30960319178725]
We propose DivNoising, a denoising approach based on fully convolutional variational autoencoders (VAEs).
First we introduce a principled way of formulating the unsupervised denoising problem within the VAE framework by explicitly incorporating imaging noise models into the decoder.
We show that such a noise model can either be measured, bootstrapped from noisy data, or co-learned during training.
arXiv Detail & Related papers (2020-06-10T21:28:13Z)
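A minimal, hypothetical PyTorch sketch of the DivNoising idea summarized above (toy layer sizes and a fixed Gaussian noise model chosen purely for illustration; the method itself also supports measured, bootstrapped, or co-learned noise models):

```python
import torch
import torch.nn as nn

class TinyDivNoisingVAE(nn.Module):
    """Fully convolutional VAE whose decoder predicts the clean signal, while an
    explicit imaging noise model defines the likelihood of the noisy observation."""

    def __init__(self, noise_sigma=0.1):
        super().__init__()
        self.noise_sigma = noise_sigma
        self.enc = nn.Conv2d(1, 2, 3, padding=1)   # outputs (mu, logvar) of the latent
        self.dec = nn.Conv2d(1, 1, 3, padding=1)   # maps latent to a clean-signal estimate

    def forward(self, x_noisy):
        mu, logvar = self.enc(x_noisy).chunk(2, dim=1)
        z = mu + torch.randn_like(mu) * torch.exp(0.5 * logvar)   # reparameterization
        s = self.dec(z)                                            # predicted clean signal
        # Gaussian noise-model negative log-likelihood of the noisy pixels given s.
        nll = 0.5 * ((x_noisy - s) / self.noise_sigma) ** 2
        kl = -0.5 * (1 + logvar - mu ** 2 - logvar.exp())
        return s, nll.mean() + kl.mean()
```

Sampling the latent several times for the same noisy input yields diverse plausible restorations, which is where the diversity denoising of the title comes from.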