An Interpretation of Regularization by Denoising and its Application
with the Back-Projected Fidelity Term
- URL: http://arxiv.org/abs/2101.11599v1
- Date: Wed, 27 Jan 2021 18:45:35 GMT
- Title: An Interpretation of Regularization by Denoising and its Application
with the Back-Projected Fidelity Term
- Authors: Einav Yogev-Ofer, Tom Tirer, Raja Giryes
- Abstract summary: We show that the RED gradient can be seen as a (sub)gradient of a prior function--but taken at a denoised version of the point.
We propose to combine RED with the Back-Projection (BP) fidelity term rather than the common Least Squares (LS) term that is used in previous works.
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: The vast majority of image recovery tasks are ill-posed problems. As such,
methods that are based on optimization use cost functions that consist of both
fidelity and prior (regularization) terms. A recent line of works imposes the
prior by the Regularization by Denoising (RED) approach, which exploits the
good performance of existing image denoising engines. Yet, the relation of RED
to explicit prior terms is still not well understood, as previous works require
overly strong assumptions on the denoisers. In this paper, we make two
contributions. First, we show that the RED gradient can be seen as a
(sub)gradient of a prior function--but taken at a denoised version of the
point. As RED is typically applied with a relatively small noise level, this
interpretation indicates a similarity between RED and traditional gradients.
This leads to our second contribution: We propose to combine RED with the
Back-Projection (BP) fidelity term rather than the common Least Squares (LS)
term that is used in previous works. We show that the advantages of BP over LS
for image deblurring and super-resolution, which have been demonstrated for
traditional gradients, carry over to the RED approach.
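As an illustrative sketch of the ideas above (not the authors' code), a single RED gradient step with either the LS or the BP fidelity term can be written as follows; the toy `denoiser`, `lam`, and `step` values are placeholder assumptions:

```python
import numpy as np

def red_gradient_step(x, y, H, denoiser, lam=0.1, step=0.1, fidelity="bp"):
    """One gradient step for RED with an LS or BP fidelity term.

    Illustrative sketch only: `denoiser`, `lam`, and `step` are
    placeholder choices, not the paper's settings.
    """
    residual = H @ x - y
    if fidelity == "ls":
        # Least-squares fidelity gradient: H^T (Hx - y)
        grad_fid = H.T @ residual
    else:
        # Back-projection fidelity gradient: H^+ (Hx - y),
        # where H^+ is the Moore-Penrose pseudoinverse of H
        grad_fid = np.linalg.pinv(H) @ residual
    # RED prior (sub)gradient: lam * (x - D(x)), i.e. the prior gradient
    # is formed from the residual of the denoiser D at the current point
    grad_prior = lam * (x - denoiser(x))
    return x - step * (grad_fid + grad_prior)
```

In practice H would be a blur or downsampling operator and D a learned denoiser; here any simple contractive map can stand in for D to exercise the iteration.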
Related papers
- Latent Multi-Relation Reasoning for GAN-Prior based Image Super-Resolution [61.65012981435095]
LAREN is a graph-based disentanglement framework that constructs a superior disentangled latent space via hierarchical multi-relation reasoning.
We show that LAREN achieves superior large-factor image SR and outperforms the state-of-the-art consistently across multiple benchmarks.
arXiv Detail & Related papers (2022-08-04T19:45:21Z)
- Online Deep Equilibrium Learning for Regularization by Denoising [20.331171081002957]
Plug-and-Play Priors (PnP) and Regularization by Denoising (RED) are widely-used frameworks for solving inverse imaging problems by computing fixed-points.
We propose ODER as a new strategy for improving the efficiency of DEQ/RED with respect to the total number of measurements.
Our numerical results suggest the potential improvements in training/testing complexity due to ODER on three distinct imaging applications.
arXiv Detail & Related papers (2022-05-25T21:06:22Z)
- Monotonically Convergent Regularization by Denoising [19.631197002314305]
Regularization by denoising (RED) is a widely-used framework for solving inverse problems by leveraging image denoisers as image priors.
Recent work has reported the state-of-the-art performance of RED in a number of imaging applications using pre-trained deep neural nets as denoisers, but existing convergence guarantees require the denoiser to be nonexpansive.
This work addresses this issue by developing a new monotone RED (MRED) algorithm, whose convergence does not require nonexpansiveness of the deep denoising prior.
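As a toy illustration of the RED iteration whose convergence is at stake here, consider RED applied to a trivial quadratic fidelity with a linear contraction standing in for the denoiser; the function name and all constants below are hypothetical choices, not from the paper:

```python
import numpy as np

def red_iterate(y, denoiser, lam=0.5, step=0.2, iters=500):
    """Toy RED gradient iteration for the fidelity f(x) = 1/2 ||x - y||^2:
        x <- x - step * ((x - y) + lam * (x - D(x)))
    A hypothetical sketch; lam and step are illustrative values only.
    """
    x = np.array(y, dtype=float)
    for _ in range(iters):
        # fidelity gradient (x - y) plus RED prior residual lam * (x - D(x))
        x = x - step * ((x - y) + lam * (x - denoiser(x)))
    return x
```

With the nonexpansive linear denoiser D(x) = 0.8x, the fixed point solves (x - y) + 0.5 * 0.2 * x = 0, i.e. x = y / 1.1, and the iteration converges to it geometrically.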
arXiv Detail & Related papers (2022-02-10T11:32:41Z)
- Unsupervised PET Reconstruction from a Bayesian Perspective [12.512270202705404]
DeepRED is a representative method that combines deep image prior (DIP) and regularization by denoising (RED).
In this article, we leverage DeepRED from a Bayesian perspective to reconstruct PET images from a single corrupted sinogram without any supervised or auxiliary information.
arXiv Detail & Related papers (2021-10-29T06:32:21Z)
- Gaussian MRF Covariance Modeling for Efficient Black-Box Adversarial Attacks [86.88061841975482]
We study the problem of generating adversarial examples in a black-box setting, where we only have access to a zeroth order oracle.
We use this setting to find fast one-step adversarial attacks, akin to a black-box version of the Fast Gradient Sign Method (FGSM).
We show that the method uses fewer queries and achieves higher attack success rates than the current state of the art.
arXiv Detail & Related papers (2020-10-08T18:36:51Z)
- A Contrastive Learning Approach for Training Variational Autoencoder Priors [137.62674958536712]
Variational autoencoders (VAEs) are one of the powerful likelihood-based generative models with applications in many domains.
One explanation for VAEs' poor generative quality is the prior hole problem: the prior distribution fails to match the aggregate approximate posterior.
We propose an energy-based prior defined by the product of a base prior distribution and a reweighting factor, designed to bring the base closer to the aggregate posterior.
arXiv Detail & Related papers (2020-10-06T17:59:02Z)
- Async-RED: A Provably Convergent Asynchronous Block Parallel Stochastic Method using Deep Denoising Priors [31.773305606551197]
Regularization by denoising (RED) is a recently developed framework for solving inverse problems by integrating advanced denoisers as image priors.
We propose a new asynchronous RED (ASYNC-RED) algorithm that enables asynchronous parallel processing of data, making it significantly faster than its serial counterparts for large-scale inverse problems.
arXiv Detail & Related papers (2020-08-25T03:30:53Z)
- Deep Variational Network Toward Blind Image Restoration [60.45350399661175]
Blind image restoration is a common yet challenging problem in computer vision.
We propose a novel blind image restoration method that aims to integrate the advantages of both model-based and learning-based approaches.
Experiments on two typical blind IR tasks, namely image denoising and super-resolution, demonstrate that the proposed method achieves superior performance over current state-of-the-art methods.
arXiv Detail & Related papers (2020-08-01T09:35:22Z)
- Regularization by Denoising via Fixed-Point Projection (RED-PRO) [34.89374374708481]
Regularization by Denoising (RED) and Plug-and-Play Prior (PnP) are widely used in image processing.
While both have shown state-of-the-art results in various recovery tasks, their theoretical justification is incomplete.
arXiv Detail & Related papers (2020-05-16T08:17:44Z)
- The Power of Triply Complementary Priors for Image Compressive Sensing [89.14144796591685]
We propose a joint low-rank deep (LRD) image model, which contains a pair of triply complementary priors.
We then propose a novel hybrid plug-and-play framework based on the LRD model for image CS.
To make the optimization tractable, a simple yet effective algorithm is proposed to solve the resulting image CS problem.
arXiv Detail & Related papers (2020-05-16T08:17:44Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.