Supervised Learning of Sparsity-Promoting Regularizers for Denoising
- URL: http://arxiv.org/abs/2006.05521v1
- Date: Tue, 9 Jun 2020 21:38:05 GMT
- Title: Supervised Learning of Sparsity-Promoting Regularizers for Denoising
- Authors: Michael T. McCann, Saiprasad Ravishankar
- Abstract summary: We present a method for supervised learning of sparsity-promoting regularizers for image denoising.
Our experiments show that the proposed method can learn an operator that outperforms well-known regularizers.
- Score: 13.203765985718205
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: We present a method for supervised learning of sparsity-promoting
regularizers for image denoising. Sparsity-promoting regularization is a key
ingredient in solving modern image reconstruction problems; however, the
operators underlying these regularizers are usually either designed by hand or
learned from data in an unsupervised way. The recent success of supervised
learning (mainly convolutional neural networks) in solving image reconstruction
problems suggests that it could be a fruitful approach to designing
regularizers. As a first experiment in this direction, we propose to denoise
images using a variational formulation with a parametric, sparsity-promoting
regularizer, where the parameters of the regularizer are learned to minimize
the mean squared error of reconstructions on a training set of (ground truth
image, measurement) pairs. Training involves solving a challenging bilevel
optimization problem; we derive an expression for the gradient of the training
loss using Karush-Kuhn-Tucker conditions and provide an accompanying gradient
descent algorithm to minimize it. Our experiments on a simple synthetic
denoising problem show that the proposed method can learn an operator that
outperforms well-known regularizers (total variation, DCT-sparsity, and
unsupervised dictionary learning) and collaborative filtering. While the
approach we present is specific to denoising, we believe that it can be adapted
to the whole class of inverse problems with linear measurement models, giving
it applicability to a wide range of image reconstruction problems.
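To make the learning setup concrete, the following is a minimal numerical sketch of the bilevel structure described above. It is an illustration under simplifying assumptions, not the authors' algorithm: the lower-level denoising problem uses a Huber-smoothed l1 penalty so that plain gradient descent applies, and the gradient of the training loss with respect to the operator W is approximated by finite differences, whereas the paper derives an exact expression from the Karush-Kuhn-Tucker conditions. All helper names and parameter values here are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

def denoise(W, y, lam=0.1, eps=0.05, steps=300):
    """Lower level: x*(W) = argmin_x 0.5*||x - y||^2 + lam * sum_i h_eps((W x)_i),
    where h_eps is the Huber smoothing of |.|; solved by gradient descent."""
    L = 1.0 + (lam / eps) * np.linalg.norm(W, 2) ** 2  # Lipschitz bound for the gradient
    x = y.copy()
    for _ in range(steps):
        dh = np.clip((W @ x) / eps, -1.0, 1.0)         # derivative of the Huber penalty
        x = x - (1.0 / L) * ((x - y) + lam * (W.T @ dh))
    return x

def train_loss(W, pairs, lam=0.1):
    """Upper level: squared reconstruction error over (ground truth, measurement) pairs."""
    return sum(np.sum((denoise(W, y, lam) - x_gt) ** 2) for x_gt, y in pairs)

# Tiny synthetic training set: signals that are sparse under a hidden operator W_true.
n = 8
W_true = rng.standard_normal((n, n))
pairs = []
for _ in range(5):
    z = rng.standard_normal(n) * (rng.random(n) < 0.3)  # sparse transform coefficients
    x_gt = np.linalg.solve(W_true, z)                   # so that W_true @ x_gt = z is sparse
    pairs.append((x_gt, x_gt + 0.05 * rng.standard_normal(n)))

# Outer loop: gradient descent on W; the training-loss gradient is approximated
# entrywise by finite differences (the paper derives it in closed form via KKT).
W = 0.1 * rng.standard_normal((n, n))
h = 1e-4
for it in range(20):
    base = train_loss(W, pairs)
    G = np.zeros_like(W)
    for i in range(n):
        for j in range(n):
            Wp = W.copy()
            Wp[i, j] += h
            G[i, j] = (train_loss(Wp, pairs) - base) / h
    W -= 1e-2 * G
    print(f"iter {it:2d}  training loss {base:.5f}")
```

The finite-difference outer gradient used in this sketch is exactly what makes an efficiently computable gradient (the paper's KKT-based derivation) valuable: finite differencing costs one extra loss evaluation per entry of W, which is infeasible beyond toy sizes.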
Related papers
- Learning Weakly Convex Regularizers for Convergent Image-Reconstruction
Algorithms [16.78532039510369]
We propose to learn non-convex regularizers with a prescribed upper bound on their weak-convexity modulus.
We show that they mimic convex sparsity-promoting regularizers.
We also show that the learned regularizer can be deployed to solve inverse problems with provably convergent schemes.
arXiv Detail & Related papers (2023-08-21T07:52:39Z) - Masked Image Training for Generalizable Deep Image Denoising [53.03126421917465]
We present a novel approach to enhance the generalization performance of denoising networks.
Our method involves masking random pixels of the input image and reconstructing the missing information during training.
Our approach exhibits better generalization ability than other deep learning models and is directly applicable to real-world scenarios.
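As a rough illustration of the masking idea (a hedged sketch, not the paper's implementation; the `masked_training_pair` helper and the mask ratio are invented for this example):

```python
import numpy as np

rng = np.random.default_rng(0)

def masked_training_pair(noisy, mask_ratio=0.3, fill=0.0):
    """Hide a random subset of input pixels; the network must predict them."""
    mask = rng.random(noisy.shape) < mask_ratio       # True marks a hidden pixel
    return np.where(mask, fill, noisy), mask

noisy = rng.standard_normal((16, 16))                 # stand-in for a noisy image
masked_input, mask = masked_training_pair(noisy)
# Training would minimize the reconstruction error on the hidden pixels only,
# e.g. loss = np.mean((prediction - noisy)[mask] ** 2) for a network output
# `prediction`, so the network cannot simply copy its input through.
```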
arXiv Detail & Related papers (2023-03-23T09:33:44Z) - Learning Sparsity-Promoting Regularizers using Bilevel Optimization [9.18465987536469]
We present a method for supervised learning of sparsity-promoting regularizers for denoising signals and images.
Experiments with structured 1D signals and natural images show that the proposed method can learn an operator that outperforms well-known regularizers.
arXiv Detail & Related papers (2022-07-18T20:50:02Z) - PatchNR: Learning from Small Data by Patch Normalizing Flow
Regularization [57.37911115888587]
We introduce a regularizer for the variational modeling of inverse problems in imaging based on normalizing flows.
Our regularizer, called patchNR, involves a normalizing flow learned on patches of very few images.
arXiv Detail & Related papers (2022-05-24T12:14:26Z) - Denoising Diffusion Restoration Models [110.1244240726802]
Denoising Diffusion Restoration Models (DDRM) is an efficient, unsupervised posterior sampling method.
We demonstrate DDRM's versatility on several image datasets for super-resolution, deblurring, inpainting, and colorization.
arXiv Detail & Related papers (2022-01-27T20:19:07Z) - Bilevel learning of l1-regularizers with closed-form gradients (BLORC) [8.138650738423722]
We present a method for supervised learning of sparsity-promoting regularizers.
The parameters are learned to minimize the mean squared error of reconstruction on a training set of ground truth signal and measurement pairs.
arXiv Detail & Related papers (2021-11-21T17:01:29Z) - Deep Variational Network Toward Blind Image Restoration [60.45350399661175]
Blind image restoration is a common yet challenging problem in computer vision.
We propose a novel blind image restoration method that aims to combine the advantages of both model-based and learning-based approaches.
Experiments on two typical blind IR tasks, namely image denoising and super-resolution, demonstrate that the proposed method achieves superior performance over current state-of-the-art methods.
arXiv Detail & Related papers (2020-08-25T03:30:53Z) - Learned convex regularizers for inverse problems [3.294199808987679]
We propose to learn a data-adaptive input-convex neural network (ICNN) as a regularizer for inverse problems.
We prove the existence of a subgradient-based algorithm that monotonically decreases the error in parameter space across iterations.
We show that the proposed convex regularizer is at least competitive with, and sometimes superior to, state-of-the-art data-driven techniques for inverse problems.
arXiv Detail & Related papers (2020-08-06T18:58:35Z) - Solving Linear Inverse Problems Using the Prior Implicit in a Denoiser [7.7288480250888]
We develop a robust and general methodology for making use of implicit priors in deep neural networks.
A CNN trained to perform blind (i.e., with unknown noise level) least-squares denoising is presented.
A generalization of this algorithm to constrained sampling provides a method for using the implicit prior to solve any linear inverse problem.
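The idea can be sketched as a generic residual-driven iteration in the plug-and-play spirit (an assumption-laden toy, not the paper's constrained-sampling algorithm; the shrinkage `denoise` stands in for a trained blind-denoising CNN):

```python
import numpy as np

rng = np.random.default_rng(0)

def denoise(x):
    """Stand-in for a CNN denoiser; its residual encodes the implicit prior."""
    return 0.9 * x                                    # toy shrinkage denoiser

def solve_linear_inverse(A, y, steps=200, lr=0.1):
    """Alternate data-consistency steps with denoiser-residual prior steps."""
    x = A.T @ y
    for _ in range(steps):
        x -= lr * A.T @ (A @ x - y)                   # pull toward A x = y
        x += lr * (denoise(x) - x)                    # pull toward the prior
    return x

A = rng.standard_normal((10, 20)) / np.sqrt(20.0)     # underdetermined forward model
x_true = rng.standard_normal(20)
x_hat = solve_linear_inverse(A, A @ x_true)
print("data residual:", np.linalg.norm(A @ x_hat - A @ x_true))
```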
arXiv Detail & Related papers (2020-07-27T15:40:46Z) - Fully Unsupervised Diversity Denoising with Convolutional Variational
Autoencoders [81.30960319178725]
We propose DivNoising, a denoising approach based on fully convolutional variational autoencoders (VAEs).
First we introduce a principled way of formulating the unsupervised denoising problem within the VAE framework by explicitly incorporating imaging noise models into the decoder.
We show that such a noise model can either be measured, bootstrapped from noisy data, or co-learned during training.
arXiv Detail & Related papers (2020-06-10T21:28:13Z) - Total Deep Variation for Linear Inverse Problems [71.90933869570914]
We propose a novel learnable general-purpose regularizer exploiting recent architectural design patterns from deep learning.
We show state-of-the-art performance for classical image restoration and medical image reconstruction problems.
arXiv Detail & Related papers (2020-01-14T19:01:50Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the list (including all information) and is not responsible for any consequences of its use.