Solving Inverse Problems with Hybrid Deep Image Priors: the challenge of
preventing overfitting
- URL: http://arxiv.org/abs/2011.01748v2
- Date: Sun, 21 Feb 2021 17:15:25 GMT
- Title: Solving Inverse Problems with Hybrid Deep Image Priors: the challenge of
preventing overfitting
- Authors: Zhaodong Sun
- Abstract summary: We analyze and solve the overfitting problem of the deep image prior (DIP).
Due to the large number of neural-network parameters and the noisy data, DIP overfits to the noise in the image as the number of iterations grows.
In this thesis, we use hybrid deep image priors to avoid overfitting.
- Score: 1.52292571922932
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: We mainly analyze and solve the overfitting problem of deep image prior
(DIP). Deep image prior can solve inverse problems such as super-resolution,
inpainting and denoising. The main advantage of DIP over other deep learning
approaches is that it does not need access to a large dataset. However, due to
the large number of parameters of the neural network and noisy data, DIP
overfits to the noise in the image as the number of iterations grows. In this
thesis, we use hybrid deep image priors to avoid overfitting. The hybrid priors
combine DIP with an explicit prior, such as total variation, or with an
implicit prior, such as a denoising algorithm. We use the alternating direction
method of multipliers (ADMM) to incorporate the new prior and try different
forms of ADMM to avoid the extra computation caused by the inner loop of ADMM
steps. We also study the relation between the dynamics of gradient descent and
the overfitting phenomenon. The numerical results show that the hybrid priors
play an important role in preventing overfitting. In addition, we fit the image
along selected directions and find that this method reduces overfitting when
the noise level is large; when the noise level is small, it does not
considerably reduce overfitting.
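The implicit-prior variant described above is plug-and-play in spirit: ADMM alternates a data-fit update with a denoising step that stands in for the proximal operator of the prior. Below is a minimal NumPy sketch of that alternation for plain denoising, with a 3x3 mean filter as a stand-in for the denoising algorithm and a quadratic data term in place of the DIP network (both are assumptions for illustration, not the thesis's actual setup):

```python
import numpy as np

def box_denoise(x):
    """Stand-in denoiser: 3x3 mean filter (a real denoising algorithm or
    learned denoiser would go here)."""
    acc = np.zeros_like(x)
    for di in (-1, 0, 1):
        for dj in (-1, 0, 1):
            acc += np.roll(np.roll(x, di, axis=0), dj, axis=1)
    return acc / 9.0

def pnp_admm_denoise(y, rho=1.0, n_iter=30):
    """Plug-and-play ADMM for min_x 0.5||x - y||^2 + g(x), where the
    proximal operator of the prior g is replaced by a denoiser."""
    x = y.copy()
    v = y.copy()
    u = np.zeros_like(y)
    for _ in range(n_iter):
        # x-update: closed form for the quadratic data term
        x = (y + rho * (v - u)) / (1.0 + rho)
        # v-update: the denoiser stands in for prox_g
        v = box_denoise(x + u)
        # dual (scaled multiplier) update
        u = u + x - v
    return v

rng = np.random.default_rng(0)
clean = np.outer(np.sin(np.linspace(0, np.pi, 64)),
                 np.sin(np.linspace(0, np.pi, 64)))
noisy = clean + 0.3 * rng.standard_normal(clean.shape)
restored = pnp_admm_denoise(noisy)
```

The thesis's point about inner loops shows up here: with a quadratic data term the x-update is closed form, but for a general forward operator it becomes an inner minimization whose cost the different ADMM forms try to avoid.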
Related papers
- Chasing Better Deep Image Priors between Over- and Under-parameterization [63.8954152220162]
We study a novel "lottery image prior" (LIP) by exploiting DNN inherent sparsity.
LIP works significantly outperform deep decoders under comparably compact model sizes.
We also extend LIP to compressive sensing image reconstruction, where a pre-trained GAN generator is used as the prior.
arXiv Detail & Related papers (2024-10-31T17:49:44Z) - Score Priors Guided Deep Variational Inference for Unsupervised
Real-World Single Image Denoising [14.486289176696438]
We propose a score priors-guided deep variational inference, namely ScoreDVI, for practical real-world denoising.
We exploit a non-i.i.d. Gaussian mixture model and a variational noise posterior to model the real-world noise.
Our method outperforms other single image-based real-world denoising methods and achieves comparable performance to dataset-based unsupervised methods.
arXiv Detail & Related papers (2023-08-09T03:26:58Z) - DDGM: Solving inverse problems by Diffusive Denoising of Gradient-based
Minimization [4.209801809583906]
A recent trend is to train a convolutional net to denoise images, and use this net as a prior when solving the inverse problem.
Here we propose a simpler approach that combines the traditional gradient-based minimization of reconstruction error with denoising.
We show that high accuracy can be achieved with as few as 50 denoising steps.
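The alternation this summary describes, gradient steps on the reconstruction error interleaved with denoising steps, can be sketched on a toy 1-D deblurring problem. The moving-average "denoiser" and the blur operator below are illustrative stand-ins (the paper itself uses a diffusion model as the denoiser):

```python
import numpy as np

rng = np.random.default_rng(1)

n = 64
x_true = np.convolve(rng.standard_normal(n), np.ones(9) / 9, mode="same")  # smooth test signal
kernel = np.ones(7) / 7  # toy blur kernel as the forward operator

def forward(x):
    return np.convolve(x, kernel, mode="same")

y = forward(x_true) + 0.01 * rng.standard_normal(n)  # blurred, noisy measurement

def smooth(x, w=3):
    """Stand-in denoising step: moving average (the paper uses a diffusion model)."""
    return np.convolve(x, np.ones(w) / w, mode="same")

x = np.zeros(n)
for _ in range(50):  # "as few as 50 denoising steps", per the summary
    # gradient step on the reconstruction error 0.5 * ||forward(x) - y||^2
    # (the kernel is symmetric, so the adjoint is convolution with the same kernel)
    x = x - 1.0 * forward(forward(x) - y)
    # denoising step pulls the iterate toward the prior
    x = smooth(x)
```

With a symmetric kernel the adjoint is cheap, so each iteration is just two convolutions plus one denoiser call; this is the simplicity the summary contrasts with training a dedicated prior network.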
arXiv Detail & Related papers (2023-07-11T00:21:38Z) - Poisson-Gaussian Holographic Phase Retrieval with Score-based Image
Prior [19.231581775644617]
We propose a new algorithm called "AWFS" that uses accelerated Wirtinger flow (AWF) with a score function as a generative prior.
We calculate the gradient of the log-likelihood function for PR and determine the Lipschitz constant.
We provide theoretical analysis that establishes a critical-point convergence guarantee for the proposed algorithm.
arXiv Detail & Related papers (2023-05-12T18:08:47Z) - Tuning-free Plug-and-Play Hyperspectral Image Deconvolution with Deep
Priors [6.0622962428871885]
We introduce a tuning-free plug-and-play (PnP) algorithm for HSI deconvolution.
Specifically, we use the alternating direction method of multipliers (ADMM) to decompose the problem into two iterative sub-problems.
A flexible blind 3D denoising network (B3DDN) is designed to learn deep priors and to solve the denoising sub-problem with different noise levels.
arXiv Detail & Related papers (2022-11-28T13:41:14Z) - Preconditioned Plug-and-Play ADMM with Locally Adjustable Denoiser for
Image Restoration [54.23646128082018]
We extend the concept of plug-and-play optimization to use denoisers that can be parameterized for non-constant noise variance.
We show that our pixel-wise adjustable denoiser, along with a suitable preconditioning strategy, can further improve the plug-and-play ADMM approach for several applications.
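The idea of a pixel-wise adjustable denoiser can be illustrated with a denoiser whose smoothing strength follows a per-pixel noise map; the blending rule and the 3x3 mean filter here are illustrative assumptions, standing in for the parameterized denoiser the paper actually plugs into ADMM:

```python
import numpy as np

def local_denoise(x, sigma_map):
    """Stand-in pixel-wise adjustable denoiser: blends each pixel with its
    3x3 neighbourhood mean, more strongly where sigma_map reports higher
    noise (a denoiser parameterized by non-constant noise variance would
    go here)."""
    acc = np.zeros_like(x)
    for di in (-1, 0, 1):
        for dj in (-1, 0, 1):
            acc += np.roll(np.roll(x, di, axis=0), dj, axis=1)
    mean = acc / 9.0
    w = sigma_map / (sigma_map + 0.1)  # per-pixel blending weight in [0, 1)
    return (1 - w) * x + w * mean

# Demo: noise twice as strong on the right half of the image
rng = np.random.default_rng(2)
sigma_map = np.concatenate([0.05 * np.ones((64, 32)),
                            0.5 * np.ones((64, 32))], axis=1)
noisy = sigma_map * rng.standard_normal((64, 64))
out = local_denoise(noisy, sigma_map)
```

The high-noise region is smoothed aggressively while the low-noise region is left nearly intact, which is what lets such a denoiser handle non-constant noise variance inside plug-and-play ADMM.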
arXiv Detail & Related papers (2021-10-01T15:46:35Z) - Rethinking Deep Image Prior for Denoising [23.140599133203292]
We analyze the DIP by the notion of effective degrees of freedom (DF) to monitor the optimization progress.
We propose a principled stopping criterion that halts before fitting to noise, without access to a paired ground-truth image, for Gaussian noise.
Our approach outperforms prior arts in LPIPS by large margins with comparable PSNR and SSIM on seven different datasets.
arXiv Detail & Related papers (2021-08-29T13:34:31Z) - On Measuring and Controlling the Spectral Bias of the Deep Image Prior [63.88575598930554]
The deep image prior has demonstrated the remarkable ability of untrained networks to address inverse imaging problems.
It requires an oracle to determine when to stop the optimization as the performance degrades after reaching a peak.
We study the deep image prior from a spectral bias perspective to address these problems.
arXiv Detail & Related papers (2021-07-02T15:10:42Z) - A Contrastive Learning Approach for Training Variational Autoencoder
Priors [137.62674958536712]
Variational autoencoders (VAEs) are one of the powerful likelihood-based generative models with applications in many domains.
One explanation for VAEs' poor generative quality is the prior hole problem: the prior distribution fails to match the aggregate approximate posterior.
We propose an energy-based prior defined by the product of a base prior distribution and a reweighting factor, designed to bring the base closer to the aggregate posterior.
arXiv Detail & Related papers (2020-10-06T17:59:02Z) - The Power of Triply Complementary Priors for Image Compressive Sensing [89.14144796591685]
We propose a joint low-rank and deep (LRD) image model, which contains a pair of triply complementary priors.
We then propose a novel hybrid plug-and-play framework based on the LRD model for image CS.
To make the optimization tractable, a simple yet effective algorithm is proposed to solve the resulting image CS problem.
arXiv Detail & Related papers (2020-05-16T08:17:44Z) - Data Augmentation for Histopathological Images Based on
Gaussian-Laplacian Pyramid Blending [59.91656519028334]
Data imbalance is a major problem that affects several machine learning (ML) algorithms.
In this paper, we propose a novel approach capable of not only augmenting the HI dataset but also distributing the inter-patient variability.
Experimental results on the BreakHis dataset have shown promising gains vis-a-vis the majority of DA techniques presented in the literature.
arXiv Detail & Related papers (2020-01-31T22:02:57Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences of its use.