Boosting of Implicit Neural Representation-based Image Denoiser
- URL: http://arxiv.org/abs/2401.01548v1
- Date: Wed, 3 Jan 2024 05:51:25 GMT
- Title: Boosting of Implicit Neural Representation-based Image Denoiser
- Authors: Zipei Yan, Zhengji Liu, Jizhou Li
- Abstract summary: Implicit Neural Representation (INR) has emerged as an effective method for unsupervised image denoising.
We propose a general recipe for regularizing INR models in image denoising.
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Implicit Neural Representation (INR) has emerged as an effective method for
unsupervised image denoising. However, INR models are typically
overparameterized; consequently, they are prone to overfitting during
learning, which yields suboptimal, or even noisy, results. To tackle this
problem, we propose a general recipe for regularizing INR models in image
denoising. In detail, we propose to iteratively substitute the supervision
signal with the mean value derived from both the prediction and supervision
signal during the learning process. We theoretically prove that this simple
iterative substitution can gradually enhance the signal-to-noise ratio of the
supervision signal, thereby benefiting INR models during the learning process.
Our experimental results demonstrate that INR models can be effectively
regularized by the proposed approach, relieving overfitting and boosting image
denoising performance.
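The abstract's recipe can be sketched as follows. This is a minimal illustrative example, not the paper's implementation: a simple moving-average filter stands in for the INR denoiser (a hypothetical choice made only so the script is self-contained), and the update rule y_{t+1} = (prediction + y_t) / 2 follows the iterative mean substitution described above.

```python
import math
import random

def smooth(y, k=5):
    # Stand-in "denoiser": centered moving average, clamped at boundaries.
    # A real INR model would be fitted to the coordinates/signal instead.
    n, half = len(y), k // 2
    out = []
    for i in range(n):
        lo, hi = max(0, i - half), min(n, i + half + 1)
        out.append(sum(y[lo:hi]) / (hi - lo))
    return out

def iterative_substitution(y0, steps=10):
    # The recipe from the abstract: repeatedly replace the supervision
    # signal with the mean of the current prediction and the signal itself.
    y = list(y0)
    for _ in range(steps):
        pred = smooth(y)                                  # model prediction
        y = [0.5 * (p + t) for p, t in zip(pred, y)]      # mean substitution
    return y

def mse(a, b):
    return sum((u - v) ** 2 for u, v in zip(a, b)) / len(a)

# Synthetic 1-D "image": a sine wave corrupted by Gaussian noise.
random.seed(0)
xs = [4 * math.pi * i / 511 for i in range(512)]
clean = [math.sin(v) for v in xs]
noisy = [c + random.gauss(0, 0.3) for c in clean]

refined = iterative_substitution(noisy)
mse_before = mse(noisy, clean)
mse_after = mse(refined, clean)
```

On this toy signal, the substituted supervision target ends up closer to the clean signal than the original noisy target, illustrating the SNR-enhancement claim; the smoothing filter, step count, and noise level are all arbitrary choices for the sketch.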
Related papers
- Random Sub-Samples Generation for Self-Supervised Real Image Denoising [9.459398471988724]
We propose a novel self-supervised real image denoising framework named Sampling Difference As Perturbation (SDAP).
We find that adding an appropriate perturbation to the training images can effectively improve the performance of BSN.
The results show that it significantly outperforms other state-of-the-art self-supervised denoising methods on real-world datasets.
arXiv Detail & Related papers (2023-07-31T16:39:35Z) - ACDMSR: Accelerated Conditional Diffusion Models for Single Image Super-Resolution [84.73658185158222]
We propose a diffusion model-based super-resolution method called ACDMSR.
Our method adapts the standard diffusion model to perform super-resolution through a deterministic iterative denoising process.
Our approach generates more visually realistic counterparts for low-resolution images, emphasizing its effectiveness in practical scenarios.
arXiv Detail & Related papers (2023-07-03T06:49:04Z) - Enhancing convolutional neural network generalizability via low-rank weight approximation [6.763245393373041]
Sufficient denoising is often an important first step for image processing.
Deep neural networks (DNNs) have been widely used for image denoising.
We introduce a new self-supervised framework for image denoising based on the Tucker low-rank tensor approximation.
arXiv Detail & Related papers (2022-09-26T14:11:05Z) - SAR Despeckling using a Denoising Diffusion Probabilistic Model [52.25981472415249]
The presence of speckle degrades the image quality and adversely affects the performance of SAR image understanding applications.
We introduce SAR-DDPM, a denoising diffusion probabilistic model for SAR despeckling.
The proposed method achieves significant improvements in both quantitative and qualitative results over the state-of-the-art despeckling methods.
arXiv Detail & Related papers (2022-06-09T14:00:26Z) - Poisson2Sparse: Self-Supervised Poisson Denoising From a Single Image [34.27748767631027]
We present a novel self-supervised learning method for single-image denoising.
We approximate traditional iterative optimization algorithms for image denoising with a recurrent neural network.
Our method outperforms the state-of-the-art approaches in terms of PSNR and SSIM.
arXiv Detail & Related papers (2022-06-04T00:08:58Z) - Zero-shot Blind Image Denoising via Implicit Neural Representations [77.79032012459243]
We propose an alternative denoising strategy that leverages the architectural inductive bias of implicit neural representations (INRs).
We show that our method outperforms existing zero-shot denoising methods under an extensive set of low-noise or real-noise scenarios.
arXiv Detail & Related papers (2022-04-05T12:46:36Z) - Learning Spatial and Spatio-Temporal Pixel Aggregations for Image and Video Denoising [104.59305271099967]
We present a pixel aggregation network and learn the pixel sampling and averaging strategies for image denoising.
We develop a pixel aggregation network for video denoising to sample pixels across the spatial-temporal space.
Our method is able to solve the misalignment issues caused by large motion in dynamic scenes.
arXiv Detail & Related papers (2021-01-26T13:00:46Z) - Image Denoising using Attention-Residual Convolutional Neural Networks [0.0]
We propose a new learning-based non-blind denoising technique named Attention Residual Convolutional Neural Network (ARCNN) and its extension to blind denoising named Flexible Attention Residual Convolutional Neural Network (FARCNN).
ARCNN achieved overall average PSNR gains of around 0.44 dB and 0.96 dB for Gaussian and Poisson denoising, respectively. FARCNN produced very consistent results, albeit with slightly worse performance than ARCNN.
arXiv Detail & Related papers (2021-01-19T16:37:57Z) - Noise2Same: Optimizing A Self-Supervised Bound for Image Denoising [54.730707387866076]
We introduce Noise2Same, a novel self-supervised denoising framework.
In particular, Noise2Same requires neither J-invariance nor extra information about the noise model.
Our results show that our Noise2Same remarkably outperforms previous self-supervised denoising methods.
arXiv Detail & Related papers (2020-10-22T18:12:26Z) - Variational Denoising Network: Toward Blind Noise Modeling and Removal [59.36166491196973]
Blind image denoising is an important yet very challenging problem in computer vision.
We propose a new variational inference method, which integrates both noise estimation and image denoising.
arXiv Detail & Related papers (2019-08-29T15:54:06Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.