Denoising: from classical methods to deep CNNs
- URL: http://arxiv.org/abs/2404.16617v2
- Date: Sat, 27 Apr 2024 09:29:38 GMT
- Title: Denoising: from classical methods to deep CNNs
- Authors: Jean-Eric Campagne
- Abstract summary: We briefly review classical methods such as Fourier analysis and wavelet bases, highlighting the challenges they faced until the emergence of neural networks.
We discuss the prerequisites for genuine learning of probability densities, offering insights that extend from mathematical research to the implications of universal structures.
- Score: 0.0
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: This paper aims to explore the evolution of image denoising in a pedagogical way. We briefly review classical methods such as Fourier analysis and wavelet bases, highlighting the challenges they faced until the emergence of neural networks, notably the U-Net, in the 2010s. The remarkable performance of these networks has been demonstrated in studies such as Kadkhodaie et al. (2024). They exhibit adaptability to various image types, including those with fixed regularity, facial images, and bedroom scenes, achieving near-optimal results while being biased towards geometry-adaptive harmonic bases. The introduction of score diffusion has played a crucial role in image generation. In this context, denoising becomes essential as it facilitates the estimation of probability density scores. We discuss the prerequisites for genuine learning of probability densities, offering insights that extend from mathematical research to the implications of universal structures.
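The link the abstract draws between denoising and score estimation can be made concrete through the Tweedie/Miyasawa identity: for observations y = x + sigma*n with Gaussian noise n, the score of the noise-smoothed density satisfies grad log p_sigma(y) = (E[x|y] - y) / sigma^2, so an MMSE denoiser gives direct access to the score. The NumPy sketch below checks this identity on a toy Gaussian prior; the linear-shrinkage denoiser is an illustrative stand-in, not one of the networks discussed in the paper.

```python
import numpy as np

def score_from_denoiser(denoise, y, sigma):
    """Tweedie/Miyasawa relation: grad log p_sigma(y) = (E[x|y] - y) / sigma**2,
    so a (near-)MMSE denoiser yields the score of the noise-smoothed density."""
    return (denoise(y) - y) / sigma**2

rng = np.random.default_rng(0)
sigma = 0.5
x = rng.standard_normal(64)              # clean signal drawn from an N(0, I) "prior"
y = x + sigma * rng.standard_normal(64)  # noisy observation

# For this Gaussian prior the exact MMSE denoiser is linear shrinkage:
denoise = lambda z: z / (1.0 + sigma**2)

s = score_from_denoiser(denoise, y, sigma)
# Closed form for comparison: y ~ N(0, (1 + sigma**2) I), so score = -y / (1 + sigma**2)
print(np.allclose(s, -y / (1.0 + sigma**2)))   # True
```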
Related papers
- Policy Gradient-Driven Noise Mask [3.69758875412828]
We propose a novel pretraining pipeline that learns to generate conditional noise masks specifically tailored to improve performance on multi-modal and multi-organ datasets.
A key aspect is that the policy network's role is limited to obtaining an intermediate (or heated) model before fine-tuning.
Results demonstrate that fine-tuning the intermediate models consistently outperforms conventional training algorithms on both classification and generalization to unseen concept tasks.
arXiv Detail & Related papers (2024-04-29T23:53:42Z) - Generalization in diffusion models arises from geometry-adaptive harmonic representations [21.384922363202335]
Deep neural networks (DNNs) trained for image denoising are able to generate high-quality samples with score-based reverse diffusion algorithms.
Recent reports of memorization of the training set raise the question of whether these networks are learning the "true" continuous density of the data.
We show that two DNNs trained on non-overlapping subsets of a dataset learn nearly the same score function, and thus the same density, when the number of training images is large enough.
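As a rough illustration of the score-based reverse diffusion mentioned above, the sketch below runs annealed Langevin-style updates driven entirely by a denoiser via the denoiser-to-score identity; the shrinkage denoiser for an N(0, I) prior is a hypothetical stand-in, not one of the DNNs studied in these papers.

```python
import numpy as np

rng = np.random.default_rng(1)
denoise = lambda y, sigma: y / (1.0 + sigma**2)     # toy MMSE denoiser for an N(0, I) prior

y = 3.0 * rng.standard_normal(64)                   # initialize far from the target
for sigma in [1.0, 0.5, 0.25, 0.1]:                 # coarse-to-fine noise schedule
    step = 0.2 * sigma**2
    for _ in range(100):
        score = (denoise(y, sigma) - y) / sigma**2  # denoiser-to-score identity
        y = y + step * score + np.sqrt(2 * step) * rng.standard_normal(64)  # Langevin step
print(round(float(y.std()), 2))  # contracts from ~3 toward ~1, the scale of the N(0, I) prior
```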
arXiv Detail & Related papers (2023-10-04T03:30:32Z) - Unleashing the Power of Self-Supervised Image Denoising: A Comprehensive Review [7.387921606240273]
The advent of deep learning has brought a revolutionary transformation to image denoising techniques.
Acquiring noisy-clean image pairs for supervised methods remains a formidable challenge in real-world scenarios.
This paper focuses on self-supervised image denoising methods that offer effective solutions.
arXiv Detail & Related papers (2023-08-01T03:00:36Z) - Masked Image Training for Generalizable Deep Image Denoising [53.03126421917465]
We present a novel approach to enhance the generalization performance of denoising networks.
Our method involves masking random pixels of the input image and reconstructing the missing information during training.
Our approach exhibits better generalization ability than other deep learning models and is directly applicable to real-world scenarios.
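The masking idea can be stated schematically: zero out a random subset of input pixels and train the network to reconstruct the image anyway, so it cannot latch onto a specific noise pattern. The sketch below is a hypothetical helper illustrating the call pattern only, not the paper's exact architecture or loss.

```python
import numpy as np

def masked_training_step(net, noisy, clean, mask_ratio=0.2, rng=None):
    """Schematic step: hide a random fraction of input pixels and ask the
    network to reconstruct the clean image (illustrative, not the paper's loss)."""
    rng = rng or np.random.default_rng()
    mask = rng.random(noisy.shape) > mask_ratio   # True = keep pixel, False = masked
    pred = net(noisy * mask)                      # forward pass on the masked input
    loss = np.mean((pred - clean) ** 2)           # reconstruct everywhere, masked pixels included
    return loss

# Toy usage with an identity "network", just to show the interface:
rng = np.random.default_rng(0)
clean = rng.random((8, 8))
noisy = clean + 0.1 * rng.standard_normal((8, 8))
print(masked_training_step(lambda x: x, noisy, clean, rng=rng))
```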
arXiv Detail & Related papers (2023-03-23T09:33:44Z) - Unsupervised Domain Transfer with Conditional Invertible Neural Networks [83.90291882730925]
We propose a domain transfer approach based on conditional invertible neural networks (cINNs).
Our method inherently guarantees cycle consistency through its invertible architecture, and network training can be conducted efficiently via maximum likelihood.
Our method enables the generation of realistic spectral data and outperforms the state of the art on two downstream classification tasks.
arXiv Detail & Related papers (2023-03-17T18:00:27Z) - Image Embedding for Denoising Generative Models [0.0]
We focus on Denoising Diffusion Implicit Models due to the deterministic nature of their reverse diffusion process.
As a side result of our investigation, we gain a deeper insight into the structure of the latent space of diffusion models.
arXiv Detail & Related papers (2022-12-30T17:56:07Z) - Deep Semantic Statistics Matching (D2SM) Denoising Network [70.01091467628068]
We introduce the Deep Semantic Statistics Matching (D2SM) Denoising Network.
It exploits the semantic features of pretrained classification networks and implicitly matches the probability distribution of clean images in the semantic feature space.
By learning to preserve the semantic distribution of denoised images, we empirically find our method significantly improves the denoising capabilities of networks.
arXiv Detail & Related papers (2022-07-19T14:35:42Z) - Amplitude-Phase Recombination: Rethinking Robustness of Convolutional Neural Networks in Frequency Domain [31.182376196295365]
CNNs tend to converge to local optima that are closely tied to the high-frequency components of the training images.
We propose a new perspective on data augmentation, obtained by re-combining the phase spectrum of the current image with the amplitude spectrum of a distracter image, as sketched below.
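A minimal NumPy sketch of this recombination idea follows: keep the phase spectrum of the current image (which carries most of the semantics) and swap in the amplitude spectrum of a distracter image. Grayscale 2-D arrays are assumed, and the paper's exact per-channel and mixing details may differ.

```python
import numpy as np

def amplitude_phase_recombine(current, distracter):
    """Augmented sample: amplitude spectrum of the distracter, phase spectrum
    of the current image (illustrative sketch of the augmentation idea)."""
    f_cur = np.fft.fft2(current)
    f_dis = np.fft.fft2(distracter)
    recombined = np.abs(f_dis) * np.exp(1j * np.angle(f_cur))
    return np.real(np.fft.ifft2(recombined))

rng = np.random.default_rng(0)
img_a = rng.random((32, 32))
img_b = rng.random((32, 32))
aug = amplitude_phase_recombine(img_a, img_b)   # kept with img_a's label during training
print(aug.shape)
```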
arXiv Detail & Related papers (2021-08-19T04:04:41Z) - Unravelling the Effect of Image Distortions for Biased Prediction of Pre-trained Face Recognition Models [86.79402670904338]
We evaluate the performance of four state-of-the-art deep face recognition models in the presence of image distortions.
We observe a relationship between image distortions and the performance gap of the models across different subgroups.
arXiv Detail & Related papers (2021-08-14T16:49:05Z) - Deep Learning on Image Denoising: An overview [92.07378559622889]
We offer a comparative study of deep techniques in image denoising.
We first classify the deep convolutional neural networks (CNNs) for additive white noisy images.
Next, we compare the state-of-the-art methods on public denoising datasets in terms of quantitative and qualitative analysis.
arXiv Detail & Related papers (2019-12-31T05:03:57Z) - Variational Denoising Network: Toward Blind Noise Modeling and Removal [59.36166491196973]
Blind image denoising is an important yet very challenging problem in computer vision.
We propose a new variational inference method, which integrates both noise estimation and image denoising.
arXiv Detail & Related papers (2019-08-29T15:54:06Z)