Ground Truth Free Denoising by Optimal Transport
- URL: http://arxiv.org/abs/2007.01575v1
- Date: Fri, 3 Jul 2020 09:39:25 GMT
- Title: Ground Truth Free Denoising by Optimal Transport
- Authors: Sören Dittmer, Carola-Bibiane Schönlieb, Peter Maass
- Abstract summary: We present a learned unsupervised denoising method for arbitrary types of data.
The training is solely based on samples of noisy data and examples of noise.
The method rests on a Wasserstein Generative Adversarial Network setting.
- Score: 2.5137859989323537
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: We present a learned unsupervised denoising method for arbitrary types of
data, which we explore on images and one-dimensional signals. The training is
solely based on samples of noisy data and examples of noise, which --
critically -- do not need to come in pairs. We only need the assumption that
the noise is independent and additive (although we describe how this can be
extended). The method rests on a Wasserstein Generative Adversarial Network
setting, which utilizes two critics and one generator.
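The abstract gives only the high-level structure (one generator, two critics, additive independent noise). As a minimal, hypothetical sketch of how such a two-critic objective could be wired up -- the pairing of critics with the residual and the re-noised estimate is an assumption, not taken from the paper, and the networks are replaced by fixed random linear maps -- one might write:

```python
import numpy as np

# Hypothetical sketch of a two-critic Wasserstein setup for unsupervised
# denoising. Assumptions (not stated in the abstract): one critic scores the
# removed residual y - G(y) against pure noise samples, the other scores the
# re-noised estimate G(y) + n against real noisy samples. Only the loss
# structure is shown; no training, no gradient penalty.

rng = np.random.default_rng(0)
d = 32                                   # length of 1-D signals

Wg = rng.standard_normal((d, d)) * 0.1   # stand-in "generator" weights
wc_data = rng.standard_normal(d) * 0.1   # stand-in critic on noisy data
wc_noise = rng.standard_normal(d) * 0.1  # stand-in critic on noise

G = lambda x: x @ Wg                     # denoiser: noisy -> clean estimate
C_data = lambda x: x @ wc_data           # Wasserstein critic scores (scalars)
C_noise = lambda x: x @ wc_noise

y = rng.standard_normal((8, d))          # unpaired noisy samples
n = 0.3 * rng.standard_normal((8, d))    # unpaired pure-noise samples

clean_hat = G(y)                         # denoised estimate
residual = y - clean_hat                 # noise the generator removed

# Wasserstein-style objectives: each critic separates its real distribution
# (noisy data, resp. noise) from the generator's counterpart.
loss_data = C_data(y).mean() - C_data(clean_hat + n).mean()
loss_noise = C_noise(n).mean() - C_noise(residual).mean()
loss_G = -(C_data(clean_hat + n).mean() + C_noise(residual).mean())

print(clean_hat.shape)  # (8, 32)
```

Note that no clean/noisy pairs appear anywhere: the two batches `y` and `n` are drawn independently, matching the paper's unpaired training assumption.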
Related papers
- Unsupervised Denoising for Signal-Dependent and Row-Correlated Imaging Noise [54.0185721303932]
We present the first fully unsupervised deep learning-based denoiser capable of handling imaging noise that is row-correlated.
Our approach uses a Variational Autoencoder with a specially designed autoregressive decoder.
Our method does not require a pre-trained noise model and can be trained from scratch using unpaired noisy data.
arXiv Detail & Related papers (2023-10-11T20:48:20Z) - Optimizing the Noise in Self-Supervised Learning: from Importance Sampling to Noise-Contrastive Estimation [80.07065346699005]
It is widely assumed that the optimal noise distribution should be made equal to the data distribution, as in Generative Adversarial Networks (GANs).
We turn to Noise-Contrastive Estimation which grounds this self-supervised task as an estimation problem of an energy-based model of the data.
We soberly conclude that the optimal noise may be hard to sample from, and the gain in efficiency can be modest compared to choosing the noise distribution equal to the data's.
arXiv Detail & Related papers (2023-01-23T19:57:58Z) - Identifying Hard Noise in Long-Tailed Sample Distribution [76.16113794808001]
We introduce Noisy Long-Tailed Classification (NLT).
Most de-noising methods fail to identify the hard noises.
We design an iterative noisy learning framework called Hard-to-Easy (H2E).
arXiv Detail & Related papers (2022-07-27T09:03:03Z) - The Optimal Noise in Noise-Contrastive Learning Is Not What You Think [80.07065346699005]
We show that deviating from this assumption can actually lead to better statistical estimators.
In particular, the optimal noise distribution is different from the data's and even from a different family.
arXiv Detail & Related papers (2022-03-02T13:59:20Z) - The potential of self-supervised networks for random noise suppression in seismic data [0.0]
Blind-spot networks are shown to be an efficient suppressor of random noise in seismic data.
Results are compared with two commonly used random denoising techniques: FX-deconvolution and Curvelet transform.
We believe this is just the beginning of utilising self-supervised learning in seismic applications.
arXiv Detail & Related papers (2021-09-15T14:57:43Z) - Joint self-supervised blind denoising and noise estimation [0.0]
Two neural networks jointly predict the clean signal and infer the noise distribution.
We show empirically with synthetic noisy data that our model captures the noise distribution efficiently.
arXiv Detail & Related papers (2021-02-16T08:37:47Z) - Neighbor2Neighbor: Self-Supervised Denoising from Single Noisy Images [98.82804259905478]
We present Neighbor2Neighbor to train an effective image denoising model with only noisy images.
Specifically, the input and target used to train the network are images sub-sampled from the same noisy image.
A denoising network is trained on sub-sampled training pairs generated in the first stage, with a proposed regularizer as additional loss for better performance.
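The sub-sampling idea can be illustrated with a small, hedged sketch: from each 2x2 cell of a single noisy image, two different pixels are taken to form an input/target pair. The fixed pixel choice below (top-left vs. bottom-right) is a simplification; the paper samples random neighbors.

```python
import numpy as np

# Hedged sketch of Neighbor2Neighbor-style sub-sampling: build a training
# pair from ONE noisy image by picking two different pixels per 2x2 cell.
# Fixed neighbor choice here is an assumption for brevity.

rng = np.random.default_rng(1)
noisy = rng.standard_normal((8, 8))  # a single noisy image

input_img = noisy[0::2, 0::2]        # top-left pixel of every 2x2 cell
target_img = noisy[1::2, 1::2]       # bottom-right pixel of every cell

print(input_img.shape, target_img.shape)  # (4, 4) (4, 4)
```

Because the two sub-images share the underlying scene but carry independent noise realizations, they can serve as a noisy input/target pair without any clean ground truth.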
arXiv Detail & Related papers (2021-01-08T02:03:25Z) - Noise2Kernel: Adaptive Self-Supervised Blind Denoising using a Dilated Convolutional Kernel Architecture [3.796436257221662]
We propose a dilated convolutional network that satisfies an invariant property, allowing efficient kernel-based training without random masking.
We also propose an adaptive self-supervision loss to circumvent the requirement of zero-mean constraint, which is specifically effective in removing salt-and-pepper or hybrid noise.
arXiv Detail & Related papers (2020-12-07T12:13:17Z) - Adaptive noise imitation for image denoising [58.21456707617451]
We develop a new adaptive noise imitation (ADANI) algorithm that can synthesize noisy data from naturally noisy images.
To produce realistic noise, a noise generator takes unpaired noisy/clean images as input, where the noisy image is a guide for noise generation.
Coupling the noisy data output from ADANI with the corresponding ground-truth, a denoising CNN is then trained in a fully-supervised manner.
arXiv Detail & Related papers (2020-11-30T02:49:36Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences.