Mixed Noise Removal with Pareto Prior
- URL: http://arxiv.org/abs/2008.11935v1
- Date: Thu, 27 Aug 2020 06:35:15 GMT
- Title: Mixed Noise Removal with Pareto Prior
- Authors: Zhou Liu, Lei Yu, Gui-Song Xia, Hong Sun
- Abstract summary: Denoising images contaminated by a mixture of additive white Gaussian noise (AWGN) and impulse noise (IN) is an essential but challenging problem.
Existing methods aim to compensate for the effects of IN by introducing a weighting matrix.
We propose an accurate and robust weight estimator for mixed noise removal based on a prior on the weighting matrix.
- Score: 31.306541517685137
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Denoising images contaminated by the mixture of additive white Gaussian noise
(AWGN) and impulse noise (IN) is an essential but challenging problem. The
presence of impulsive disturbances inevitably affects the distribution of
noises and thus largely degrades the performance of traditional AWGN denoisers.
Existing methods aim to compensate for the effects of IN by introducing a
weighting matrix, which, however, lacks a proper prior and is thus hard to
estimate accurately. To address this problem, we exploit the Pareto
distribution as the prior of the weighting matrix, based on which an accurate
and robust weight estimator is proposed for mixed noise removal. In particular,
a relatively small portion of pixels are assumed to be contaminated with IN,
which should have weights with small values and then be penalized out. This
phenomenon can be properly described by the Pareto distribution of type 1.
Therefore, armed with the Pareto distribution, we formulate the problem of
mixed noise removal in the Bayesian framework, where the nonlocal
self-similarity prior is further exploited by adopting nonlocal low-rank
approximation.
Compared to existing methods, the proposed method estimates the weighting
matrix adaptively, accurately, and robustly across different noise levels,
and thus boosts the denoising performance. Experimental results on widely
used image datasets demonstrate the superiority of the proposed method over
the state of the art.
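The abstract's core idea is that IN-contaminated pixels should receive near-zero weights so that a subsequent restoration step effectively ignores them. The toy sketch below illustrates this weighting-matrix mechanism on a constant patch; note that the residual-based weight rule, the noise level `sigma`, and the 10% impulse fraction are all illustrative assumptions, not the paper's actual Pareto-prior estimator.

```python
import numpy as np

rng = np.random.default_rng(0)

# Clean signal: a constant patch (value 100), a toy stand-in for an image patch.
clean = np.full(1000, 100.0)

# Mixed noise: AWGN everywhere, plus impulse noise (IN) on ~10% of the pixels.
sigma = 5.0
noisy = clean + rng.normal(0.0, sigma, clean.shape)
in_mask = rng.random(clean.shape) < 0.1
noisy[in_mask] = rng.choice([0.0, 255.0], in_mask.sum())  # salt-and-pepper impulses

# Hypothetical weight rule (not the paper's estimator): down-weight pixels whose
# residual from a robust center is large, so IN pixels get near-zero weight.
center = np.median(noisy)
resid = np.abs(noisy - center)
weights = sigma**2 / (sigma**2 + resid**2)

# A weighted average then suppresses the impulse pixels that a plain
# mean would be biased by.
plain_mean = noisy.mean()
weighted_mean = np.average(noisy, weights=weights)
```

Here `weighted_mean` stays close to the clean value 100, while `plain_mean` is pulled toward the impulse values; in the paper, the analogous weights are instead estimated under a Pareto Type 1 prior within a Bayesian nonlocal low-rank framework.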
Related papers
- NoiseDiffusion: Correcting Noise for Image Interpolation with Diffusion Models beyond Spherical Linear Interpolation [86.7260950382448]
We propose a novel approach to correct noise for image validity, NoiseDiffusion.
NoiseDiffusion performs within the noisy image space and injects raw images into these noisy counterparts to address the challenge of information loss.
arXiv Detail & Related papers (2024-03-13T12:32:25Z)
- Optimizing the Noise in Self-Supervised Learning: from Importance Sampling to Noise-Contrastive Estimation [80.07065346699005]
It is widely assumed that the optimal noise distribution should be made equal to the data distribution, as in Generative Adversarial Networks (GANs).
We turn to Noise-Contrastive Estimation which grounds this self-supervised task as an estimation problem of an energy-based model of the data.
We soberly conclude that the optimal noise may be hard to sample from, and the gain in efficiency can be modest compared to choosing the noise distribution equal to the data's.
arXiv Detail & Related papers (2023-01-23T19:57:58Z)
- High-Order Qubit Dephasing at Sweet Spots by Non-Gaussian Fluctuators: Symmetry Breaking and Floquet Protection [55.41644538483948]
We study the qubit dephasing caused by the non-Gaussian fluctuators.
We predict a symmetry-breaking effect that is unique to the non-Gaussian noise.
arXiv Detail & Related papers (2022-06-06T18:02:38Z)
- The price of ignorance: how much does it cost to forget noise structure in low-rank matrix estimation? [21.3083877172595]
We consider the problem of estimating a rank-1 signal corrupted by structured rotationally invariant noise.
We make a step towards understanding the effect of the strong source of mismatch which is the noise statistics.
We show that this performance gap is due to an incorrect estimation of the signal norm.
arXiv Detail & Related papers (2022-05-20T07:54:21Z)
- The Optimal Noise in Noise-Contrastive Learning Is Not What You Think [80.07065346699005]
We show that deviating from this assumption can actually lead to better statistical estimators.
In particular, the optimal noise distribution is different from the data's and even from a different family.
arXiv Detail & Related papers (2022-03-02T13:59:20Z)
- FINO: Flow-based Joint Image and Noise Model [23.9749061109964]
We propose a novel Flow-based joint Image and NOise model (FINO) that distinctly decouples the image and noise in the latent space and losslessly reconstructs them via a series of invertible transformations.
arXiv Detail & Related papers (2021-11-11T02:51:54Z)
- Shape Matters: Understanding the Implicit Bias of the Noise Covariance [76.54300276636982]
Noise in gradient descent provides a crucial implicit regularization effect for training over-parameterized models.
We show that parameter-dependent noise -- induced by mini-batches or label perturbation -- is far more effective than Gaussian noise.
Our analysis reveals that parameter-dependent noise introduces a bias towards local minima with smaller noise variance, whereas spherical Gaussian noise does not.
arXiv Detail & Related papers (2020-06-15T18:31:02Z)
- Residual-driven Fuzzy C-Means Clustering for Image Segmentation [152.609322951917]
We elaborate on residual-driven Fuzzy C-Means (FCM) for image segmentation.
Built on this framework, we present a weighted $\ell_2$-norm fidelity term by weighting mixed noise distribution.
The results demonstrate the superior effectiveness and efficiency of the proposed algorithm over existing FCM-related algorithms.
arXiv Detail & Related papers (2020-04-15T15:46:09Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the information provided and is not responsible for any consequences of its use.