Transforming Noise Distributions with Histogram Matching: Towards a Single Denoiser for All
- URL: http://arxiv.org/abs/2510.06757v1
- Date: Wed, 08 Oct 2025 08:34:50 GMT
- Title: Transforming Noise Distributions with Histogram Matching: Towards a Single Denoiser for All
- Authors: Sheng Fu, Junchao Zhang, Kailun Yang,
- Abstract summary: Supervised Gaussian denoisers exhibit limited generalization when confronted with out-of-distribution noise. We propose a histogram matching approach that transforms arbitrary noise towards a target Gaussian distribution with known intensity.
- Score: 11.736212158463147
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Supervised Gaussian denoisers exhibit limited generalization when confronted with out-of-distribution noise, due to the diverse distributional characteristics of different noise types. To bridge this gap, we propose a histogram matching approach that transforms arbitrary noise towards a target Gaussian distribution with known intensity. Moreover, a mutually reinforcing cycle is established between noise transformation and subsequent denoising. This cycle progressively refines the noise to be converted, making it approximate the real noise, thereby enhancing the noise transformation effect and further improving the denoising performance. We tackle specific noise complexities: local histogram matching handles signal-dependent noise, intrapatch permutation processes channel-related noise, and frequency-domain histogram matching coupled with pixel-shuffle down-sampling breaks spatial correlation. By applying these transformations, a single Gaussian denoiser gains remarkable capability to handle various out-of-distribution noises, including synthetic noises such as Poisson, salt-and-pepper and repeating pattern noises, as well as complex real-world noises. Extensive experiments demonstrate the superior generalization and effectiveness of our method.
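At its core, the transformation described in the abstract is a quantile (histogram) mapping of the extracted noise onto a Gaussian of known intensity, alternated with Gaussian denoising so that each step refines the other. The sketch below is only a minimal, global-histogram illustration of that idea, not the authors' implementation: the local, intra-patch, and frequency-domain variants from the abstract are omitted, `gaussian_denoiser` is a hypothetical pre-trained Gaussian denoiser callable, and the pixel-shuffle helper shows the standard strided sub-sampling trick for weakening spatial noise correlation.

```python
import numpy as np
from scipy.stats import norm


def match_noise_to_gaussian(noise, sigma=25.0):
    """Histogram-match an arbitrary noise map to N(0, sigma^2).

    Each value is passed through the empirical CDF of the noise and then
    through the inverse CDF of the target Gaussian, so the output follows
    (approximately) the desired Gaussian histogram while preserving the
    ordering of the original values.
    """
    flat = noise.ravel()
    ranks = np.argsort(np.argsort(flat))           # rank of each value
    cdf = (ranks + 0.5) / flat.size                # empirical CDF in (0, 1)
    matched = norm.ppf(cdf, loc=0.0, scale=sigma)  # target Gaussian quantiles
    return matched.reshape(noise.shape)


def pixel_shuffle_downsample(img, stride=2):
    """Split an HxW image into stride**2 sub-images of strided pixels,
    which weakens spatial correlation of the noise within each sub-image."""
    return [img[i::stride, j::stride]
            for i in range(stride) for j in range(stride)]


def transform_then_denoise(noisy, gaussian_denoiser, sigma=25.0, n_iters=3):
    """Conceptual version of the mutually reinforcing cycle: the current
    clean estimate is used to extract the noise, the noise is matched to
    the target Gaussian, and the Gaussian denoiser is applied again."""
    clean_est = gaussian_denoiser(noisy, sigma)        # initial rough pass
    for _ in range(n_iters):
        noise_est = noisy - clean_est                  # refined noise extraction
        gauss_noise = match_noise_to_gaussian(noise_est, sigma)
        clean_est = gaussian_denoiser(clean_est + gauss_noise, sigma)
    return clean_est
```

Under these assumptions, `transform_then_denoise(noisy, denoiser)` would hand a single pre-trained Gaussian denoiser noise that looks approximately Gaussian regardless of its original distribution, which is the central claim of the abstract.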
Related papers
- CARD: Correlation Aware Restoration with Diffusion [8.859116375276157]
Correlation Aware Restoration with Diffusion (CARD) is a training-free extension of DDRM that handles correlated Gaussian noise. To emphasize the importance of addressing correlated noise, we contribute CIN-D, a novel correlated noise dataset captured across diverse illumination conditions. CARD consistently outperforms existing methods across denoising, deblurring, and super-resolution tasks.
arXiv Detail & Related papers (2025-12-04T21:46:43Z) - Integrating Reweighted Least Squares with Plug-and-Play Diffusion Priors for Noisy Image Restoration [6.402777145722335]
We propose a plug-and-play image restoration framework based on generative diffusion priors for robust removal of general noise types, including impulse noise. Experimental results on benchmark datasets demonstrate that the proposed method effectively removes non-Gaussian impulse noise and achieves superior restoration performance.
arXiv Detail & Related papers (2025-11-10T08:11:20Z) - Mitigating the Noise Shift for Denoising Generative Models via Noise Awareness Guidance [54.88271057438763]
Noise Awareness Guidance (NAG) is a correction method that explicitly steers sampling trajectories to remain consistent with the pre-defined noise schedule. NAG consistently mitigates noise shift and substantially improves the generation quality of mainstream diffusion models.
arXiv Detail & Related papers (2025-10-14T13:31:34Z) - Noise Matters: Optimizing Matching Noise for Diffusion Classifiers [6.442738337380714]
We propose NoOp, a novel noise optimization method that learns matching (i.e., good) noise for diffusion classifiers (DCs). For frequency matching, NoOp first optimizes a randomly parameterized noise. For spatial matching, NoOp trains a Meta-Network that takes an image as input and outputs the corresponding matching noise.
arXiv Detail & Related papers (2025-08-15T09:01:03Z) - Enhancing Sample Generation of Diffusion Models using Noise Level Correction [9.014666170540304]
We propose a novel method to enhance sample generation by aligning the estimated noise level with the true distance of noisy samples to the manifold. Specifically, we introduce a noise level correction network, leveraging a pre-trained denoising network, to refine noise level estimates during the denoising process. Experimental results demonstrate that our method significantly improves sample quality in both unconstrained and constrained generation scenarios.
arXiv Detail & Related papers (2024-12-07T01:19:14Z) - NoiseDiffusion: Correcting Noise for Image Interpolation with Diffusion Models beyond Spherical Linear Interpolation [86.7260950382448]
We propose a novel approach to correct noise for image validity, NoiseDiffusion.
NoiseDiffusion performs within the noisy image space and injects raw images into these noisy counterparts to address the challenge of information loss.
arXiv Detail & Related papers (2024-03-13T12:32:25Z) - Blue noise for diffusion models [50.99852321110366]
We introduce a novel and general class of diffusion models taking correlated noise within and across images into account.
Our framework allows introducing correlation across images within a single mini-batch to improve gradient flow.
We perform both qualitative and quantitative evaluations on a variety of datasets using our method.
arXiv Detail & Related papers (2024-02-07T14:59:25Z) - Optimizing the Noise in Self-Supervised Learning: from Importance Sampling to Noise-Contrastive Estimation [80.07065346699005]
It is widely assumed that the optimal noise distribution should be made equal to the data distribution, as in Generative Adversarial Networks (GANs).
We turn to Noise-Contrastive Estimation which grounds this self-supervised task as an estimation problem of an energy-based model of the data.
We soberly conclude that the optimal noise may be hard to sample from, and the gain in efficiency can be modest compared to choosing the noise distribution equal to the data's.
arXiv Detail & Related papers (2023-01-23T19:57:58Z) - NLIP: Noise-robust Language-Image Pre-training [95.13287735264937]
We propose a principled Noise-robust Language-Image Pre-training framework (NLIP) to stabilize pre-training via two schemes: noise-harmonization and noise-completion.
Our NLIP can alleviate the common noise effects during image-text pre-training in a more efficient way.
arXiv Detail & Related papers (2022-12-14T08:19:30Z) - CFNet: Conditional Filter Learning with Dynamic Noise Estimation for Real Image Denoising [37.29552796977652]
This paper considers real noise approximated by heteroscedastic Gaussian/Poisson Gaussian distributions with in-camera signal processing pipelines.
We propose a novel conditional filter in which the optimal kernels for different feature positions can be adaptively inferred by local features from the image and the noise map.
We also bring the idea of alternating noise estimation and non-blind denoising into the CNN structure, continuously updating the noise prior to guide iterative feature denoising.
arXiv Detail & Related papers (2022-11-26T14:28:54Z) - High-Order Qubit Dephasing at Sweet Spots by Non-Gaussian Fluctuators: Symmetry Breaking and Floquet Protection [55.41644538483948]
We study the qubit dephasing caused by the non-Gaussian fluctuators.
We predict a symmetry-breaking effect that is unique to the non-Gaussian noise.
arXiv Detail & Related papers (2022-06-06T18:02:38Z) - C2N: Practical Generative Noise Modeling for Real-World Denoising [53.96391787869974]
We introduce a Clean-to-Noisy image generation framework, namely C2N, to imitate complex real-world noise without using paired examples.
We construct the noise generator in C2N in accordance with each component of real-world noise characteristics so that it accurately expresses a wide range of noise.
arXiv Detail & Related papers (2022-02-19T05:53:46Z)