A Data-driven Loss Weighting Scheme across Heterogeneous Tasks for Image Denoising
- URL: http://arxiv.org/abs/2301.06081v3
- Date: Tue, 12 Aug 2025 01:44:02 GMT
- Title: A Data-driven Loss Weighting Scheme across Heterogeneous Tasks for Image Denoising
- Authors: Xiangyu Rui, Xiangyong Cao, Xile Zhao, Deyu Meng, Michael K. Ng
- Abstract summary: In variational denoising models, the weight in the data fidelity term plays the role of enhancing the noise-removal capability. In this work, we propose a data-driven loss weighting scheme to address these issues. Numerical results verify the remarkable performance of DLW in improving the ability of various variational denoising models to handle different complex noise.
- Score: 67.02529586335473
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: In a variational denoising model, the weight in the data fidelity term plays the role of enhancing the noise-removal capability. It is profoundly correlated with the noise information, while also balancing the data fidelity and regularization terms. However, assigning the weight becomes substantially harder when the noise pattern goes beyond independent and identically distributed Gaussian noise, e.g., impulse noise, stripe noise, or a mixture of several patterns. Furthermore, how to leverage the weight to balance the data fidelity and regularization terms is even less evident. In this work, we propose a data-driven loss weighting (DLW) scheme to address these issues. Specifically, DLW trains a parameterized weight function (i.e., a neural network) that maps the noisy image to the weight. The training is carried out in a bilevel optimization framework, where the lower-level problem solves several denoising models with the same weight predicted by the weight function, and the upper-level problem minimizes the distance between the restored image and the clean image. In this way, information from both the noise and the regularization can be efficiently extracted to determine the weight function. DLW also allows a trained weight function to be easily plugged into denoising models. Numerical results verify the remarkable performance of DLW in improving the ability of various variational denoising models to handle different complex noise. This implies that DLW can transfer noise knowledge at the model level to heterogeneous tasks beyond those seen in training; the generalization theory underlying DLW is also studied, validating its intrinsic transferability.
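The bilevel structure described above can be illustrated with a deliberately simplified sketch (an assumption-laden toy, not the authors' implementation): the "weight function" is reduced to a single scalar parameter `theta`, and the lower-level denoiser is a weighted Tikhonov model with a closed-form solution, so the upper level can tune the weight against a clean reference by plain gradient descent.

```python
import numpy as np

# Toy bilevel sketch of the DLW idea. Lower level: solve a weighted
# variational denoising model; upper level: adjust the weight so the
# restored signal matches the clean one. The weighted Tikhonov model
#     x(w) = argmin_x  w * ||x - y||^2 + lam * ||x||^2
# has the closed form x(w) = w * y / (w + lam).

rng = np.random.default_rng(0)
clean = rng.normal(size=256)                  # stand-in "clean image"
noisy = clean + 0.5 * rng.normal(size=256)    # additive Gaussian noise
lam = 1.0                                     # fixed regularization strength

def softplus(t):
    return np.log1p(np.exp(t))                # keeps the weight positive

def lower_level(w, y):
    return w * y / (w + lam)                  # closed-form restored signal

def upper_loss(theta):
    x = lower_level(softplus(theta), noisy)
    return np.mean((x - clean) ** 2)          # distance to the clean signal

theta, lr = 0.0, 1.0
for _ in range(200):                          # upper-level descent (finite-difference gradient)
    grad = (upper_loss(theta + 1e-4) - upper_loss(theta - 1e-4)) / 2e-4
    theta -= lr * grad
```

In the paper, the scalar `theta` is replaced by a neural network mapping the noisy image to the weight, several denoising models share that predicted weight at the lower level, and the lower-level problems are solved numerically rather than in closed form.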
Related papers
- DenoGrad: Deep Gradient Denoising Framework for Enhancing the Performance of Interpretable AI Models [3.189189590825304]
We propose DenoGrad, a novel instance-denoiser framework that detects and adjusts noisy samples. DenoGrad dynamically corrects noisy instances, preserving the problem's data distribution and improving the performance of AI models.
arXiv Detail & Related papers (2025-11-13T10:16:02Z)
- Mitigating the Noise Shift for Denoising Generative Models via Noise Awareness Guidance
Noise Awareness Guidance (NAG) is a correction method that explicitly steers sampling trajectories to remain consistent with the pre-defined noise schedule. NAG consistently mitigates noise shift and substantially improves the generation quality of mainstream diffusion models.
arXiv Detail & Related papers (2025-10-14T13:31:34Z)
- Noise Conditional Variational Score Distillation [60.38982038894823]
Noise Conditional Variational Score Distillation (NCVSD) is a novel method for distilling pretrained diffusion models into generative denoisers. By integrating this insight into the Variational Score Distillation framework, we enable scalable learning of generative denoisers.
arXiv Detail & Related papers (2025-06-11T06:01:39Z)
- Noise Augmented Fine Tuning for Mitigating Hallucinations in Large Language Models [1.0579965347526206]
Large language models (LLMs) often produce inaccurate or misleading content, i.e., hallucinations.
Noise-Augmented Fine-Tuning (NoiseFiT) is a novel framework that leverages adaptive noise injection to enhance model robustness.
NoiseFiT selectively perturbs layers identified as either high-SNR (more robust) or low-SNR (potentially under-regularized) using a dynamically scaled Gaussian noise.
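The SNR-guided injection described in this blurb can be sketched as follows; the selection rule, SNR proxy, and scaling below are illustrative assumptions, not NoiseFiT's exact procedure:

```python
import numpy as np

# Hypothetical sketch of SNR-guided noise injection: only layers whose
# signal-to-noise ratio is extreme (high or low) receive Gaussian
# perturbations, with the noise scale tied to each layer's own statistics.

rng = np.random.default_rng(1)

def layer_snr(w):
    # crude SNR proxy: mean weight magnitude relative to spread
    return np.abs(w).mean() / (w.std() + 1e-8)

def inject_noise(weights, base_sigma=0.01):
    snrs = np.array([layer_snr(w) for w in weights])
    lo, hi = np.quantile(snrs, [0.25, 0.75])
    out = []
    for w, s in zip(weights, snrs):
        if s <= lo or s >= hi:                # perturb only extreme-SNR layers
            sigma = base_sigma * w.std()      # dynamically scaled per layer
            w = w + rng.normal(scale=sigma, size=w.shape)
        out.append(w)
    return out

layers = [rng.normal(scale=0.1 * (i + 1), size=(8, 8)) for i in range(4)]
perturbed = inject_noise(layers)
```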
arXiv Detail & Related papers (2025-04-04T09:27:19Z)
- Enhance Vision-Language Alignment with Noise [59.2608298578913]
We investigate whether the frozen model can be fine-tuned by customized noise.
We propose Positive-incentive Noise (PiNI) which can fine-tune CLIP via injecting noise into both visual and text encoders.
arXiv Detail & Related papers (2024-12-14T12:58:15Z)
- Pan-denoising: Guided Hyperspectral Image Denoising via Weighted Represent Coefficient Total Variation [20.240211073097758]
This paper introduces a novel paradigm for hyperspectral image (HSI) denoising, which is termed pan-denoising.
Panchromatic (PAN) images capture similar structures and textures to HSIs but with less noise. Consequently, pan-denoising has the potential to uncover underlying structures and details beyond the internal information modeling of traditional HSI denoising methods.
Experiments on synthetic and real-world datasets demonstrate that PWRCTV outperforms several state-of-the-art methods in terms of metrics and visual quality.
arXiv Detail & Related papers (2024-07-08T16:05:56Z)
- Physics-guided Noise Neural Proxy for Practical Low-light Raw Image Denoising [22.11250276261829]
Recently, the mainstream practice for training low-light raw image denoising has shifted towards employing synthetic data.
Noise modeling, which focuses on characterizing the noise distribution of real-world sensors, profoundly influences the effectiveness and practicality of synthetic data.
We propose a novel strategy: learning the noise model from dark frames instead of paired real data, to break down the data dependency.
arXiv Detail & Related papers (2023-10-13T14:14:43Z)
- Stimulating Diffusion Model for Image Denoising via Adaptive Embedding and Ensembling [56.506240377714754]
We present a novel strategy called the Diffusion Model for Image Denoising (DMID).
Our strategy includes an adaptive embedding method that embeds the noisy image into a pre-trained unconditional diffusion model.
Our DMID strategy achieves state-of-the-art performance on both distortion-based and perception-based metrics.
arXiv Detail & Related papers (2023-07-08T14:59:41Z)
- Realistic Noise Synthesis with Diffusion Models [68.48859665320828]
Deep image denoising models often rely on large amounts of training data to achieve high-quality performance.
We propose a novel method that synthesizes realistic noise using diffusion models, namely Realistic Noise Synthesize Diffusor (RNSD).
RNSD can incorporate guided multiscale content, so that more realistic noise with spatial correlations can be generated at multiple frequencies.
arXiv Detail & Related papers (2023-05-23T12:56:01Z)
- Advancing Unsupervised Low-light Image Enhancement: Noise Estimation, Illumination Interpolation, and Self-Regulation [55.07472635587852]
Low-Light Image Enhancement (LLIE) techniques have made notable advancements in preserving image details and enhancing contrast.
These approaches encounter persistent challenges in efficiently mitigating dynamic noise and accommodating diverse low-light scenarios.
We first propose a method for estimating the noise level in low-light images quickly and accurately.
We then devise a Learnable Illumination Interpolator (LII) to satisfy general constraints between illumination and input.
arXiv Detail & Related papers (2023-05-17T13:56:48Z)
- Degradation-Noise-Aware Deep Unfolding Transformer for Hyperspectral Image Denoising [9.119226249676501]
Hyperspectral images (HSIs) are often quite noisy because of narrow band spectral filtering.
To reduce the noise in HSI data cubes, both model-driven and learning-based denoising algorithms have been proposed.
This paper proposes a Degradation-Noise-Aware Unfolding Network (DNA-Net) that addresses these issues.
arXiv Detail & Related papers (2023-05-06T13:28:20Z)
- Treatment Learning Causal Transformer for Noisy Image Classification [62.639851972495094]
In this work, we incorporate this binary information of "existence of noise" as treatment into image classification tasks to improve prediction accuracy.
Motivated by causal variational inference, we propose a transformer-based architecture that uses a latent generative model to estimate robust feature representations for noisy image classification.
We also create new noisy image datasets incorporating a wide range of noise factors for performance benchmarking.
arXiv Detail & Related papers (2022-03-29T13:07:53Z)
- Denoising Distantly Supervised Named Entity Recognition via a Hypergeometric Probabilistic Model [26.76830553508229]
Hypergeometric Learning (HGL) is a denoising algorithm for distantly supervised named entity recognition.
HGL takes both noise distribution and instance-level confidence into consideration.
Experiments show that HGL can effectively denoise the weakly-labeled data retrieved from distant supervision.
arXiv Detail & Related papers (2021-06-17T04:01:25Z)
- Noise Reduction in X-ray Photon Correlation Spectroscopy with Convolutional Neural Networks Encoder-Decoder Models [0.0]
We propose a computational approach for improving the signal-to-noise ratio in two-time correlation functions.
The approach is based on Convolutional Neural Network Encoder-Decoder (CNN-ED) models.
We demonstrate that the CNN-ED models trained on real-world experimental data help to effectively extract equilibrium dynamics parameters from two-time correlation functions.
arXiv Detail & Related papers (2021-02-07T18:38:59Z)
- SMDS-Net: Model Guided Spectral-Spatial Network for Hyperspectral Image Denoising [10.597014770267672]
Deep learning (DL) based hyperspectral image (HSI) denoising approaches directly learn the nonlinear mapping between observed noisy images and underlying clean images.
We introduce a novel model guided interpretable network for HSI denoising.
arXiv Detail & Related papers (2020-12-03T11:05:01Z)
- Noise2Same: Optimizing A Self-Supervised Bound for Image Denoising [54.730707387866076]
We introduce Noise2Same, a novel self-supervised denoising framework.
In particular, Noise2Same requires neither J-invariance nor extra information about the noise model.
Our results show that Noise2Same remarkably outperforms previous self-supervised denoising methods.
arXiv Detail & Related papers (2020-10-22T18:12:26Z)
- Evolving Deep Convolutional Neural Networks for Hyperspectral Image Denoising [6.869192200282213]
We propose a novel algorithm to automatically build an optimal Convolutional Neural Network (CNN) to effectively denoise HSIs.
The experiments are well-designed, and the proposed algorithm is compared against state-of-the-art peer competitors.
arXiv Detail & Related papers (2020-08-15T03:04:11Z)
- Shape Matters: Understanding the Implicit Bias of the Noise Covariance [76.54300276636982]
Noise in gradient descent provides a crucial implicit regularization effect for training over-parameterized models.
We show that parameter-dependent noise -- induced by mini-batches or label perturbation -- is far more effective than Gaussian noise.
Our analysis reveals that parameter-dependent noise introduces a bias towards local minima with smaller noise variance, whereas spherical Gaussian noise does not.
arXiv Detail & Related papers (2020-06-15T18:31:02Z)
- Variational Denoising Network: Toward Blind Noise Modeling and Removal [59.36166491196973]
Blind image denoising is an important yet very challenging problem in computer vision.
We propose a new variational inference method, which integrates both noise estimation and image denoising.
arXiv Detail & Related papers (2019-08-29T15:54:06Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences arising from its use.