Positive-incentive Noise
- URL: http://arxiv.org/abs/2212.09541v1
- Date: Mon, 19 Dec 2022 15:33:34 GMT
- Title: Positive-incentive Noise
- Authors: Xuelong Li
- Abstract summary: Noise is conventionally viewed as a severe problem in diverse fields, e.g., engineering and learning systems.
This paper aims to investigate whether the conventional proposition always holds.
$\pi$-noise offers new explanations for some models and provides a new principle for fields such as multi-task learning and adversarial training.
- Score: 91.3755431537592
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Noise is conventionally viewed as a severe problem in diverse fields,
e.g., engineering and learning systems. However, this paper investigates
whether that conventional proposition always holds. It begins with the
definition of task entropy, which extends information entropy to measure the
complexity of a task. With task entropy in place, noise can be classified
into two kinds, Positive-incentive noise (Pi-noise or $\pi$-noise) and pure
noise, according to whether the noise can reduce the complexity of the task.
Interestingly, as shown theoretically and empirically, even simple random
noise can be $\pi$-noise that simplifies the task. $\pi$-noise offers new
explanations for some models and provides a new principle for fields such as
multi-task learning and adversarial training. Moreover, it reminds us to
rethink the investigation of noise.
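The abstract's classification admits a compact information-theoretic statement. Below is a minimal formalization consistent with the definition above, where $\mathcal{T}$ denotes the task and $\epsilon$ the noise; the notation is ours and may differ from the paper's:

```latex
% Noise is pi-noise exactly when it reduces task entropy,
% i.e., when it carries positive mutual information about the task.
\[
  I(\mathcal{T}; \epsilon) \;=\; H(\mathcal{T}) - H(\mathcal{T} \mid \epsilon)
\]
\[
  \epsilon \text{ is } \pi\text{-noise} \iff I(\mathcal{T}; \epsilon) > 0,
  \qquad
  \epsilon \text{ is pure noise} \iff I(\mathcal{T}; \epsilon) = 0.
\]
```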
Related papers
- Data Augmentation of Contrastive Learning is Estimating Positive-incentive Noise [54.24688963649581]
We scientifically investigate the connection between contrastive learning and $\pi$-noise.
Inspired by the idea of Positive-incentive Noise (Pi-Noise or $\pi$-Noise), which aims at learning reliable noise beneficial to tasks, we develop a $\pi$-noise generator.
arXiv Detail & Related papers (2024-08-19T12:07:42Z)
- Understanding the Effect of Noise in LLM Training Data with Algorithmic Chains of Thought [0.0]
We study how noise in chains of thought impacts task performance in a highly controlled setting.
We define two types of noise: static noise, a local form of noise applied after the CoT trace is computed, and dynamic noise, a global form of noise that propagates errors in the trace as it is computed.
We find fine-tuned models are extremely robust to high levels of static noise but struggle significantly more with lower levels of dynamic noise.
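To make the two noise types concrete, here is a small hypothetical sketch (ours, not the paper's setup) using a running-sum chain of thought: static noise corrupts steps after the trace is computed, so errors stay local, while dynamic noise corrupts the accumulator mid-computation, so errors compound:

```python
import random

def clean_trace(xs):
    """Compute the running-sum CoT trace without noise."""
    total, trace = 0, []
    for x in xs:
        total += x
        trace.append(total)
    return trace

def static_noise(xs, p=0.3):
    """Static noise: perturb steps *after* the trace is computed,
    so each error stays local and does not propagate."""
    return [t + random.choice([-1, 1]) if random.random() < p else t
            for t in clean_trace(xs)]

def dynamic_noise(xs, p=0.3):
    """Dynamic noise: perturb the accumulator *as* the trace is computed,
    so each error propagates into every later step."""
    total, trace = 0, []
    for x in xs:
        total += x
        if random.random() < p:
            total += random.choice([-1, 1])
        trace.append(total)
    return trace

xs = [3, 1, 4, 1, 5]
print(clean_trace(xs))    # [3, 4, 8, 9, 14]
print(static_noise(xs))   # local corruptions
print(dynamic_noise(xs))  # compounding corruptions
```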
arXiv Detail & Related papers (2024-02-06T13:59:56Z) - NoisyNN: Exploring the Influence of Information Entropy Change in
Learning Systems [25.05692528736342]
We show that specific noise can boost the performance of various deep architectures under certain conditions.
We categorize the noise into two types, positive noise (PN) and harmful noise (HN), based on whether the noise can help reduce the complexity of the task.
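As a rough illustration of this categorization (ours, not the paper's procedure), one could test a candidate noise source empirically; `evaluate` and `noise_fn` below are hypothetical placeholders:

```python
import numpy as np

# Hypothetical check: a noise source counts as positive noise (PN) if
# injecting it improves held-out performance, harmful noise (HN) otherwise.
def classify_noise(evaluate, X_val, y_val, noise_fn, trials=20):
    baseline = evaluate(X_val, y_val)
    with_noise = np.mean([evaluate(noise_fn(X_val), y_val)
                          for _ in range(trials)])
    return "PN" if with_noise > baseline else "HN"
```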
arXiv Detail & Related papers (2023-09-19T14:04:04Z)
- Walking Noise: On Layer-Specific Robustness of Neural Architectures against Noisy Computations and Associated Characteristic Learning Dynamics [1.5184189132709105]
We discuss the implications of additive, multiplicative and mixed noise for different classification tasks and model architectures.
We propose a methodology called Walking Noise which injects layer-specific noise to measure the robustness.
We conclude with a discussion of the use of this methodology in practice, including its use for tailored multi-execution in noisy environments.
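A minimal PyTorch-style sketch of layer-specific noise injection; the forward-hook mechanism and additive Gaussian noise here are our assumptions, not necessarily the paper's implementation:

```python
import torch
import torch.nn as nn

# Hedged sketch: inject additive Gaussian noise at one chosen layer via a
# forward hook, then measure accuracy to probe that layer's robustness.
def add_noise_hook(std):
    def hook(module, inputs, output):
        return output + std * torch.randn_like(output)
    return hook

model = nn.Sequential(nn.Linear(784, 256), nn.ReLU(),
                      nn.Linear(256, 10))

# Attach noise only to the first layer; sweep `std` to trace robustness.
handle = model[0].register_forward_hook(add_noise_hook(std=0.5))
logits = model(torch.randn(32, 784))   # noisy forward pass
handle.remove()                        # restore the clean model
```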
arXiv Detail & Related papers (2022-12-20T17:09:08Z)
- CNT (Conditioning on Noisy Targets): A new Algorithm for Leveraging Top-Down Feedback [25.964963416932573]
We propose a novel regularizer for supervised learning called Conditioning on Noisy Targets (CNT).
This approach consists of conditioning the model on a noisy version of the target(s) at a random noise level.
At inference time, since we do not know the target, we run the network with only noise in place of the noisy target.
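A simplified sketch of how such conditioning might look (our rendering, not the authors' code): the network receives the input together with a noised target during training, and pure noise at inference:

```python
import torch
import torch.nn as nn

# Hedged sketch of Conditioning on Noisy Targets: the network sees the
# input plus a noisy version of the target at a random noise level.
class CNTModel(nn.Module):
    def __init__(self, in_dim, target_dim, hidden=64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(in_dim + target_dim, hidden), nn.ReLU(),
            nn.Linear(hidden, target_dim))

    def forward(self, x, noisy_target):
        return self.net(torch.cat([x, noisy_target], dim=-1))

def training_step(model, x, y):
    level = torch.rand(y.shape[0], 1)            # random noise level per sample
    noisy_y = (1 - level) * y + level * torch.randn_like(y)
    pred = model(x, noisy_y)                     # predict y from (x, noisy y)
    return nn.functional.mse_loss(pred, y)

def inference(model, x, target_dim):
    noise = torch.randn(x.shape[0], target_dim)  # no target available:
    return model(x, noise)                       # condition on pure noise
```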
arXiv Detail & Related papers (2022-10-18T00:54:40Z)
- Learning to Generate Realistic Noisy Images via Pixel-level Noise-aware Adversarial Training [50.018580462619425]
We propose a novel framework, namely Pixel-level Noise-aware Generative Adversarial Network (PNGAN).
PNGAN employs a pre-trained real denoiser to map the fake and real noisy images into a nearly noise-free solution space.
For better noise fitting, we present an efficient architecture Simple Multi-versa-scale Network (SMNet) as the generator.
arXiv Detail & Related papers (2022-04-06T14:09:02Z)
- The Optimal Noise in Noise-Contrastive Learning Is Not What You Think [80.07065346699005]
We show that deviating from the common assumption that the noise distribution should match the data distribution can actually lead to better statistical estimators.
In particular, the optimal noise distribution differs from the data's and even belongs to a different family.
arXiv Detail & Related papers (2022-03-02T13:59:20Z)
- C2N: Practical Generative Noise Modeling for Real-World Denoising [53.96391787869974]
We introduce a Clean-to-Noisy image generation framework, namely C2N, to imitate complex real-world noise without using paired examples.
We construct the noise generator in C2N in accordance with each component of real-world noise characteristics to express a wide range of noise accurately.
arXiv Detail & Related papers (2022-02-19T05:53:46Z)
- Dual Adversarial Network: Toward Real-world Noise Removal and Noise Generation [52.75909685172843]
Real-world image noise removal is a long-standing yet very challenging task in computer vision.
We propose a novel unified framework to deal with the noise removal and noise generation tasks.
Our method learns the joint distribution of the clean-noisy image pairs.
arXiv Detail & Related papers (2020-07-12T09:16:06Z)