NoisyNN: Exploring the Influence of Information Entropy Change in
Learning Systems
- URL: http://arxiv.org/abs/2309.10625v3
- Date: Fri, 2 Feb 2024 20:02:38 GMT
- Authors: Xiaowei Yu, Zhe Huang, Yao Xue, Lu Zhang, Li Wang, Tianming Liu,
Dajiang Zhu
- Abstract summary: We show that specific noise can boost the performance of various deep architectures under certain conditions.
We categorize the noise into two types, positive noise (PN) and harmful noise (HN), based on whether the noise can help reduce the complexity of the task.
- Score: 25.05692528736342
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: We explore the impact of entropy change in deep learning systems via noise
injection at different levels, i.e., the latent space and input image. The
series of models that employ our methodology are collectively known as Noisy
Neural Networks (NoisyNN), with examples such as NoisyViT and NoisyCNN. Noise
is conventionally viewed as a harmful perturbation in various deep learning
architectures, such as convolutional neural networks (CNNs) and vision
transformers (ViTs), as well as different learning tasks like image
classification and transfer learning. However, this work shows that noise can be
an effective way to change the entropy of the learning system. We demonstrate that
specific noise can boost the performance of various deep architectures under
certain conditions. We theoretically prove the enhancement gained from positive
noise by reducing the task complexity defined by information entropy and
experimentally show significant performance gains on large image datasets such
as ImageNet. Throughout, we use information entropy to define the complexity of
the task. We categorize noise into two types, positive noise (PN) and harmful
noise (HN), based on whether it helps reduce the complexity of the task.
Extensive experiments on CNNs and ViTs have shown performance improvements from
proactively injecting positive noise, where we achieved an unprecedented top-1
accuracy of over 95% on ImageNet. Both
theoretical analysis and empirical evidence confirm that the presence of
positive noise can benefit the learning process, while traditionally perceived
harmful noise impairs deep learning models. The different roles of noise offer
new explanations for the behavior of deep models on specific tasks and provide
a new paradigm for improving model performance. Moreover, they remind us that
we can influence the performance of learning systems via information entropy
change.
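The latent-space injection described in the abstract can be sketched as follows. This is a minimal illustration, not the authors' implementation: `inject_positive_noise` and its mixing strength `alpha` are hypothetical names, and the linear mixing of in-batch latents is only one simple family of perturbation; whether a given perturbation is PN or HN is, per the abstract, determined by its effect on the task's information entropy, which this sketch does not measure.

```python
import numpy as np

def inject_positive_noise(latents: np.ndarray, alpha: float = 0.1) -> np.ndarray:
    """Sketch of linear-transform noise injected into a batch of latents.

    Each sample is mixed with the latent of another sample in the batch;
    `alpha` is an illustrative mixing strength, not a value from the paper.
    """
    # Roll the batch by one so every sample is perturbed by a neighbor's latent.
    shuffled = np.roll(latents, shift=1, axis=0)
    return (1.0 - alpha) * latents + alpha * shuffled
```

Because the mix is convex and `np.roll` only permutes the batch, the perturbed batch preserves the original batch statistics in aggregate while shifting each individual latent.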
Related papers
- Data Augmentation in Training CNNs: Injecting Noise to Images [0.0]
This study analyzes the effects of adding or applying different noise models of varying magnitudes to CNN architectures.
The basic results conform to most common notions in machine learning.
New approaches will provide a better understanding of optimal learning procedures for image classification.
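As a concrete illustration of the kind of augmentation that study analyzes, the sketch below applies two common noise models at configurable magnitudes. Function names are our own, assuming images normalized to [0, 1]; the study's exact noise models and magnitudes are not reproduced here.

```python
import numpy as np

def add_gaussian_noise(img: np.ndarray, sigma: float = 0.1, seed: int = 0) -> np.ndarray:
    """Additive white Gaussian noise, clipped back to the valid [0, 1] range."""
    rng = np.random.default_rng(seed)
    return np.clip(img + rng.normal(0.0, sigma, size=img.shape), 0.0, 1.0)

def add_salt_pepper_noise(img: np.ndarray, amount: float = 0.05, seed: int = 0) -> np.ndarray:
    """Set roughly an `amount` fraction of pixels to pure black or white."""
    rng = np.random.default_rng(seed)
    noisy = img.copy()
    u = rng.random(img.shape)
    noisy[u < amount / 2] = 0.0          # pepper
    noisy[u > 1.0 - amount / 2] = 1.0    # salt
    return noisy
```

In an augmentation pipeline, such functions would be applied to training images on the fly, with `sigma` or `amount` controlling the perturbation magnitude being studied.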
arXiv Detail & Related papers (2023-07-12T17:29:42Z) - Masked Image Training for Generalizable Deep Image Denoising [53.03126421917465]
We present a novel approach to enhance the generalization performance of denoising networks.
Our method involves masking random pixels of the input image and reconstructing the missing information during training.
Our approach exhibits better generalization ability than other deep learning models and is directly applicable to real-world scenarios.
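The masking step described above can be sketched as follows. This is a hypothetical helper, not the authors' code: it zeroes a random fraction of pixel locations and returns the mask so a reconstruction loss can target the missing regions during training.

```python
import numpy as np

def mask_random_pixels(img: np.ndarray, mask_ratio: float = 0.5, seed: int = 0):
    """Zero out a random `mask_ratio` fraction of pixel locations.

    Returns the masked image and the boolean mask; during training the
    network would be asked to reconstruct the original values under the mask.
    """
    rng = np.random.default_rng(seed)
    mask = rng.random(img.shape[:2]) < mask_ratio  # one decision per location
    masked = img.copy()
    masked[mask] = 0.0  # zeros all channels at masked locations
    return masked, mask
```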
arXiv Detail & Related papers (2023-03-23T09:33:44Z) - Noise Injection as a Probe of Deep Learning Dynamics [0.0]
We propose a new method to probe the learning mechanism of Deep Neural Networks (DNNs) by perturbing the system using Noise Injection Nodes (NINs).
We find that the system displays distinct phases during training, dictated by the scale of injected noise.
In some cases, the evolution of the noise nodes is similar to that of the unperturbed loss, thus indicating the possibility of using NINs to learn more about the full system in the future.
arXiv Detail & Related papers (2022-10-24T20:51:59Z) - Treatment Learning Causal Transformer for Noisy Image Classification [62.639851972495094]
In this work, we incorporate the binary information of "existence of noise" as a treatment in image classification tasks to improve prediction accuracy.
Motivated by causal variational inference, we propose a transformer-based architecture that uses a latent generative model to estimate robust feature representations for noisy image classification.
We also create new noisy image datasets incorporating a wide range of noise factors for performance benchmarking.
arXiv Detail & Related papers (2022-03-29T13:07:53Z) - Practical Blind Image Denoising via Swin-Conv-UNet and Data Synthesis [148.16279746287452]
We propose a Swin-Conv block that combines the local modeling ability of the residual convolutional layer with the non-local modeling ability of the Swin Transformer block.
For the training data synthesis, we design a practical noise degradation model which takes into consideration different kinds of noise.
Experiments on AWGN removal and real image denoising demonstrate that the new network architecture design achieves state-of-the-art performance.
arXiv Detail & Related papers (2022-03-24T18:11:31Z) - Physics-based Noise Modeling for Extreme Low-light Photography [63.65570751728917]
We study the noise statistics in the imaging pipeline of CMOS photosensors.
We formulate a comprehensive noise model that can accurately characterize the real noise structures.
Our noise model can be used to synthesize realistic training data for learning-based low-light denoising algorithms.
arXiv Detail & Related papers (2021-08-04T16:36:29Z) - Unpaired Learning of Deep Image Denoising [80.34135728841382]
This paper presents a two-stage scheme by incorporating self-supervised learning and knowledge distillation.
For self-supervised learning, we suggest a dilated blind-spot network (D-BSN) to learn denoising solely from real noisy images.
Experiments show that our unpaired learning method performs favorably on both synthetic noisy images and real-world noisy photographs.
arXiv Detail & Related papers (2020-08-31T16:22:40Z) - Robust Processing-In-Memory Neural Networks via Noise-Aware
Normalization [26.270754571140735]
PIM accelerators often suffer from intrinsic noise in the physical components.
We propose a noise-agnostic method to achieve robust neural network performance against any noise setting.
arXiv Detail & Related papers (2020-07-07T06:51:28Z) - Variational Denoising Network: Toward Blind Noise Modeling and Removal [59.36166491196973]
Blind image denoising is an important yet very challenging problem in computer vision.
We propose a new variational inference method, which integrates both noise estimation and image denoising.
arXiv Detail & Related papers (2019-08-29T15:54:06Z)
This list is automatically generated from the titles and abstracts of the papers in this site.