Impact of internal noise on convolutional neural networks
- URL: http://arxiv.org/abs/2505.06611v1
- Date: Sat, 10 May 2025 11:49:37 GMT
- Title: Impact of internal noise on convolutional neural networks
- Authors: Ivan Kolesnikov, Nadezhda Semenova
- Abstract summary: We study the impact of noise on a simplified trained convolutional network. The propagation of uncorrelated noise depends on the statistical properties of the connection matrix. An analysis of the noise level in the network's output signal shows a strong correlation with the results of numerical simulations.
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: In this paper, we investigate the impact of noise on a simplified trained convolutional network. The types of noise studied originate from a real optical implementation of a neural network, but we generalize these types to enhance the applicability of our findings on a broader scale. The noise types considered include additive and multiplicative noise, which relate to how noise affects individual neurons, as well as correlated and uncorrelated noise, which pertain to how noise is distributed across the neurons of a single layer. We demonstrate that the propagation of uncorrelated noise primarily depends on the statistical properties of the connection matrices. Specifically, the mean value of the connection matrix following the layer impacted by noise governs the propagation of correlated additive noise, while the mean of its square contributes to the accumulation of uncorrelated noise. Additionally, we propose an analytical assessment of the noise level in the network's output signal, which shows a strong correlation with the results of numerical simulations.
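The mechanism described in the abstract can be illustrated with a minimal sketch for a single linear layer: for uncorrelated additive noise, each output neuron's variance is set by the sum of squared weights (i.e. the mean of the squared connection matrix), while for correlated noise, where the same noise sample perturbs every input neuron, the variance is set by the squared row sums (i.e. the mean of the connection matrix). The weight statistics and layer sizes below are illustrative assumptions, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
n_in, n_out = 64, 64
W = rng.normal(0.0, 0.1, size=(n_out, n_in))  # hypothetical trained weights
sigma = 0.01                                  # noise standard deviation per neuron
trials = 20000
x = rng.uniform(size=n_in)                    # fixed input signal

# Uncorrelated additive noise: independent sample for each input neuron.
noise_u = rng.normal(0.0, sigma, size=(trials, n_in))
y_u = (x + noise_u) @ W.T
var_u = y_u.var(axis=0).mean()                # Monte Carlo output variance
pred_u = sigma**2 * (W**2).sum(axis=1).mean()  # analytic: governed by mean of W^2

# Correlated additive noise: one shared sample perturbs all input neurons.
noise_c = rng.normal(0.0, sigma, size=(trials, 1))
y_c = (x + noise_c) @ W.T
var_c = y_c.var(axis=0).mean()
pred_c = sigma**2 * (W.sum(axis=1)**2).mean()  # analytic: governed by mean of W

print(f"uncorrelated: simulated {var_u:.3e}, predicted {pred_u:.3e}")
print(f"correlated:   simulated {var_c:.3e}, predicted {pred_c:.3e}")
```

In this toy setting the simulated variances match the analytic predictions to within Monte Carlo error, mirroring the strong agreement between theory and simulation reported in the abstract.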
Related papers
- Internal noise in hardware deep and recurrent neural networks helps with learning [0.0]
Internal noise during the training of neural networks affects the final performance of recurrent and deep neural networks. In most cases, both deep and echo state networks benefit from internal noise during training, as it enhances their resilience to noise.
arXiv Detail & Related papers (2025-04-18T16:26:46Z) - Impact of white noise in artificial neural networks trained for classification: performance and noise mitigation strategies [0.0]
We consider how additive and multiplicative Gaussian white noise on the neuronal level can affect the accuracy of the network.
We adapt several noise reduction techniques to the essential setting of classification tasks.
arXiv Detail & Related papers (2024-11-07T01:21:12Z) - Dynamics of decoherence in a noisy driven environment [0.0]
We show that decoherence due to the nonequilibrium critical dynamics of the environment is amplified in the presence of uncorrelated and correlated noise. We find that strong coupling between the qubit and the environment leads to partial revivals of decoherence. We explore the non-Markovianity of the dynamics, finding that it decays in the presence of noise but increases as the noise correlation time grows.
arXiv Detail & Related papers (2024-09-02T17:12:00Z) - Impact of white Gaussian internal noise on analog echo-state neural networks [0.0]
This paper studies the influence of noise on the functioning of recurrent networks using the example of trained echo state networks (ESNs)
We show that the propagation of noise in reservoir is mainly controlled by the statistical properties of the output connection matrix.
We also show that there are conditions under which even noise with an intensity of $10^{-20}$ is enough to completely lose the useful signal.
arXiv Detail & Related papers (2024-05-13T11:59:20Z) - Feature Noise Boosts DNN Generalization under Label Noise [65.36889005555669]
The presence of label noise in the training data has a profound impact on the generalization of deep neural networks (DNNs)
In this study, we introduce and theoretically demonstrate a simple feature noise method, which directly adds noise to the features of training data.
arXiv Detail & Related papers (2023-08-03T08:31:31Z) - Noise impact on recurrent neural network with linear activation function [0.0]
We study the peculiarities of internal noise propagation in recurrent ANN on the example of echo state network (ESN)
Here we consider the case when artificial neurons have linear activation function with different slope coefficients.
We have found that the general view of variance and signal-to-noise ratio of ESN output signal is similar to only one neuron.
arXiv Detail & Related papers (2023-03-23T13:43:05Z) - The Effect of Non-Gaussian Noise on Auto-correlative Weak-value Amplification [2.5631808142941415]
We study the effect of non-Gaussian noise on the auto-correlative weak-value amplification (AWVA) technique. In particular, two types of noise with a negative-dB signal-to-noise ratio, frequency-stationary and frequency-nonstationary noises, are studied.
arXiv Detail & Related papers (2022-09-26T14:34:42Z) - Robust Semantic Communications with Masked VQ-VAE Enabled Codebook [56.63571713657059]
We propose a framework for the robust end-to-end semantic communication systems to combat the semantic noise.
To combat the semantic noise, the adversarial training with weight is developed to incorporate the samples with semantic noise in the training dataset.
We develop a feature importance module (FIM) to suppress the noise-related and task-unrelated features.
arXiv Detail & Related papers (2022-06-08T16:58:47Z) - The Optimal Noise in Noise-Contrastive Learning Is Not What You Think [80.07065346699005]
We show that deviating from this assumption can actually lead to better statistical estimators.
In particular, the optimal noise distribution is different from the data's and even from a different family.
arXiv Detail & Related papers (2022-03-02T13:59:20Z) - Learning Noise via Dynamical Decoupling of Entangled Qubits [49.38020717064383]
Noise in entangled quantum systems is difficult to characterize due to many-body effects involving multiple degrees of freedom.
We develop and apply multi-qubit dynamical decoupling sequences that characterize noise that occurs during two-qubit gates.
arXiv Detail & Related papers (2022-01-26T20:22:38Z) - Removing Noise from Extracellular Neural Recordings Using Fully Convolutional Denoising Autoencoders [62.997667081978825]
We propose a Fully Convolutional Denoising Autoencoder, which learns to produce a clean neuronal activity signal from a noisy multichannel input.
The experimental results on simulated data show that our proposed method can significantly improve the quality of noise-corrupted neural signals.
arXiv Detail & Related papers (2021-09-18T14:51:24Z) - Adaptive noise imitation for image denoising [58.21456707617451]
We develop a new adaptive noise imitation (ADANI) algorithm that can synthesize noisy data from naturally noisy images.
To produce realistic noise, a noise generator takes unpaired noisy/clean images as input, where the noisy image is a guide for noise generation.
Coupling the noisy data output from ADANI with the corresponding ground-truth, a denoising CNN is then trained in a fully-supervised manner.
arXiv Detail & Related papers (2020-11-30T02:49:36Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.