Noise impact on recurrent neural network with linear activation function
- URL: http://arxiv.org/abs/2303.13262v1
- Date: Thu, 23 Mar 2023 13:43:05 GMT
- Title: Noise impact on recurrent neural network with linear activation function
- Authors: V.M. Moskvitin, N. Semenova
- Abstract summary: We study the peculiarities of internal noise propagation in recurrent ANNs using the example of an echo state network (ESN).
Here we consider the case when artificial neurons have a linear activation function with different slope coefficients.
We have found that the general behaviour of the variance and signal-to-noise ratio of the ESN output signal is similar to that of a single neuron.
- Score: 0.0
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: In recent years, more and more researchers in the field of neural
networks have become interested in creating hardware implementations where
neurons and the connections between them are realized physically. The physical
implementation of an ANN fundamentally changes the features of noise influence.
In the case of hardware ANNs, there are many internal sources of noise with
different properties. The purpose of this paper is to study the peculiarities
of internal noise propagation in recurrent ANNs using the example of an echo
state network (ESN), to reveal ways to suppress such noise, and to justify the
stability of networks to some types of noise.
In this paper we analyse an ESN in the presence of uncorrelated additive and
multiplicative white Gaussian noise. Here we consider the case when artificial
neurons have a linear activation function with different slope coefficients.
Starting from a single noisy neuron, we complicate the problem by considering
how the input signal and the memory property affect the accumulation of noise
in the ESN. In addition, we consider the influence of the main types of
coupling matrices on the accumulation of noise. As such matrices, we take a
uniform matrix and diagonal-like matrices with different values of a
coefficient called the "blurring" coefficient.
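For intuition, consider the single noisy neuron from which the analysis starts. In a minimal sketch (the notation is ours, not necessarily the paper's: slope $\alpha$, input $x$, additive noise $\xi_a \sim \mathcal{N}(0,\sigma_a^2)$, multiplicative noise $\xi_m \sim \mathcal{N}(0,\sigma_m^2)$), a linear neuron subject to both noise types outputs

$$ y = \alpha x\,(1 + \xi_m) + \xi_a, \qquad \langle y \rangle = \alpha x, $$

and, since the two noise terms are independent and zero-mean,

$$ \mathrm{Var}(y) = \alpha^2 x^2 \sigma_m^2 + \sigma_a^2, \qquad \mathrm{SNR} = \frac{\langle y \rangle^2}{\mathrm{Var}(y)} = \frac{\alpha^2 x^2}{\alpha^2 x^2 \sigma_m^2 + \sigma_a^2}. $$

The multiplicative contribution scales with the signal itself, which is why its accumulation over recurrent time steps behaves differently from the additive one.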
We have found that the general behaviour of the variance and signal-to-noise
ratio of the ESN output signal is similar to that of a single neuron. Noise
accumulates less in an ESN whose diagonal-like reservoir connection matrix has
a large "blurring" coefficient; this especially concerns uncorrelated
multiplicative noise.
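To make the coupling-matrix comparison concrete, here is a minimal numerical sketch (our own illustration, not the authors' code): a small linear ESN driven by a toy input, with uncorrelated additive and multiplicative white Gaussian noise injected into every neuron. The matrix constructions, noise intensities, and the mean-over-neurons readout are assumptions made for this example.

```python
import numpy as np

rng = np.random.default_rng(0)

def uniform_matrix(n, rho=0.9):
    """All-equal coupling, rescaled to spectral radius rho."""
    W = np.ones((n, n))
    return rho * W / np.max(np.abs(np.linalg.eigvals(W)))

def blurred_diagonal_matrix(n, blur=0.5, rho=0.9):
    """Diagonal-like coupling whose off-diagonal mass decays with distance
    from the diagonal; a larger `blur` spreads each neuron's output over
    more neighbours (a stand-in for the paper's "blurring" coefficient)."""
    i = np.arange(n)
    W = np.exp(-np.abs(i[:, None] - i[None, :]) / (blur * n))
    return rho * W / np.max(np.abs(np.linalg.eigvals(W)))

def run_esn(W, w_in, u, alpha=1.0, sigma_a=0.0, sigma_m=0.0, seed=1):
    """Linear reservoir x[t+1] = alpha*(W x[t] + w_in u[t]) with per-neuron
    multiplicative and additive white Gaussian noise."""
    noise = np.random.default_rng(seed)
    x = np.zeros(W.shape[0])
    out = np.empty(len(u))
    for t, u_t in enumerate(u):
        pre = alpha * (W @ x + w_in * u_t)              # linear activation
        x = (pre * (1.0 + sigma_m * noise.standard_normal(len(x)))
             + sigma_a * noise.standard_normal(len(x)))
        out[t] = x.mean()                               # crude stand-in readout
    return out

n = 100
w_in = rng.standard_normal(n)
u = np.sin(np.linspace(0.0, 20.0 * np.pi, 2000))        # toy periodic input

for name, W in [("uniform", uniform_matrix(n)),
                ("diagonal-like, large blur", blurred_diagonal_matrix(n))]:
    clean = run_esn(W, w_in, u)                         # noise-free reference
    noisy = run_esn(W, w_in, u, sigma_a=1e-3, sigma_m=1e-3)
    noise_var = np.var(noisy - clean)
    print(f"{name}: noise variance {noise_var:.3e}, "
          f"SNR {np.var(clean) / noise_var:.3e}")
```

Rescaling both matrices to the same spectral radius keeps the comparison about coupling structure rather than overall gain; varying `sigma_a` and `sigma_m` separately isolates the additive and multiplicative contributions.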
Related papers
- Impact of internal noise on convolutional neural networks [0.0]
We study the impact of noise on a simplified trained convolutional network. The propagation of uncorrelated noise depends on the statistical properties of the connection matrix. An analysis of the noise level in the network's output signal shows a strong correlation with the results of numerical simulations.
arXiv Detail & Related papers (2025-05-10T11:49:37Z) - Internal noise in hardware deep and recurrent neural networks helps with learning [0.0]
Internal noise during training affects the final performance of recurrent and deep neural networks.
In most cases, both deep and echo state networks benefit from internal noise during training, as it enhances their resilience to noise.
arXiv Detail & Related papers (2025-04-18T16:26:46Z) - Impact of white noise in artificial neural networks trained for classification: performance and noise mitigation strategies [0.0]
We consider how additive and multiplicative Gaussian white noise on the neuronal level can affect the accuracy of the network.
We adapt several noise reduction techniques to the essential setting of classification tasks.
arXiv Detail & Related papers (2024-11-07T01:21:12Z) - Impact of white Gaussian internal noise on analog echo-state neural networks [0.0]
This paper studies the influence of noise on the functioning of recurrent networks using the example of trained echo state networks (ESNs).
We show that the propagation of noise in the reservoir is mainly controlled by the statistical properties of the output connection matrix.
We also show that there are conditions under which even noise with an intensity of $10^{-20}$ is already enough to completely lose the useful signal.
arXiv Detail & Related papers (2024-05-13T11:59:20Z) - Non Commutative Convolutional Signal Models in Neural Networks:
Stability to Small Deformations [111.27636893711055]
We study the filtering and stability properties of non commutative convolutional filters.
Our results have direct implications for group neural networks, multigraph neural networks and quaternion neural networks.
arXiv Detail & Related papers (2023-10-05T20:27:22Z) - Permutation Equivariant Neural Functionals [92.0667671999604]
This work studies the design of neural networks that can process the weights or gradients of other neural networks.
We focus on the permutation symmetries that arise in the weights of deep feedforward networks because hidden layer neurons have no inherent order.
In our experiments, we find that permutation equivariant neural functionals are effective on a diverse set of tasks.
arXiv Detail & Related papers (2023-02-27T18:52:38Z) - Effects of noise on the overparametrization of quantum neural networks [0.0]
We show that noise can "turn on" previously-zero eigenvalues of the quantum Fisher information matrix (QFIM).
Our results imply that current QNN capacity measures are ill-defined when hardware noise is present.
arXiv Detail & Related papers (2023-02-10T05:33:52Z) - Physics-informed Neural Networks with Unknown Measurement Noise [0.6906005491572401]
We show that the standard PINN framework breaks down in case of non-Gaussian noise.
We propose to jointly train an energy-based model (EBM) to learn the correct noise distribution.
arXiv Detail & Related papers (2022-11-28T16:17:47Z) - Cross-Frequency Coupling Increases Memory Capacity in Oscillatory Neural
Networks [69.42260428921436]
Cross-frequency coupling (CFC) is associated with information integration across populations of neurons.
We construct a model of CFC which predicts a computational role for observed $\theta$-$\gamma$ oscillatory circuits in the hippocampus and cortex.
We show that the presence of CFC increases the memory capacity of a population of neurons connected by plastic synapses.
arXiv Detail & Related papers (2022-04-05T17:13:36Z) - Learning Noise via Dynamical Decoupling of Entangled Qubits [49.38020717064383]
Noise in entangled quantum systems is difficult to characterize due to many-body effects involving multiple degrees of freedom.
We develop and apply multi-qubit dynamical decoupling sequences that characterize noise that occurs during two-qubit gates.
arXiv Detail & Related papers (2022-01-26T20:22:38Z) - Understanding and mitigating noise in trained deep neural networks [0.0]
We study the propagation of noise in deep neural networks comprising noisy nonlinear neurons in trained fully connected layers.
We find that noise accumulation is generally bounded, and adding additional network layers does not worsen the signal-to-noise ratio beyond a limit.
We identify criteria allowing engineers to design noise-resilient novel neural network hardware.
arXiv Detail & Related papers (2021-03-12T17:16:26Z) - Asymmetric Heavy Tails and Implicit Bias in Gaussian Noise Injections [73.95786440318369]
We focus on the so-called 'implicit effect' of GNIs, which is the effect of the injected noise on the dynamics of stochastic gradient descent (SGD).
We show that this effect induces an asymmetric heavy-tailed noise on gradient updates.
We then formally prove that GNIs induce an 'implicit bias', which varies depending on the heaviness of the tails and the level of asymmetry.
arXiv Detail & Related papers (2021-02-13T21:28:09Z) - Deep Networks for Direction-of-Arrival Estimation in Low SNR [89.45026632977456]
We introduce a Convolutional Neural Network (CNN) that is trained from multi-channel data of the true array manifold matrix.
We train a CNN in the low-SNR regime to predict DoAs across all SNRs.
Our robust solution can be applied in several fields, ranging from wireless array sensors to acoustic microphones or sonars.
arXiv Detail & Related papers (2020-11-17T12:52:18Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences.