Exploiting Noise as a Resource for Computation and Learning in Spiking
Neural Networks
- URL: http://arxiv.org/abs/2305.16044v6
- Date: Fri, 15 Sep 2023 02:55:04 GMT
- Title: Exploiting Noise as a Resource for Computation and Learning in Spiking
Neural Networks
- Authors: Gehua Ma, Rui Yan, Huajin Tang
- Abstract summary: This study introduces the noisy spiking neural network (NSNN) and the noise-driven learning rule (NDL).
NSNN provides a theoretical framework that yields scalable, flexible, and reliable computation.
- Score: 32.0086664373154
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Formal version available at https://cell.com/patterns/fulltext/S2666-3899(23)00200-3
Networks of spiking neurons underpin the extraordinary information-processing
capabilities of the brain and have become pillar models in neuromorphic
artificial intelligence. Despite extensive research on spiking neural networks
(SNNs), most studies are established on deterministic models, overlooking the
inherent non-deterministic, noisy nature of neural computations. This study
introduces the noisy spiking neural network (NSNN) and the noise-driven
learning rule (NDL) by incorporating noisy neuronal dynamics to exploit the
computational advantages of noisy neural processing. NSNN provides a
theoretical framework that yields scalable, flexible, and reliable computation.
We demonstrate that NSNN leads to spiking neural models with competitive
performance, improved robustness against challenging perturbations compared
with deterministic SNNs, and better reproduction of probabilistic computations
in neural coding. This study offers a powerful and easy-to-use tool for machine
learning and neuromorphic intelligence practitioners as well as computational
neuroscience researchers.
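As a rough illustration of the core idea, and not the authors' implementation, the sketch below simulates a leaky integrate-and-fire neuron whose membrane potential is perturbed by Gaussian noise, so that each time step emits a spike as a Bernoulli draw with probability given by the noise distribution's tail above threshold. All names and constants (tau, sigma, threshold) are illustrative assumptions.

```python
# Minimal sketch of the noisy-spiking idea (not the authors' code):
# a discrete-time LIF neuron with Gaussian membrane noise, so spiking is
# probabilistic: P(spike) = P(u + eps >= threshold), eps ~ N(0, sigma^2).
import math
import numpy as np

rng = np.random.default_rng(0)

def phi(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def simulate_noisy_lif(inputs, tau=20.0, threshold=1.0, sigma=0.2, dt=1.0):
    """Simulate one noisy LIF neuron; return spikes and firing probabilities."""
    u = 0.0                                         # membrane potential
    spikes, probs = [], []
    for x in inputs:
        u += (dt / tau) * (-u) + x                  # leaky integration of input
        p = 1.0 - phi((threshold - u) / sigma)      # escape (firing) probability
        s = float(rng.random() < p)                 # Bernoulli spike draw
        if s:
            u = 0.0                                 # reset after a spike
        spikes.append(s)
        probs.append(p)
    return np.array(spikes), np.array(probs)

spikes, probs = simulate_noisy_lif(rng.uniform(0.0, 0.3, size=200))
print(f"spikes: {int(spikes.sum())}, mean firing probability: {probs.mean():.3f}")
```

Because the firing probability is a smooth function of the membrane potential, it admits well-defined gradients where a hard threshold does not, which is, loosely, the intuition behind noise-driven learning.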
Related papers
- Enhancing learning in spiking neural networks through neuronal heterogeneity and neuromodulatory signaling [52.06722364186432]
We propose a biologically informed framework for enhancing artificial neural networks (ANNs).
Our proposed dual-framework approach highlights the potential of spiking neural networks (SNNs) for emulating diverse spiking behaviors.
We outline how the proposed approach integrates brain-inspired compartmental models and task-driven SNNs, balancing bio-inspiration with computational complexity.
arXiv Detail & Related papers (2024-07-05T14:11:28Z)
- Random-coupled Neural Network [17.53731608985241]
Pulse-coupled neural network (PCNN) is a widely applied model for imitating characteristics of the human brain in the computer vision and neural network fields.
This study proposes the random-coupled neural network (RCNN).
It overcomes difficulties in PCNN's neuromorphic computing via a random inactivation process.
arXiv Detail & Related papers (2024-03-26T09:13:06Z)
- A Hybrid Neural Coding Approach for Pattern Recognition with Spiking Neural Networks [53.31941519245432]
Brain-inspired spiking neural networks (SNNs) have demonstrated promising capabilities in solving pattern recognition tasks.
These SNNs are built on homogeneous neurons that use a uniform neural coding for information representation.
In this study, we argue that SNN architectures should be holistically designed to incorporate heterogeneous coding schemes.
arXiv Detail & Related papers (2023-05-26T02:52:12Z)
- Toward stochastic neural computing [11.955322183964201]
We propose a theory of neural computing in which streams of noisy inputs are transformed and processed through populations of spiking neurons.
We demonstrate the application of our method to Intel's Loihi neuromorphic hardware.
arXiv Detail & Related papers (2023-05-23T12:05:35Z)
- Linear Leaky-Integrate-and-Fire Neuron Model Based Spiking Neural Networks and Its Mapping Relationship to Deep Neural Networks [7.840247953745616]
Spiking neural networks (SNNs) are brain-inspired machine learning algorithms with merits such as biological plausibility and unsupervised learning capability.
This paper establishes a precise mathematical mapping between the biological parameters of the Linear Leaky-Integrate-and-Fire (LIF) model underlying SNNs and the parameters of ReLU-AN/Deep Neural Networks (DNNs); a toy rate-coding illustration of this kind of mapping appears after this list.
arXiv Detail & Related papers (2022-05-31T17:02:26Z)
- Deep Reinforcement Learning Guided Graph Neural Networks for Brain Network Analysis [61.53545734991802]
We propose a novel brain network representation framework, namely BN-GNN, which searches for the optimal GNN architecture for each brain network.
Our proposed BN-GNN improves the performance of traditional GNNs on different brain network analysis tasks.
arXiv Detail & Related papers (2022-03-18T07:05:27Z)
- Progressive Tandem Learning for Pattern Recognition with Deep Spiking Neural Networks [80.15411508088522]
Spiking neural networks (SNNs) have shown advantages over traditional artificial neural networks (ANNs) for low latency and high computational efficiency.
We propose a novel ANN-to-SNN conversion and layer-wise learning framework for rapid and efficient pattern recognition; a generic conversion sketch appears after this list.
arXiv Detail & Related papers (2020-07-02T15:38:44Z)
- Effective and Efficient Computation with Multiple-timescale Spiking Recurrent Neural Networks [0.9790524827475205]
We show how a novel type of adaptive spiking recurrent neural network (SRNN) is able to achieve state-of-the-art performance.
We calculate a >100x energy improvement for our SRNNs over classical RNNs on the harder tasks.
arXiv Detail & Related papers (2020-05-24T01:04:53Z)
- Recurrent Neural Network Learning of Performance and Intrinsic Population Dynamics from Sparse Neural Data [77.92736596690297]
We introduce a novel training strategy that allows learning not only the input-output behavior of an RNN but also its internal network dynamics.
We test the proposed method by training an RNN to simultaneously reproduce internal dynamics and output signals of a physiologically-inspired neural model.
Remarkably, we show that the reproduction of the internal dynamics is successful even when the training algorithm relies on the activities of a small subset of neurons.
arXiv Detail & Related papers (2020-05-05T14:16:54Z)
- Rectified Linear Postsynaptic Potential Function for Backpropagation in Deep Spiking Neural Networks [55.0627904986664]
Spiking Neural Networks (SNNs) use temporal spike patterns to represent and transmit information, which is not only biologically realistic but also well suited to ultra-low-power, event-driven neuromorphic implementation.
This paper investigates the contribution of spike timing dynamics to information encoding, synaptic plasticity, and decision making, providing a new perspective on the design of future deep SNNs and neuromorphic hardware systems.
arXiv Detail & Related papers (2020-03-26T11:13:07Z)
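The Linear Leaky-Integrate-and-Fire entry above reports a precise parameter mapping between LIF/SNNs and ReLU-AN/DNNs; the exact correspondence is given in that paper. As a generic, hedged illustration of why such mappings exist at all, the sketch below demonstrates the classic rate-coding observation that a non-leaky integrate-and-fire neuron driven by a constant current produces a firing rate close to a scaled ReLU of that current. Names and constants are illustrative.

```python
# Illustrative sketch (not the cited paper's exact mapping): under rate coding,
# an integrate-and-fire (IF) neuron driven by a constant current I for T steps
# emits roughly T * max(I, 0) / threshold spikes, i.e. a scaled ReLU of I.

def if_spike_count(current, steps=1000, threshold=1.0):
    """Count spikes of a non-leaky IF neuron under a constant input current."""
    u, count = 0.0, 0
    for _ in range(steps):
        u += current                 # integrate the input
        if u >= threshold:
            u -= threshold           # soft reset keeps the residual charge
            count += 1
    return count

for current in (-0.5, 0.0, 0.2, 0.5):
    rate = if_spike_count(current) / 1000
    relu = max(current, 0.0)
    print(f"I={current:+.1f}  IF rate={rate:.3f}  ReLU(I)={relu:.3f}")
```

The cited paper goes further by deriving the mapping for the leaky case, where the correspondence between biological parameters and ReLU-AN parameters must be stated precisely rather than read off from this toy limit.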
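Similarly, the Progressive Tandem Learning entry describes a specific ANN-to-SNN conversion and layer-wise learning framework whose details are in that paper. Purely to make the general conversion idea concrete, the following sketch shows a common baseline step, data-based weight normalization, in which each ReLU layer's weights are rescaled by the ratio of consecutive layers' peak activations on calibration data. All function names and shapes here are assumptions.

```python
# Generic baseline sketch of ANN-to-SNN weight normalization (not the cited
# paper's progressive tandem scheme). Each layer's weights are rescaled by the
# ratio of consecutive layers' maximum ReLU activations on calibration data.
import numpy as np

def normalize_for_snn(weights, calib_x):
    """weights: list of (W, b) for ReLU layers; returns rescaled copies."""
    normalized, prev_max = [], 1.0
    a = calib_x
    for W, b in weights:
        a = np.maximum(a @ W + b, 0.0)      # ReLU activation on calibration data
        cur_max = max(a.max(), 1e-8)        # layer's peak activation (guarded)
        normalized.append((W * prev_max / cur_max, b / cur_max))
        prev_max = cur_max
    return normalized

rng = np.random.default_rng(1)
layers = [(rng.standard_normal((8, 16)), np.zeros(16)),
          (rng.standard_normal((16, 4)), np.zeros(4))]
snn_layers = normalize_for_snn(layers, rng.standard_normal((32, 8)))
print([W.shape for W, _ in snn_layers])
```

After rescaling, activations are bounded so that a converted SNN with unit thresholds can represent them as firing rates without saturating.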
This list is automatically generated from the titles and abstracts of the papers on this site.
The site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.