Long Short-Term Memory Spiking Networks and Their Applications
- URL: http://arxiv.org/abs/2007.04779v1
- Date: Thu, 9 Jul 2020 13:22:27 GMT
- Title: Long Short-Term Memory Spiking Networks and Their Applications
- Authors: Ali Lotfi Rezaabad and Sriram Vishwanath
- Abstract summary: We present a novel framework for training recurrent spiking neural networks (SNNs).
We show that LSTM spiking networks learn the timing of the spikes and temporal dependencies.
We also develop a methodology for error backpropagation within LSTM-based SNNs.
- Score: 10.071615423169902
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Recent advances in event-based neuromorphic systems have resulted in
significant interest in the use and development of spiking neural networks
(SNNs). However, the non-differentiable nature of spiking neurons makes SNNs
incompatible with conventional backpropagation techniques. In spite of the
significant progress made in training conventional deep neural networks (DNNs),
training methods for SNNs still remain relatively poorly understood. In this
paper, we present a novel framework for training recurrent SNNs. Analogous to
the benefits presented by recurrent neural networks (RNNs) in learning time
series models within DNNs, we develop SNNs based on long short-term memory
(LSTM) networks. We show that LSTM spiking networks learn the timing of the
spikes and temporal dependencies. We also develop a methodology for error
backpropagation within LSTM-based SNNs. The developed architecture and method
for backpropagation within LSTM-based SNNs enable them to learn long-term
dependencies with comparable results to conventional LSTMs.
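The surrogate-gradient idea behind training such non-differentiable spiking units can be illustrated with a small sketch: a hypothetical minimal spiking LSTM step in NumPy, not the authors' exact formulation. The gate layout, the spike threshold, and the fast-sigmoid surrogate shape are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def spike(v, threshold=0.5):
    """Non-differentiable spiking activation (Heaviside step)."""
    return (v > threshold).astype(float)

def surrogate_grad(v, threshold=0.5, beta=5.0):
    """Fast-sigmoid surrogate for the spike derivative, substituted for
    the undefined Heaviside gradient during backpropagation."""
    return beta / (2.0 * (1.0 + beta * np.abs(v - threshold)) ** 2)

def spiking_lstm_step(x, h, c, W, U, b):
    """One step of a hypothetical LSTM-style spiking cell: standard
    LSTM gate algebra, but the emitted hidden state is a binary spike
    vector produced by thresholding the gated cell output."""
    z = W @ x + U @ h + b                       # stacked gate pre-activations
    i, f, o, g = np.split(z, 4)
    sig = lambda a: 1.0 / (1.0 + np.exp(-a))
    c_new = sig(f) * c + sig(i) * np.tanh(g)    # memory cell stays analog
    h_new = spike(sig(o) * np.tanh(c_new))      # output is a 0/1 spike vector
    return h_new, c_new

# Tiny demo: 3 inputs, 4 hidden units, random weights.
nx, nh = 3, 4
W = rng.normal(0.0, 0.5, (4 * nh, nx))
U = rng.normal(0.0, 0.5, (4 * nh, nh))
b = np.zeros(4 * nh)
h, c = np.zeros(nh), np.zeros(nh)
for _ in range(5):
    h, c = spiking_lstm_step(rng.normal(size=nx), h, c, W, U, b)
print(h)  # binary spike pattern: every entry is 0.0 or 1.0
```

During the backward pass, wherever the chain rule would require the derivative of `spike`, `surrogate_grad` is used instead; this is the standard workaround for the non-differentiability discussed in the abstract.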
Related papers
- Scalable Mechanistic Neural Networks [52.28945097811129]
We propose an enhanced neural network framework designed for scientific machine learning applications involving long temporal sequences.
By reformulating the original Mechanistic Neural Network (MNN), the Scalable MNN (S-MNN) reduces the computational time and space complexities from cubic and quadratic in the sequence length, respectively, to linear.
Extensive experiments demonstrate that S-MNN matches the original MNN in precision while substantially reducing computational resources.
arXiv Detail & Related papers (2024-10-08T14:27:28Z)
- Direct Training High-Performance Deep Spiking Neural Networks: A Review of Theories and Methods [33.377770671553336]
Spiking neural networks (SNNs) offer a promising energy-efficient alternative to artificial neural networks (ANNs).
In this paper, we provide a new perspective to summarize the theories and methods for training deep SNNs with high performance.
arXiv Detail & Related papers (2024-05-06T09:58:54Z)
- Efficient and Effective Time-Series Forecasting with Spiking Neural Networks [47.371024581669516]
Spiking neural networks (SNNs) provide a unique pathway for capturing the intricacies of temporal data.
Applying SNNs to time-series forecasting is challenging due to difficulties in effective temporal alignment, complexities in encoding processes, and the absence of standardized guidelines for model selection.
We propose a framework for SNNs in time-series forecasting tasks, leveraging the efficiency of spiking neurons in processing temporal information.
arXiv Detail & Related papers (2024-02-02T16:23:50Z)
- ESL-SNNs: An Evolutionary Structure Learning Strategy for Spiking Neural Networks [20.33499499020257]
Spiking neural networks (SNNs) have manifested remarkable advantages in power consumption and event-driven property during the inference process.
We propose an efficient evolutionary structure learning framework for SNNs, named ESL-SNNs, to implement the sparse SNN training from scratch.
Our work presents a brand-new approach for sparse training of SNNs from scratch with biologically plausible evolutionary mechanisms.
arXiv Detail & Related papers (2023-06-06T14:06:11Z)
- A Hybrid Neural Coding Approach for Pattern Recognition with Spiking Neural Networks [53.31941519245432]
Brain-inspired spiking neural networks (SNNs) have demonstrated promising capabilities in solving pattern recognition tasks.
These SNNs are grounded on homogeneous neurons that utilize a uniform neural coding for information representation.
In this study, we argue that SNN architectures should be holistically designed to incorporate heterogeneous coding schemes.
arXiv Detail & Related papers (2023-05-26T02:52:12Z)
- Fluctuation-driven initialization for spiking neural network training [3.976291254896486]
Spiking neural networks (SNNs) underlie low-power, fault-tolerant information processing in the brain.
We develop a general strategy for SNNs inspired by the fluctuation-driven regime commonly observed in the brain.
arXiv Detail & Related papers (2022-06-21T09:48:49Z)
- Training High-Performance Low-Latency Spiking Neural Networks by Differentiation on Spike Representation [70.75043144299168]
Spiking Neural Network (SNN) is a promising energy-efficient AI model when implemented on neuromorphic hardware.
It is a challenge to efficiently train SNNs due to their non-differentiability.
We propose the Differentiation on Spike Representation (DSR) method, which achieves high performance despite this non-differentiability.
arXiv Detail & Related papers (2022-05-01T12:44:49Z)
- Deep Learning in Spiking Phasor Neural Networks [0.6767885381740952]
Spiking Neural Networks (SNNs) have attracted the attention of the deep learning community for use in low-latency, low-power neuromorphic hardware.
In this paper, we introduce Spiking Phasor Neural Networks (SPNNs).
SPNNs are based on complex-valued Deep Neural Networks (DNNs), representing phases by spike times.
arXiv Detail & Related papers (2022-04-01T15:06:15Z) - A Time Encoding approach to training Spiking Neural Networks [3.655021726150368]
Spiking Neural Networks (SNNs) have been gaining in popularity.
In this paper, we provide an extra tool to help us understand and train SNNs by using theory from the field of time encoding.
arXiv Detail & Related papers (2021-10-13T14:07:11Z) - Progressive Tandem Learning for Pattern Recognition with Deep Spiking
Neural Networks [80.15411508088522]
Spiking neural networks (SNNs) have shown advantages over traditional artificial neural networks (ANNs) for low latency and high computational efficiency.
We propose a novel ANN-to-SNN conversion and layer-wise learning framework for rapid and efficient pattern recognition.
arXiv Detail & Related papers (2020-07-02T15:38:44Z) - You Only Spike Once: Improving Energy-Efficient Neuromorphic Inference
to ANN-Level Accuracy [51.861168222799186]
Spiking Neural Networks (SNNs) are a type of neuromorphic, or brain-inspired network.
SNNs are sparse, accessing very few weights, and typically only use addition operations instead of the more power-intensive multiply-and-accumulate operations.
In this work, we aim to overcome the limitations of TTFS-encoded neuromorphic systems.
arXiv Detail & Related papers (2020-06-03T15:55:53Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.