Skip-Connected Self-Recurrent Spiking Neural Networks with Joint
Intrinsic Parameter and Synaptic Weight Training
- URL: http://arxiv.org/abs/2010.12691v1
- Date: Fri, 23 Oct 2020 22:27:13 GMT
- Title: Skip-Connected Self-Recurrent Spiking Neural Networks with Joint
Intrinsic Parameter and Synaptic Weight Training
- Authors: Wenrui Zhang, Peng Li
- Abstract summary: We propose a new class of RSNNs called Skip-Connected Self-Recurrent SNNs (ScSr-SNNs).
ScSr-SNNs can boost performance by up to 2.55% compared with other types of RSNNs trained by state-of-the-art BP methods.
- Score: 14.992756670960008
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: As an important class of spiking neural networks (SNNs), recurrent spiking
neural networks (RSNNs) possess great computational power and have been widely
used for processing sequential data like audio and text. However, most RSNNs
suffer from two problems: (1) due to a lack of architectural guidance, random
recurrent connectivity is often adopted, which does not guarantee good
performance; and (2) training RSNNs is generally challenging, which bottlenecks
achievable model accuracy. To address these problems, we propose a new class of
RSNNs called Skip-Connected Self-Recurrent SNNs (ScSr-SNNs). Recurrence in
ScSr-SNNs is introduced in a stereotyped manner by adding self-recurrent
connections to spiking neurons, which implement local memory. The network
dynamics are enriched by skip connections between nonadjacent layers.
Constructed by simplified self-recurrent and skip connections, ScSr-SNNs are
able to realize recurrent behaviors similar to those of more complex RSNNs
while the error gradients can be more straightforwardly calculated due to the
mostly feedforward nature of the network. Moreover, we propose a new
backpropagation (BP) method called backpropagated intrinsic plasticity (BIP) to
further boost the performance of ScSr-SNNs by training intrinsic model
parameters. Unlike standard intrinsic plasticity rules that adjust the neuron's
intrinsic parameters according to neuronal activity, the proposed BIP method
optimizes intrinsic parameters based on the backpropagated error gradient of a
well-defined global loss function in addition to synaptic weight training.
On challenging speech and neuromorphic speech datasets, including TI46-Alpha,
TI46-Digits, and N-TIDIGITS, the proposed ScSr-SNNs boost performance by up to
2.55% compared with other types of RSNNs trained by state-of-the-art BP
methods.
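To make the mechanisms above concrete, here is a minimal PyTorch sketch, not the authors' implementation: a LIF layer with a diagonal, per-neuron self-recurrent connection (the local memory), a skip connection that bypasses one layer, and a leak treated as a trainable intrinsic parameter so that one backpropagated loss gradient updates it together with the synaptic weights, in the spirit of BIP. The neuron model, surrogate gradient, threshold, reset rule, and all sizes are illustrative assumptions.

```python
import torch
import torch.nn as nn

class SurrogateSpike(torch.autograd.Function):
    """Heaviside spike with a rectangular surrogate gradient."""
    @staticmethod
    def forward(ctx, v):
        ctx.save_for_backward(v)
        return (v >= 0.0).float()

    @staticmethod
    def backward(ctx, grad_out):
        (v,) = ctx.saved_tensors
        # Pass gradient only near the firing threshold (window is an assumption).
        return grad_out * (v.abs() < 0.5).float()

spike = SurrogateSpike.apply

class SrLIF(nn.Module):
    """LIF layer with diagonal self-recurrence and a trainable intrinsic leak."""
    def __init__(self, in_dim, out_dim):
        super().__init__()
        self.ff = nn.Linear(in_dim, out_dim)                   # feedforward synapses
        self.sr = nn.Parameter(torch.zeros(out_dim))           # one self-loop per neuron
        self.leak = nn.Parameter(torch.full((out_dim,), 0.9))  # intrinsic parameter

    def forward(self, x, v, s, extra=0.0):
        # Each neuron receives only its own previous spike, so the recurrence
        # is diagonal and the network remains mostly feedforward.
        v = self.leak * v + self.ff(x) + self.sr * s + extra
        s = spike(v - 1.0)          # firing threshold assumed to be 1.0
        return v * (1.0 - s), s     # hard reset on spike (assumed)

class ScSrNet(nn.Module):
    def __init__(self, n_in=64, n_hid=128, n_out=10):
        super().__init__()
        self.l1 = SrLIF(n_in, n_hid)
        self.l2 = SrLIF(n_hid, n_hid)
        self.skip = nn.Linear(n_in, n_hid)   # skip connection past layer 1
        self.out = nn.Linear(n_hid, n_out)
        self.n_hid = n_hid

    def forward(self, x_seq):                # x_seq: (T, batch, n_in)
        T, B, _ = x_seq.shape
        v1 = s1 = x_seq.new_zeros(B, self.n_hid)
        v2 = s2 = x_seq.new_zeros(B, self.n_hid)
        logits = x_seq.new_zeros(B, self.out.out_features)
        for t in range(T):
            v1, s1 = self.l1(x_seq[t], v1, s1)
            # The skip connection feeds the input directly into layer 2.
            v2, s2 = self.l2(s1, v2, s2, extra=self.skip(x_seq[t]))
            logits = logits + self.out(s2)
        return logits / T                    # rate-decoded readout (assumed)

# BIP-style joint training: one backward pass produces gradients for both the
# synaptic weights and the intrinsic leak parameters.
net = ScSrNet()
opt = torch.optim.Adam(net.parameters(), lr=1e-3)
x = (torch.rand(50, 8, 64) < 0.3).float()   # toy Poisson-like spike trains
y = torch.randint(0, 10, (8,))
loss = nn.functional.cross_entropy(net(x), y)
opt.zero_grad()
loss.backward()
opt.step()
```

Because each neuron's recurrence touches only its own previous spike, the connectivity stays mostly feedforward, which is what keeps the error gradients straightforward to compute.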
Related papers
- A Hybrid Neural Coding Approach for Pattern Recognition with Spiking
Neural Networks [53.31941519245432]
Brain-inspired spiking neural networks (SNNs) have demonstrated promising capabilities in solving pattern recognition tasks.
These SNNs are generally built on homogeneous neurons that use a uniform neural coding for information representation.
In this study, we argue that SNN architectures should be holistically designed to incorporate heterogeneous coding schemes.
arXiv Detail & Related papers (2023-05-26T02:52:12Z)
- Intelligence Processing Units Accelerate Neuromorphic Learning [52.952192990802345]
Spiking neural networks (SNNs) have achieved orders-of-magnitude improvements in energy consumption and latency.
We present an IPU-optimized release of our custom SNN Python package, snnTorch.
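For orientation, here is a minimal usage sketch following snnTorch's documented Leaky-neuron API (exact signatures may vary across versions; this is illustrative, not taken from the paper):

```python
import torch
import snntorch as snn

# A layer of leaky integrate-and-fire neurons; beta is the membrane decay rate.
lif = snn.Leaky(beta=0.9)
mem = lif.init_leaky()              # initialize the membrane state

# Drive the neurons with random input current for a few time steps.
for _ in range(5):
    spk, mem = lif(torch.rand(1, 10), mem)
```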
arXiv Detail & Related papers (2022-11-19T15:44:08Z)
- Multi-Level Firing with Spiking DS-ResNet: Enabling Better and Deeper
Directly-Trained Spiking Neural Networks [19.490903216456758]
Spiking neural networks (SNNs) are neural networks with asynchronous discrete and sparse characteristics.
We propose a multi-level firing (MLF) method based on the existing spiking-suppressed residual network (spiking DS-ResNet).
arXiv Detail & Related papers (2022-10-12T16:39:46Z)
- Training High-Performance Low-Latency Spiking Neural Networks by
Differentiation on Spike Representation [70.75043144299168]
The Spiking Neural Network (SNN) is a promising, energy-efficient AI model when implemented on neuromorphic hardware.
It is a challenge to efficiently train SNNs due to their non-differentiability.
We propose the Differentiation on Spike Representation (DSR) method, which could achieve high performance.
arXiv Detail & Related papers (2022-05-01T12:44:49Z)
- Training Feedback Spiking Neural Networks by Implicit Differentiation on
the Equilibrium State [66.2457134675891]
Spiking neural networks (SNNs) are brain-inspired models that enable energy-efficient implementation on neuromorphic hardware.
Most existing methods imitate the backpropagation framework and feedforward architectures for artificial neural networks.
We propose a novel training method that does not rely on the exact reverse of the forward computation.
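The general flavor of equilibrium-based training can be sketched as follows; this toy example uses a one-step gradient through the fixed point rather than the paper's exact implicit-differentiation scheme, and all names and sizes are illustrative:

```python
import torch
import torch.nn as nn

class EquilibriumLayer(nn.Module):
    """Toy layer that relaxes to a fixed point z* = tanh(W z* + U x)."""
    def __init__(self, dim):
        super().__init__()
        self.W = nn.Linear(dim, dim, bias=False)
        self.U = nn.Linear(dim, dim)

    def forward(self, x, iters=30):
        z = torch.zeros_like(x)
        with torch.no_grad():           # forward relaxation is not recorded,
            for _ in range(iters):      # so no exact reverse pass is needed
                z = torch.tanh(self.W(z) + self.U(x))
        # One differentiable step at the equilibrium supplies the gradient.
        return torch.tanh(self.W(z) + self.U(x))
```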
arXiv Detail & Related papers (2021-09-29T07:46:54Z)
- Self-Supervised Learning of Event-Based Optical Flow with Spiking Neural
Networks [3.7384509727711923]
A major challenge for neuromorphic computing is that learning algorithms for traditional artificial neural networks (ANNs) do not transfer directly to spiking neural networks (SNNs).
In this article, we focus on the self-supervised learning problem of optical flow estimation from event-based camera inputs.
We show that the performance of the proposed ANNs and SNNs is on par with that of the current state-of-the-art ANNs trained in a self-supervised manner.
arXiv Detail & Related papers (2021-06-03T14:03:41Z)
- BackEISNN: A Deep Spiking Neural Network with Adaptive Self-Feedback and
Balanced Excitatory-Inhibitory Neurons [8.956708722109415]
Spiking neural networks (SNNs) transmit information through discrete spikes, which makes them effective at processing spatio-temporal information.
We propose a deep spiking neural network with adaptive self-feedback and balanced excitatory and inhibitory neurons (BackEISNN).
For the MNIST, FashionMNIST, and N-MNIST datasets, our model has achieved state-of-the-art performance.
arXiv Detail & Related papers (2021-05-27T08:38:31Z)
- Optimal Conversion of Conventional Artificial Neural Networks to Spiking
Neural Networks [0.0]
Spiking neural networks (SNNs) are biology-inspired artificial neural networks (ANNs).
We propose a novel strategic pipeline that transfers the weights to the target SNN by combining threshold balance and soft-reset mechanisms.
Our method is promising for deployment on embedded platforms with limited energy and memory, offering better support for SNNs.
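The soft-reset mechanism mentioned above is a standard trick in ANN-to-SNN conversion; a minimal sketch of the idea follows (the threshold value and function name are illustrative, not the paper's code):

```python
import torch

def if_step(v, current, threshold=1.0, soft_reset=True):
    """One step of an integrate-and-fire neuron used in ANN-to-SNN conversion.

    With a soft reset, a firing neuron keeps its residual charge
    (v - threshold) instead of being cleared to zero, so less information
    is lost per spike and the rate code tracks the ANN activation better.
    """
    v = v + current
    spikes = (v >= threshold).float()
    v = v - spikes * threshold if soft_reset else v * (1.0 - spikes)
    return v, spikes
```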
arXiv Detail & Related papers (2021-02-28T12:04:22Z)
- Modeling from Features: a Mean-field Framework for Over-parameterized
Deep Neural Networks [54.27962244835622]
This paper proposes a new mean-field framework for over-parameterized deep neural networks (DNNs).
In this framework, a DNN is represented by probability measures and functions over its features in the continuous limit.
We illustrate the framework via the standard DNN and the Residual Network (Res-Net) architectures.
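As a point of reference, the classical two-layer mean-field picture (which this feature-based framework generalizes to deep networks) replaces the average over N neurons with an integral over a parameter distribution:

```latex
% Classical two-layer mean-field limit (reference point, not the paper's
% deep-network construction):
f_N(x) = \frac{1}{N}\sum_{i=1}^{N} a_i \,\sigma\!\left(w_i^{\top} x\right)
\;\xrightarrow[\;N\to\infty\;]{}\;
f(x) = \int a\,\sigma\!\left(w^{\top} x\right) \mathrm{d}\rho(a, w)
```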
arXiv Detail & Related papers (2020-07-03T01:37:16Z)
- Progressive Tandem Learning for Pattern Recognition with Deep Spiking
Neural Networks [80.15411508088522]
Spiking neural networks (SNNs) have shown advantages over traditional artificial neural networks (ANNs) for low latency and high computational efficiency.
We propose a novel ANN-to-SNN conversion and layer-wise learning framework for rapid and efficient pattern recognition.
arXiv Detail & Related papers (2020-07-02T15:38:44Z)