Desire Backpropagation: A Lightweight Training Algorithm for Multi-Layer
Spiking Neural Networks based on Spike-Timing-Dependent Plasticity
- URL: http://arxiv.org/abs/2211.05412v2
- Date: Tue, 17 Oct 2023 04:46:17 GMT
- Title: Desire Backpropagation: A Lightweight Training Algorithm for Multi-Layer
Spiking Neural Networks based on Spike-Timing-Dependent Plasticity
- Authors: Daniel Gerlinghoff, Tao Luo, Rick Siow Mong Goh, Weng-Fai Wong
- Abstract summary: Spiking neural networks (SNNs) are a viable alternative to conventional artificial neural networks.
We present desire backpropagation, a method to derive the desired spike activity of all neurons, including the hidden ones.
We trained three-layer networks to classify MNIST and Fashion-MNIST images and reached an accuracy of 98.41% and 87.56%, respectively.
- Score: 13.384228628766236
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Spiking neural networks (SNNs) are a viable alternative to conventional
artificial neural networks when resource efficiency and computational
complexity are of importance. A major advantage of SNNs is their binary
information transfer through spike trains which eliminates multiplication
operations. The training of SNNs has, however, been a challenge, since neuron
models are non-differentiable and traditional gradient-based backpropagation
algorithms cannot be applied directly. Furthermore, spike-timing-dependent
plasticity (STDP), albeit being a spike-based learning rule, updates weights
locally and does not optimize for the output error of the network. We present
desire backpropagation, a method to derive the desired spike activity of all
neurons, including the hidden ones, from the output error. By incorporating
this desire value into the local STDP weight update, we can efficiently capture
the neuron dynamics while minimizing the global error and attaining a high
classification accuracy. That makes desire backpropagation a spike-based
supervised learning rule. We trained three-layer networks to classify MNIST and
Fashion-MNIST images and reached an accuracy of 98.41% and 87.56%,
respectively. In addition, by eliminating a multiplication during the backward
pass, we reduce computational complexity and balance arithmetic resources
between forward and backward pass, making desire backpropagation a candidate
for training on low-resource devices.
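The mechanism described in the abstract can be illustrated with a minimal sketch. Everything below is an illustrative reconstruction from the abstract alone, not the authors' exact rule: the LIF dynamics, the rate-based desire signal, the sign-based back-projection, and all names (`lif_forward`, `desire_out`, `desire_hid`, thresholds, learning rate) are assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy dimensions: input -> hidden -> output (a three-layer network, as in the paper).
n_in, n_hid, n_out, T = 16, 8, 4, 20
w1 = rng.normal(0.0, 0.5, (n_hid, n_in))
w2 = rng.normal(0.0, 0.5, (n_out, n_hid))

def lif_forward(w, spikes_in, thresh=1.0):
    """Integrate-and-fire layer over T time steps; returns binary spike trains."""
    v = np.zeros(w.shape[0])
    out = np.zeros((T, w.shape[0]))
    for t in range(T):
        v += w @ spikes_in[t]       # binary inputs: accumulation, no multiplies
        fired = v >= thresh
        out[t] = fired
        v[fired] = 0.0              # reset membrane potential after a spike
    return out

# Random input spike train and a one-hot target class.
x = (rng.random((T, n_in)) < 0.3).astype(float)
target = np.zeros(n_out)
target[1] = 1.0

h = lif_forward(w1, x)
y = lif_forward(w2, h)

# Output desire: should each output neuron spike more (+1) or less (-1)?
desire_out = np.sign(target - y.mean(axis=0))

# Hidden desire: back-project output desires through the weights. Taking only
# the sign avoids one multiplication in the backward pass, in the spirit of
# the complexity reduction the abstract mentions.
desire_hid = np.sign(w2.T @ desire_out)

# Desire-modulated, STDP-like local update: pre/post activity scaled by desire.
lr = 0.01
w2 += lr * np.outer(desire_out, h.mean(axis=0))
w1 += lr * np.outer(desire_hid, x.mean(axis=0))
```

The key idea retained here is that the desire value turns a purely local STDP-style update into a supervised one: the local pre/post correlation term is kept, but its sign and application are gated by an error-derived target for every neuron, including hidden ones.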
Related papers
- Stepwise Weighted Spike Coding for Deep Spiking Neural Networks [7.524721345903027]
Spiking Neural Networks (SNNs) seek to mimic the spiking behavior of biological neurons.
We propose a novel Stepwise Weighted Spike (SWS) coding scheme to enhance the encoding of information in spikes.
This approach compresses the spikes by weighting the significance of the spike in each step of neural computation, achieving high performance and low energy consumption.
arXiv Detail & Related papers (2024-08-30T12:39:25Z)
- SPIDE: A Purely Spike-based Method for Training Feedback Spiking Neural Networks [56.35403810762512]
Spiking neural networks (SNNs) with event-based computation are promising brain-inspired models for energy-efficient applications on neuromorphic hardware.
We study spike-based implicit differentiation on the equilibrium state (SPIDE) that extends the recently proposed training method.
arXiv Detail & Related papers (2023-02-01T04:22:59Z)
- Intelligence Processing Units Accelerate Neuromorphic Learning [52.952192990802345]
Spiking neural networks (SNNs) have achieved orders of magnitude improvement in terms of energy consumption and latency.
We present an IPU-optimized release of our custom SNN Python package, snnTorch.
arXiv Detail & Related papers (2022-11-19T15:44:08Z)
- A temporally and spatially local spike-based backpropagation algorithm to enable training in hardware [0.0]
Spiking Neural Networks (SNNs) have emerged as a hardware efficient architecture for classification tasks.
There have been several attempts to adopt the powerful backpropagation (BP) technique used in non-spiking artificial neural networks (ANNs).
arXiv Detail & Related papers (2022-07-20T08:57:53Z)
- Training High-Performance Low-Latency Spiking Neural Networks by Differentiation on Spike Representation [70.75043144299168]
Spiking Neural Network (SNN) is a promising energy-efficient AI model when implemented on neuromorphic hardware.
It is a challenge to efficiently train SNNs due to their non-differentiability.
We propose the Differentiation on Spike Representation (DSR) method, which could achieve high performance.
arXiv Detail & Related papers (2022-05-01T12:44:49Z)
- Training Feedback Spiking Neural Networks by Implicit Differentiation on the Equilibrium State [66.2457134675891]
Spiking neural networks (SNNs) are brain-inspired models that enable energy-efficient implementation on neuromorphic hardware.
Most existing methods imitate the backpropagation framework and feedforward architectures for artificial neural networks.
We propose a novel training method that does not rely on the exact reverse of the forward computation.
arXiv Detail & Related papers (2021-09-29T07:46:54Z)
- Selfish Sparse RNN Training [13.165729746380816]
We propose an approach to train sparse RNNs with a fixed parameter count in one single run, without compromising performance.
We achieve state-of-the-art sparse training results with various datasets on Penn TreeBank and Wikitext-2.
arXiv Detail & Related papers (2021-01-22T10:45:40Z)
- Skip-Connected Self-Recurrent Spiking Neural Networks with Joint Intrinsic Parameter and Synaptic Weight Training [14.992756670960008]
We propose a new type of RSNN called Skip-Connected Self-Recurrent SNNs (ScSr-SNNs).
ScSr-SNNs can boost performance by up to 2.55% compared with other types of RSNNs trained by state-of-the-art BP methods.
arXiv Detail & Related papers (2020-10-23T22:27:13Z)
- Progressive Tandem Learning for Pattern Recognition with Deep Spiking Neural Networks [80.15411508088522]
Spiking neural networks (SNNs) have shown advantages over traditional artificial neural networks (ANNs) for low latency and high computational efficiency.
We propose a novel ANN-to-SNN conversion and layer-wise learning framework for rapid and efficient pattern recognition.
arXiv Detail & Related papers (2020-07-02T15:38:44Z)
- Rectified Linear Postsynaptic Potential Function for Backpropagation in Deep Spiking Neural Networks [55.0627904986664]
Spiking Neural Networks (SNNs) use temporal spike patterns to represent and transmit information, which is not only biologically realistic but also suitable for ultra-low-power event-driven neuromorphic implementation.
This paper investigates the contribution of spike timing dynamics to information encoding, synaptic plasticity and decision making, providing a new perspective on the design of future deep SNNs and neuromorphic hardware systems.
arXiv Detail & Related papers (2020-03-26T11:13:07Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences arising from its use.