Training High-Performance Low-Latency Spiking Neural Networks by
Differentiation on Spike Representation
- URL: http://arxiv.org/abs/2205.00459v2
- Date: Thu, 30 Mar 2023 07:12:36 GMT
- Title: Training High-Performance Low-Latency Spiking Neural Networks by
Differentiation on Spike Representation
- Authors: Qingyan Meng, Mingqing Xiao, Shen Yan, Yisen Wang, Zhouchen Lin,
Zhi-Quan Luo
- Abstract summary: Spiking Neural Networks (SNNs) are promising energy-efficient AI models when implemented on neuromorphic hardware.
However, SNNs are hard to train efficiently because their spiking dynamics are non-differentiable.
We propose the Differentiation on Spike Representation (DSR) method, which achieves performance competitive with ANNs at low latency.
- Score: 70.75043144299168
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Spiking Neural Networks (SNNs) are promising energy-efficient AI models
when implemented on neuromorphic hardware. However, training SNNs efficiently is
challenging due to their non-differentiability. Most existing methods either suffer
from high latency (i.e., long simulation time steps) or cannot achieve performance
as high as that of Artificial Neural Networks (ANNs). In this paper, we propose the
Differentiation on Spike Representation (DSR) method, which achieves performance
competitive with ANNs at low latency. First, we encode the spike trains into a
spike representation using (weighted) firing rate coding. Based on this spike
representation, we systematically derive that the spiking dynamics of common
neural models can be represented as a sub-differentiable mapping. With this
viewpoint, the proposed DSR method trains SNNs through gradients of the mapping
and avoids the non-differentiability problem common in SNN training. We then
analyze the error incurred when the forward computation of the SNN represents
this mapping. To reduce this error, we propose to train the spike threshold in
each layer and to introduce a new hyperparameter for the neural models. With
these components, the DSR method achieves state-of-the-art SNN performance with
low latency on both static and neuromorphic datasets, including CIFAR-10,
CIFAR-100, ImageNet, and DVS-CIFAR10.
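As a concrete illustration of the mechanics described above, the sketch below simulates integrate-and-fire dynamics in the forward pass while backpropagating through a clamped-linear surrogate of the scaled firing-rate mapping, with a trainable per-layer threshold. It is a minimal PyTorch reconstruction under simplifying assumptions (a constant input current per step, an arbitrary time window T, illustrative names such as RateMapping and DSRLayer), not the authors' released implementation.

```python
import torch
import torch.nn as nn

class RateMapping(torch.autograd.Function):
    """Forward: run IF-neuron dynamics for T steps and return the scaled firing
    rate (the spike representation). Backward: differentiate the surrogate
    mapping clamp(x, 0, v_th), so gradients never touch the discrete spikes."""

    @staticmethod
    def forward(ctx, x, v_th, T):
        ctx.save_for_backward(x, v_th)
        v = torch.zeros_like(x)            # membrane potential
        count = torch.zeros_like(x)        # accumulated spike count
        for _ in range(T):
            v = v + x                      # constant input current per step (simplification)
            s = (v >= v_th).float()        # emit a spike when the threshold is reached
            v = v - s * v_th               # reset by subtracting the threshold
            count = count + s
        return count / T * v_th            # scaled firing-rate representation

    @staticmethod
    def backward(ctx, grad_out):
        x, v_th = ctx.saved_tensors
        in_range = ((x >= 0) & (x <= v_th)).float()
        grad_x = grad_out * in_range                      # d clamp(x, 0, v_th) / dx
        grad_vth = (grad_out * (x > v_th).float()).sum()  # d clamp(x, 0, v_th) / d v_th
        return grad_x, grad_vth, None

class DSRLayer(nn.Module):
    """One spiking layer trained on its spike representation, with a learnable
    per-layer threshold as suggested in the abstract."""
    def __init__(self, in_features, out_features, T=10):
        super().__init__()
        self.fc = nn.Linear(in_features, out_features)
        self.v_th = nn.Parameter(torch.tensor(1.0))
        self.T = T

    def forward(self, rate_in):
        return RateMapping.apply(self.fc(rate_in), self.v_th, self.T)
```

Stacking such layers and applying a standard loss to the final rates trains the network with ordinary optimizers; spikes appear only in the forward simulation.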
Related papers
- Scaling Spike-driven Transformer with Efficient Spike Firing Approximation Training [17.193023656793464]
The ambition of brain-inspired Spiking Neural Networks (SNNs) is to become a low-power alternative to traditional Artificial Neural Networks (ANNs).
This work addresses two major challenges in realizing this vision: the performance gap between SNNs and ANNs, and the high training costs of SNNs.
We identify intrinsic flaws in spiking neurons caused by their binary firing mechanism and propose a Spike Firing Approximation (SFA) method using integer training and spike-driven inference (a generic sketch follows this entry).
arXiv Detail & Related papers (2024-11-25T03:05:41Z)
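A generic way to realize the pairing of integer training with spike-driven inference, assumed here for illustration rather than taken from the paper, is to quantize activations to small integers with a straight-through estimator during training and to expand each integer into that many unit spikes at inference; the function names and the quantization level d are placeholders.

```python
import torch

def integer_activation(x, d=4):
    """Training-time activation: clip to [0, d] and round to an integer, with a
    straight-through estimator so gradients flow through the rounding."""
    y = torch.clamp(x, 0, d)
    return y + (torch.round(y) - y).detach()   # forward: round(y); backward: gradient of the clamp

def to_spike_train(k, d=4):
    """Inference-time expansion: an integer count k in [0, d] becomes k binary
    spikes spread over d time steps, so inference stays spike-driven."""
    steps = torch.arange(d, device=k.device).view(-1, *([1] * k.dim()))
    return (steps < k.unsqueeze(0)).float()    # shape: [d, *k.shape]
```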
- High-performance deep spiking neural networks with 0.3 spikes per neuron [9.01407445068455]
It is harder to train biologically-inspired spiking neural networks (SNNs) than artificial neural networks (ANNs).
We show that deep SNN models can be trained to achieve the same performance as ANNs.
Our SNN accomplishes high-performance classification with less than 0.3 spikes per neuron, lending itself to an energy-efficient implementation.
arXiv Detail & Related papers (2023-06-14T21:01:35Z)
- A Hybrid Neural Coding Approach for Pattern Recognition with Spiking Neural Networks [53.31941519245432]
Brain-inspired spiking neural networks (SNNs) have demonstrated promising capabilities in solving pattern recognition tasks.
These SNNs are typically built on homogeneous neurons that use a single, uniform neural coding for information representation.
In this study, we argue that SNN architectures should be holistically designed to incorporate heterogeneous coding schemes.
arXiv Detail & Related papers (2023-05-26T02:52:12Z)
- SPIDE: A Purely Spike-based Method for Training Feedback Spiking Neural Networks [56.35403810762512]
Spiking neural networks (SNNs) with event-based computation are promising brain-inspired models for energy-efficient applications on neuromorphic hardware.
We study spike-based implicit differentiation on the equilibrium state (SPIDE), which extends the recently proposed implicit differentiation on the equilibrium state (IDE) training method to purely spike-based computation.
arXiv Detail & Related papers (2023-02-01T04:22:59Z)
- Real Spike: Learning Real-valued Spikes for Spiking Neural Networks [11.580346172925323]
Brain-inspired spiking neural networks (SNNs) have recently attracted increasing attention due to their event-driven and energy-efficient characteristics.
In this paper, we argue that SNNs may not benefit from the weight-sharing mechanism, which effectively reduces parameters and improves inference efficiency in ANNs.
Motivated by this assumption, a training-inference decoupling method for SNNs, named Real Spike, is proposed.
arXiv Detail & Related papers (2022-10-13T02:45:50Z)
- Training Feedback Spiking Neural Networks by Implicit Differentiation on the Equilibrium State [66.2457134675891]
Spiking neural networks (SNNs) are brain-inspired models that enable energy-efficient implementation on neuromorphic hardware.
Most existing methods imitate the backpropagation framework and feedforward architectures of artificial neural networks.
We propose a novel training method that does not rely on the exact reverse of the forward computation (a generic equilibrium-based sketch follows this entry).
arXiv Detail & Related papers (2021-09-29T07:46:54Z)
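The two equilibrium-based entries above (IDE and SPIDE) avoid backpropagation through time by differentiating implicitly at the network's steady state. The sketch below shows only the generic pattern under simplifying assumptions: a toy feedback block with a smooth surrogate equilibrium map and a Neumann-series backward solve, illustrating the gradient mechanics rather than either paper's exact, spike-based algorithm.

```python
import torch
import torch.nn as nn
from torch import autograd

class EquilibriumBlock(nn.Module):
    """Toy feedback block trained by implicit differentiation at its
    equilibrium (a generic deep-equilibrium-style sketch)."""

    def __init__(self, dim, fwd_iters=50, bwd_iters=30):
        super().__init__()
        self.W = nn.Linear(dim, dim, bias=False)   # feedback (recurrent) weights
        self.U = nn.Linear(dim, dim)               # input weights
        self.fwd_iters, self.bwd_iters = fwd_iters, bwd_iters

    def f(self, a, x):
        # Smooth surrogate for the averaged spiking state at equilibrium.
        return torch.sigmoid(self.W(a) + self.U(x))

    def forward(self, x):
        # Forward: iterate to the fixed point a* = f(a*, x) without autograd,
        # so no trajectory is stored (unlike backpropagation through time).
        with torch.no_grad():
            a = torch.zeros(x.shape[0], self.U.out_features, device=x.device)
            for _ in range(self.fwd_iters):
                a = self.f(a, x)
        # One differentiable application of f at the fixed point links the loss
        # to the parameters; the hook corrects its incoming gradient implicitly.
        a_star = self.f(a, x)
        a0 = a_star.detach().requires_grad_(True)
        f0 = self.f(a0, x.detach())                # separate graph for J^T products

        def implicit_grad(grad):
            # Neumann-series solve of g = (df/da)^T g + grad, i.e. g ~ (I - J^T)^{-1} grad.
            g = grad
            for _ in range(self.bwd_iters):
                g = autograd.grad(f0, a0, g, retain_graph=True)[0] + grad
            return g

        if a_star.requires_grad:
            a_star.register_hook(implicit_grad)
        return a_star
```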
- Training Energy-Efficient Deep Spiking Neural Networks with Single-Spike Hybrid Input Encoding [5.725845886457027]
Spiking Neural Networks (SNNs) provide higher computational efficiency on event-driven neuromorphic hardware.
However, SNNs suffer from high inference latency resulting from inefficient input encoding and training techniques.
This paper presents a training framework for low-latency energy-efficient SNNs.
arXiv Detail & Related papers (2021-07-26T06:16:40Z)
- Progressive Tandem Learning for Pattern Recognition with Deep Spiking Neural Networks [80.15411508088522]
Spiking neural networks (SNNs) have shown advantages over traditional artificial neural networks (ANNs) in terms of low latency and high computational efficiency.
We propose a novel ANN-to-SNN conversion and layer-wise learning framework for rapid and efficient pattern recognition (a generic conversion sketch follows this entry).
arXiv Detail & Related papers (2020-07-02T15:38:44Z)
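Conversion-based frameworks like the one above typically start from a rate-based ANN-to-SNN conversion step. The snippet below sketches only that generic step (threshold balancing on a calibration batch, followed by an integrate-and-fire readout); it is standard practice rather than the paper's specific layer-wise learning procedure, and the layer list, calibration batch, and time window T are placeholders.

```python
import torch
import torch.nn as nn

@torch.no_grad()
def balance_thresholds(linear_layers, calib_batch):
    """Generic rate-based ANN-to-SNN conversion: reuse the ANN weights and set
    each IF layer's firing threshold to the peak pre-activation observed on a
    calibration batch, so firing rates approximate the ANN's ReLU outputs."""
    thresholds, x = [], calib_batch
    for layer in linear_layers:            # e.g., a list of nn.Linear modules
        pre = layer(x)
        thresholds.append(pre.max().item())
        x = torch.relu(pre)                # the ANN forward pass continues with ReLU
    return thresholds

def if_layer_rate(layer, rate_in, v_th, T=64):
    """Run one converted integrate-and-fire layer for T steps on a rate-coded
    input and return its output firing rate."""
    current = layer(rate_in)               # constant input current per step
    v = torch.zeros_like(current)
    count = torch.zeros_like(current)
    for _ in range(T):
        v = v + current
        s = (v >= v_th).float()
        v = v - s * v_th                   # reset by subtraction
        count = count + s
    return count / T
```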
- You Only Spike Once: Improving Energy-Efficient Neuromorphic Inference to ANN-Level Accuracy [51.861168222799186]
Spiking Neural Networks (SNNs) are a type of neuromorphic, or brain-inspired, network.
SNNs are sparse, accessing very few weights, and typically only use addition operations instead of the more power-intensive multiply-and-accumulate operations.
In this work, we aim to overcome the limitations of time-to-first-spike (TTFS) encoded neuromorphic systems (a TTFS encoding sketch follows this entry).
arXiv Detail & Related papers (2020-06-03T15:55:53Z)
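Since the entry above targets TTFS-encoded systems, here is a minimal sketch of the encoding itself: each input value emits at most one spike, and stronger inputs fire earlier. The time window T and the linear intensity-to-latency mapping are illustrative choices, not taken from the paper.

```python
import torch

def ttfs_encode(x, T=32):
    """Time-to-first-spike encoding: returns a [T, *x.shape] binary spike train
    in which each input value spikes at most once, earlier for larger values."""
    x = x.clamp(0.0, 1.0)
    t_fire = ((1.0 - x) * (T - 1)).round().long()    # intensity 1.0 -> step 0
    steps = torch.arange(T, device=x.device).view(-1, *([1] * x.dim()))
    spikes = (steps == t_fire.unsqueeze(0)).float()
    return spikes * (x > 0).float().unsqueeze(0)     # zero inputs never spike
```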
- Enabling Deep Spiking Neural Networks with Hybrid Conversion and Spike Timing Dependent Backpropagation [10.972663738092063]
Spiking Neural Networks (SNNs) operate with asynchronous discrete events (or spikes).
We present a computationally-efficient training technique for deep SNNs.
We achieve a top-1 accuracy of 65.19% on the ImageNet dataset with an SNN using 250 time steps, which is 10X faster than converted SNNs of similar accuracy.
arXiv Detail & Related papers (2020-05-04T19:30:43Z)
This list is automatically generated from the titles and abstracts of the papers on this site.