Efficient and Accurate Conversion of Spiking Neural Network with Burst
Spikes
- URL: http://arxiv.org/abs/2204.13271v1
- Date: Thu, 28 Apr 2022 03:48:17 GMT
- Title: Efficient and Accurate Conversion of Spiking Neural Network with Burst
Spikes
- Authors: Yang Li, Yi Zeng
- Abstract summary: Spiking neural network (SNN), as a brain-inspired, energy-efficient neural network, has attracted the interest of researchers.
One effective way is to map the weights of a trained ANN to an SNN to achieve high reasoning ability.
The converted spiking neural network often suffers from performance degradation and considerable time delay.
We propose a neuron model that releases burst spikes, a cheap but highly efficient way to handle residual information.
- Score: 9.210531698373256
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Spiking neural network (SNN), as a brain-inspired energy-efficient neural
network, has attracted the interest of researchers. Yet the training of
spiking neural networks is still an open problem. One effective way is to map
the weights of a trained ANN to an SNN to achieve high reasoning ability.
However, the converted spiking neural network often suffers from performance
degradation and considerable time delay. To speed up the inference process and obtain
higher accuracy, we theoretically analyze the errors in the conversion process
from three perspectives: the differences between IF and ReLU, the time
dimension, and the pooling operation. We propose a neuron model that releases
burst spikes, a cheap but highly efficient way to handle residual information.
In addition,
Lateral Inhibition Pooling (LIPooling) is proposed to solve the inaccuracy
problem caused by MaxPooling in the conversion process. Experimental results on
CIFAR and ImageNet demonstrate that our algorithm is efficient and accurate.
For example, our method can ensure nearly lossless conversion of an SNN while
using only about 1/10 of the simulation time (fewer than 100 time steps) at
0.693$\times$ the energy consumption of the typical method. Our code is available at
https://github.com/Brain-Inspired-Cognitive-Engine/Conversion_Burst.
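The burst mechanism and LIPooling are only described in words above, so here is a minimal, hypothetical Python sketch of one plausible reading: an IF neuron that may fire up to `max_burst` spikes per step so that supra-threshold residual potential is transmitted immediately, plus a winner-take-all pooling helper standing in for the lateral-inhibition idea. The names (`BurstIFNeuron`, `li_pool`, `max_burst`) are illustrative assumptions, not the authors' implementation (see their repository above for that).

```python
import numpy as np

class BurstIFNeuron:
    """Hypothetical integrate-and-fire neuron that may emit up to
    `max_burst` spikes per simulation step, so membrane potential far
    above threshold is transmitted immediately instead of lingering
    as residual information across many steps."""

    def __init__(self, threshold: float = 1.0, max_burst: int = 5):
        self.threshold = threshold
        self.max_burst = max_burst
        self.v = 0.0  # membrane potential

    def step(self, input_current: float) -> int:
        self.v += input_current
        # Fire once per full threshold crossed, capped at the burst limit.
        n_spikes = min(max(int(self.v // self.threshold), 0), self.max_burst)
        # Soft reset: subtract only the charge carried by the emitted
        # spikes, keeping the sub-threshold remainder for later steps.
        self.v -= n_spikes * self.threshold
        return n_spikes

def li_pool(rates: np.ndarray, k: int = 2) -> np.ndarray:
    """Winner-take-all pooling over estimated firing rates: in each
    k x k window the most active unit suppresses the others, so the
    pooled output approximates MaxPooling on rates rather than on
    per-step spikes (a crude stand-in for lateral inhibition)."""
    h, w = rates.shape
    out = np.zeros((h // k, w // k))
    for i in range(0, h - h % k, k):
        for j in range(0, w - w % k, k):
            out[i // k, j // k] = rates[i:i + k, j:j + k].max()
    return out

# A strongly driven neuron transmits its charge in one step instead of
# spreading it over several steps with single spikes.
neuron = BurstIFNeuron(threshold=1.0, max_burst=5)
print(neuron.step(4.3))  # -> 4 spikes, 0.3 residual potential retained
```

With a plain IF neuron capped at one spike per step, the example input of 4.3 would need four time steps to convey the same rate; that latency is exactly what the burst mechanism targets.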
Related papers
- Obtaining Optimal Spiking Neural Network in Sequence Learning via CRNN-SNN Conversion [12.893883491781697]
Spiking neural networks (SNNs) are a promising alternative to conventional artificial neural networks (ANNs).
We design two sub-pipelines to support the end-to-end conversion of different structures in neural networks.
We show the effectiveness of our method over short and long timescales compared with the state-of-the-art learning- and conversion-based methods.
arXiv Detail & Related papers (2024-08-18T08:23:51Z)
- Temporal Spiking Neural Networks with Synaptic Delay for Graph Reasoning [91.29876772547348]
Spiking neural networks (SNNs) are investigated as biologically inspired models of neural computation.
This paper reveals that SNNs, when amalgamated with synaptic delay and temporal coding, are proficient in executing (knowledge) graph reasoning.
arXiv Detail & Related papers (2024-05-27T05:53:30Z)
- FTBC: Forward Temporal Bias Correction for Optimizing ANN-SNN Conversion [16.9748086865693]
Spiking Neural Networks (SNNs) offer a promising avenue for energy-efficient computing compared with Artificial Neural Networks (ANNs).
In this work, we introduce a lightweight Forward Temporal Bias Correction (FTBC) technique, aimed at enhancing conversion accuracy without additional computational overhead.
We further propose an algorithm for finding the temporal bias only in the forward pass, thus eliminating the computational burden of backpropagation (a hedged sketch follows the entry).
arXiv Detail & Related papers (2024-03-27T09:25:20Z)
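Taking "temporal bias found only in the forward pass" at face value, one plausible forward-only calibration loop is sketched below; the function name `calibrate_temporal_bias`, the shapes, and the idea of matching layer-wise ANN activations on a calibration batch are assumptions, not the paper's actual procedure.

```python
import numpy as np

def calibrate_temporal_bias(ann_act, snn_inputs, T=32):
    """Hypothetical forward-only calibration: at each time step, add a
    bias nudging the layer's accumulated input toward the fraction of
    the target ANN activation it should have reproduced by now.
    No backward pass or stored gradients are involved.

    ann_act    : target ANN activations on a calibration batch, (N, D)
    snn_inputs : per-step inputs reaching this layer, (T, N, D)
    returns    : per-step bias, (T, D)"""
    bias = np.zeros((T, ann_act.shape[1]))
    accumulated = np.zeros_like(ann_act)
    for t in range(T):
        accumulated += snn_inputs[t]
        # Gap between the rate the ANN implies for t+1 steps and what
        # the SNN has actually accumulated (a forward statistic only).
        gap = (t + 1) / T * ann_act - accumulated
        bias[t] = gap.mean(axis=0)
        accumulated += bias[t]  # apply the correction immediately
    return bias
```

Because the bias is computed from running forward statistics alone, nothing here requires storing activations for a backward pass, which is the stated point of the technique.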
- Deep Multi-Threshold Spiking-UNet for Image Processing [51.88730892920031]
This paper introduces the novel concept of Spiking-UNet for image processing, which combines the power of Spiking Neural Networks (SNNs) with the U-Net architecture.
To achieve an efficient Spiking-UNet, we face two primary challenges: ensuring high-fidelity information propagation through the network via spikes and formulating an effective training strategy.
Experimental results show that, on image segmentation and denoising, our Spiking-UNet achieves comparable performance to its non-spiking counterpart.
arXiv Detail & Related papers (2023-07-20T16:00:19Z)
- Bridging the Gap between ANNs and SNNs by Calibrating Offset Spikes [19.85338979292052]
Spiking Neural Networks (SNNs) have attracted great attention due to their distinctive characteristics of low power consumption and temporal information processing.
ANN-SNN conversion, as the most commonly used training method for applying SNNs, can ensure that converted SNNs achieve comparable performance to ANNs on large-scale datasets.
In this paper, instead of evaluating different conversion errors and then eliminating these errors, we define an offset spike to measure the degree of deviation between actual and desired SNN firing rates (a hedged sketch of this quantity follows the entry).
arXiv Detail & Related papers (2023-02-21T14:10:56Z)
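The "offset spike" above is defined only in words; the snippet below is one hypothetical reading of that quantity under a plain rate-coding assumption. The function name and the rounding convention are illustrative, not the paper's definition.

```python
import numpy as np

def offset_spikes(ann_act, spike_counts, threshold=1.0, T=32):
    """One hypothetical reading of the 'offset spike': the signed number
    of spikes separating what a neuron actually fired over T steps from
    what it would need to fire to reproduce the ANN activation exactly
    under rate coding (activation ~= count * threshold / T)."""
    desired = np.round(ann_act * T / threshold)
    return desired - spike_counts  # > 0: fired too few; < 0: too many

# An activation of 0.5 with threshold 1.0 over 32 steps calls for 16
# spikes; a neuron that fired 14 is off by +2 spikes.
print(offset_spikes(np.array([0.5]), np.array([14])))  # [2.]
```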
- Ultra-low Latency Adaptive Local Binary Spiking Neural Network with Accuracy Loss Estimator [4.554628904670269]
We propose an ultra-low latency adaptive local binary spiking neural network (ALBSNN) with accuracy loss estimators.
Experimental results show that this method can reduce storage space by more than 20% without losing network accuracy.
arXiv Detail & Related papers (2022-07-31T09:03:57Z)
- Training High-Performance Low-Latency Spiking Neural Networks by Differentiation on Spike Representation [70.75043144299168]
Spiking Neural Network (SNN) is a promising energy-efficient AI model when implemented on neuromorphic hardware.
It is a challenge to efficiently train SNNs due to their non-differentiability.
We propose the Differentiation on Spike Representation (DSR) method, which could achieve high performance.
arXiv Detail & Related papers (2022-05-01T12:44:49Z)
- Spatial-Temporal-Fusion BNN: Variational Bayesian Feature Layer [77.78479877473899]
We design a spatial-temporal-fusion BNN for efficiently scaling BNNs to large models.
Compared to vanilla BNNs, our approach can greatly reduce the training time and the number of parameters, which helps scale BNNs efficiently.
arXiv Detail & Related papers (2021-12-12T17:13:14Z)
- Training Feedback Spiking Neural Networks by Implicit Differentiation on the Equilibrium State [66.2457134675891]
Spiking neural networks (SNNs) are brain-inspired models that enable energy-efficient implementation on neuromorphic hardware.
Most existing methods imitate the backpropagation framework and feedforward architectures for artificial neural networks.
We propose a novel training method that does not rely on the exact reverse of the forward computation (a generic sketch of implicit differentiation follows the entry).
arXiv Detail & Related papers (2021-09-29T07:46:54Z)
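The entry names its technique (implicit differentiation at the equilibrium of a feedback network) without spelling it out; the toy numpy sketch below shows that general technique for a tanh fixed-point layer. The dynamics, function name, and shapes are illustrative assumptions, not the paper's model.

```python
import numpy as np

def equilibrium_grad(W, x, grad_loss_a, iters=100):
    """Hedged sketch of implicit differentiation for a feedback layer
    whose averaged state settles at the fixed point a* = tanh(W a* + x).
    Gradients come from the implicit function theorem at equilibrium,
    not from unrolling and reversing the forward iterations.
    Shapes: W (D, D), x (D,), grad_loss_a (D,)."""
    # Forward: iterate the dynamics to an approximate equilibrium.
    a = np.zeros_like(x)
    for _ in range(iters):
        a = np.tanh(W @ a + x)
    # At the fixed point, da* = (I - J)^{-1} diag(f'(z)) (dW a* + dx),
    # where z = W a* + x and J = diag(f'(z)) W.
    fprime = 1.0 - a ** 2              # tanh'(z) expressed via a = tanh(z)
    J = fprime[:, None] * W
    # Adjoint vector: solve (I - J)^T v = dL/da* once, instead of
    # backpropagating through all `iters` forward steps.
    v = np.linalg.solve((np.eye(len(x)) - J).T, grad_loss_a)
    grad_W = np.outer(fprime * v, a)   # dL/dW
    grad_x = fprime * v                # dL/dx
    return a, grad_W, grad_x
```

The point of the single adjoint solve is that memory and compute no longer depend on the number of forward iterations, which matches the entry's claim of avoiding the exact reverse of the forward computation.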
- BSNN: Towards Faster and Better Conversion of Artificial Neural Networks to Spiking Neural Networks with Bistable Neurons [8.555786938446133]
A spiking neural network (SNN) computes and communicates information through discrete binary events.
Recent work has made substantial progress toward excellent performance by converting artificial neural networks (ANNs) to SNNs.
We propose a novel bistable spiking neural network (BSNN) that addresses the problem of spikes of inactivated neurons (SIN) caused by phase lead and phase lag.
arXiv Detail & Related papers (2021-05-27T02:38:02Z)
- Boosting Deep Neural Networks with Geometrical Prior Knowledge: A Survey [77.99182201815763]
Deep Neural Networks (DNNs) achieve state-of-the-art results in many different problem settings.
DNNs are often treated as black box systems, which complicates their evaluation and validation.
One promising field, inspired by the success of convolutional neural networks (CNNs) in computer vision tasks, is to incorporate knowledge about symmetric geometrical transformations.
arXiv Detail & Related papers (2020-06-30T14:56:05Z)