MSAT: Biologically Inspired Multi-Stage Adaptive Threshold for
Conversion of Spiking Neural Networks
- URL: http://arxiv.org/abs/2303.13080v1
- Date: Thu, 23 Mar 2023 07:18:08 GMT
- Title: MSAT: Biologically Inspired Multi-Stage Adaptive Threshold for
Conversion of Spiking Neural Networks
- Authors: Xiang He, Yang Li, Dongcheng Zhao, Qingqun Kong, Yi Zeng
- Abstract summary: Spiking Neural Networks (SNNs) can perform inference with low power consumption due to their spike sparsity.
ANN-SNN conversion is an efficient way to obtain deep SNNs by converting well-trained Artificial Neural Networks (ANNs).
Existing methods commonly use a constant threshold for conversion, which prevents neurons from rapidly delivering spikes to deeper layers.
- Score: 11.392893261073594
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Spiking Neural Networks (SNNs) can perform inference with low power
consumption due to their spike sparsity. ANN-SNN conversion is an efficient way
to obtain deep SNNs by converting well-trained Artificial Neural Networks
(ANNs). However, existing methods commonly use a constant threshold for
conversion, which prevents neurons from rapidly delivering spikes to deeper
layers and causes a high time delay. In addition, giving the same response to
different inputs may cause information loss during transmission. Inspired by
biological neuron mechanisms, we propose a multi-stage adaptive threshold
(MSAT). Specifically, for each neuron, the dynamic threshold varies with the
firing history and input properties: it is positively correlated with the
average membrane potential and negatively correlated with the rate of
depolarization. This self-adaptation to the membrane potential and input allows
a timely adjustment of the threshold, so neurons fire spikes faster and
transmit more information. Moreover, we analyze the Spikes of Inactivated
Neurons error, which is pervasive in early time steps, and accordingly propose
spike confidence as a measure of confidence that neurons deliver spikes
correctly. We use this spike confidence in early time steps to decide whether
to elicit a spike, alleviating the error. We evaluate the proposed methods on
the non-trivial datasets CIFAR-10, CIFAR-100, and ImageNet, and also conduct
sentiment classification and speech recognition experiments on the IMDB and
Google Speech Commands datasets, respectively. Experiments show near-lossless,
lower-latency ANN-SNN conversion. To the best of our knowledge, this is the
first work to build a biologically inspired multi-stage adaptive threshold for
converted SNNs, with performance comparable to state-of-the-art methods while
improving energy efficiency.
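The abstract states the threshold dynamics only qualitatively. A minimal sketch of an integrate-and-fire neuron with such an adaptive threshold is given below; the update rule, the coefficients `alpha` and `beta`, and the soft reset are illustrative assumptions, not the paper's exact equations.

```python
def msat_like_neuron(inputs, v_base=1.0, alpha=0.1, beta=0.1):
    """Integrate-and-fire neuron with an adaptive threshold (illustrative).

    The threshold rises with the running-average membrane potential and
    falls with the rate of depolarization, as the abstract describes;
    alpha, beta, and the update rule itself are assumptions.
    """
    v, v_avg, spikes = 0.0, 0.0, []
    for t, x in enumerate(inputs, start=1):
        dv = x                       # depolarization rate ~ input drive this step
        v += dv                      # integrate input into membrane potential
        v_avg += (v - v_avg) / t     # running average of membrane potential
        theta = v_base + alpha * v_avg - beta * dv   # adaptive threshold
        if v >= theta:
            spikes.append(1)
            v -= theta               # soft reset: keep residual potential
        else:
            spikes.append(0)
    return spikes

print(msat_like_neuron([0.6, 0.9, 0.2, 1.1, 0.4]))
```

Soft reset (subtracting the threshold rather than zeroing the potential) is the usual choice in ANN-SNN conversion because it preserves residual membrane potential across time steps; the spike-confidence mechanism for early time steps is omitted here.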
Related papers
- Discovering Long-Term Effects on Parameter Efficient Fine-tuning [36.83255498301937]
Pre-trained Artificial Neural Networks (ANNs) exhibit robust pattern recognition capabilities.
ANNs share extensive similarities with the human brain, specifically Biological Neural Networks (BNNs).
ANNs can acquire new knowledge through fine-tuning.
arXiv Detail & Related papers (2024-08-24T03:27:29Z) - Fully Spiking Actor Network with Intra-layer Connections for
Reinforcement Learning [51.386945803485084]
We focus on tasks where the agent needs to learn multi-dimensional deterministic policies for control.
Most existing spike-based RL methods take the firing rate as the output of SNNs and convert it into a representation of the continuous action space (i.e., the deterministic policy) through a fully connected layer.
To develop a fully spiking actor network without any floating-point matrix operations, we draw inspiration from the non-spiking interneurons found in insects.
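For context, here is a minimal sketch of the firing-rate readout that the summary describes as the common baseline (not the paper's fully spiking alternative); the sizes and the `tanh` squashing are assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)
T, n_neurons, n_actions = 100, 32, 4

spikes = rng.random((T, n_neurons)) < 0.2     # output-layer spike trains
rates = spikes.mean(axis=0)                   # firing rate per neuron

# Fully connected readout: a floating-point matrix-vector product, i.e.
# exactly the non-spiking operation the paper sets out to remove.
W = rng.standard_normal((n_actions, n_neurons)) * 0.1
action = np.tanh(W @ rates)                   # continuous, bounded actions
print(action)
```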
arXiv Detail & Related papers (2024-01-09T07:31:34Z) - Co-learning synaptic delays, weights and adaptation in spiking neural
networks [0.0]
Spiking neural networks (SNN) distinguish themselves from artificial neural networks (ANN) because of their inherent temporal processing and spike-based computations.
We show that data processing with spiking neurons can be enhanced by co-learning the connection weights with two other biologically inspired neuronal features.
arXiv Detail & Related papers (2023-09-12T09:13:26Z) - Deep Multi-Threshold Spiking-UNet for Image Processing [51.88730892920031]
This paper introduces the novel concept of Spiking-UNet for image processing, which combines the power of Spiking Neural Networks (SNNs) with the U-Net architecture.
To achieve an efficient Spiking-UNet, we face two primary challenges: ensuring high-fidelity information propagation through the network via spikes and formulating an effective training strategy.
Experimental results show that, on image segmentation and denoising, our Spiking-UNet achieves comparable performance to its non-spiking counterpart.
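One toy reading of the multi-threshold idea: the membrane potential is compared against several thresholds at once, so a single step can convey a graded value; the paper's actual neuron model may differ.

```python
def multi_threshold_fire(v, thresholds=(1.0, 2.0, 4.0)):
    """Graded output in a single step: how many thresholds v clears.

    Toy illustration of a multi-threshold neuron; the Spiking-UNet
    paper's exact formulation may differ.
    """
    return sum(v >= t for t in thresholds)

for v in (0.5, 1.3, 2.5, 5.0):
    print(v, "->", multi_threshold_fire(v))
```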
arXiv Detail & Related papers (2023-07-20T16:00:19Z) - Bridging the Gap between ANNs and SNNs by Calibrating Offset Spikes [19.85338979292052]
Spiking Neural Networks (SNNs) have attracted great attention due to their distinctive characteristics of low power consumption and temporal information processing.
ANN-SNN conversion, as the most commonly used training method for applying SNNs, can ensure that converted SNNs achieve comparable performance to ANNs on large-scale datasets.
In this paper, instead of evaluating different conversion errors and then eliminating these errors, we define an offset spike to measure the degree of deviation between actual and desired SNN firing rates.
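Under that definition, an offset spike is the signed gap between the spike count the ANN activation calls for and the count actually fired; a rough sketch, with `a_ann` (ANN activation) and `theta` (threshold) as assumed names:

```python
def offset_spike(a_ann, n_fired, theta, T):
    """Signed gap between desired and actual spike counts (sketch).

    a_ann corresponds to a desired firing rate a_ann / theta, i.e.
    roughly a_ann * T / theta spikes over T steps; the names and the
    rounding are assumptions, not the paper's exact definition.
    """
    n_desired = round(a_ann * T / theta)
    return n_desired - n_fired

print(offset_spike(a_ann=0.72, n_fired=5, theta=1.0, T=8))  # 6 - 5 = 1
```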
arXiv Detail & Related papers (2023-02-21T14:10:56Z) - Training High-Performance Low-Latency Spiking Neural Networks by
Differentiation on Spike Representation [70.75043144299168]
Spiking Neural Network (SNN) is a promising energy-efficient AI model when implemented on neuromorphic hardware.
It is a challenge to efficiently train SNNs due to their non-differentiability.
We propose the Differentiation on Spike Representation (DSR) method, which could achieve high performance.
arXiv Detail & Related papers (2022-05-01T12:44:49Z) - Efficient and Accurate Conversion of Spiking Neural Network with Burst
Spikes [9.210531698373256]
Spiking neural network (SNN) as a brain-inspired energy-efficient neural network has attracted the interest of researchers.
One effective way is to map the weights of a trained ANN to an SNN to achieve high inference capability.
The converted spiking neural network often suffers from performance degradation and a considerable time delay.
We propose a neuron model that releases burst spikes, a cheap but highly efficient mechanism for transmitting residual information.
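Schematically, a burst neuron may fire several spikes in one time step to drain residual membrane potential; `gamma` is an assumed name for the burst cap, and the details differ from the cited paper:

```python
def burst_fire(v, theta=1.0, gamma=4):
    """Fire up to gamma spikes in one step to drain residual potential.

    Schematic only; the cited paper's neuron model may differ in detail.
    """
    n = min(int(v // theta), gamma) if v >= theta else 0
    return n, v - n * theta    # (spikes fired, remaining potential)

print(burst_fire(3.7))   # (3, 0.7)
```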
arXiv Detail & Related papers (2022-04-28T03:48:17Z) - Event-based Video Reconstruction via Potential-assisted Spiking Neural
Network [48.88510552931186]
Bio-inspired neural networks can potentially lead to greater computational efficiency on event-driven hardware.
We propose a novel Event-based Video reconstruction framework based on a fully Spiking Neural Network (EVSNN).
We find that the spiking neurons have the potential to store useful temporal information (memory) to complete such time-dependent tasks.
arXiv Detail & Related papers (2022-01-25T02:05:20Z) - Training Feedback Spiking Neural Networks by Implicit Differentiation on
the Equilibrium State [66.2457134675891]
Spiking neural networks (SNNs) are brain-inspired models that enable energy-efficient implementation on neuromorphic hardware.
Most existing methods imitate the backpropagation framework and feedforward architectures for artificial neural networks.
We propose a novel training method that does not rely on the exact reverse of the forward computation.
arXiv Detail & Related papers (2021-09-29T07:46:54Z) - Incorporating Learnable Membrane Time Constant to Enhance Learning of
Spiking Neural Networks [36.16846259899793]
Spiking Neural Networks (SNNs) have attracted enormous research interest due to temporal information processing capability, low power consumption, and high biological plausibility.
Most existing learning methods learn weights only, and require manual tuning of the membrane-related parameters that determine the dynamics of a single spiking neuron.
In this paper, we take inspiration from the observation that membrane-related parameters are different across brain regions, and propose a training algorithm that is capable of learning not only the synaptic weights but also the membrane time constants of SNNs.
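Making the membrane time constant learnable amounts to registering it as a trainable parameter next to the weights. A minimal PyTorch sketch in that spirit (not the paper's exact parameterization; training through the hard threshold would additionally need a surrogate gradient):

```python
import torch
import torch.nn as nn

class LIFLayer(nn.Module):
    """LIF layer whose membrane time constant tau is trained with the weights.

    Sketch only: the log-tau parameterization (for positivity), the soft
    reset, and the unit threshold are assumptions.
    """
    def __init__(self, n_in, n_out, tau_init=2.0):
        super().__init__()
        self.fc = nn.Linear(n_in, n_out)
        self.log_tau = nn.Parameter(torch.full((n_out,), tau_init).log())

    def forward(self, x_seq):                 # x_seq: (T, batch, n_in)
        tau = self.log_tau.exp()              # keeps tau positive
        v = x_seq.new_zeros(x_seq.shape[1], self.fc.out_features)
        out = []
        for x in x_seq:
            v = v + (self.fc(x) - v) / tau    # leaky integration
            s = (v >= 1.0).float()            # spike on crossing threshold
            v = v - s                         # soft reset
            out.append(s)
        return torch.stack(out)               # (T, batch, n_out)

spikes = LIFLayer(8, 4)(torch.randn(10, 2, 8))
print(spikes.shape)    # torch.Size([10, 2, 4])
```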
arXiv Detail & Related papers (2020-07-11T14:35:42Z) - You Only Spike Once: Improving Energy-Efficient Neuromorphic Inference
to ANN-Level Accuracy [51.861168222799186]
Spiking Neural Networks (SNNs) are a type of neuromorphic, or brain-inspired network.
SNNs are sparse, accessing very few weights, and typically only use addition operations instead of the more power-intensive multiply-and-accumulate operations.
In this work, we aim to overcome the limitations of time-to-first-spike (TTFS)-encoded neuromorphic systems.
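The energy argument rests on spikes being binary: a layer's update reduces to summing the weight columns of the neurons that fired, with no multiplications. A small sketch:

```python
import numpy as np

rng = np.random.default_rng(1)
W = rng.standard_normal((4, 16))       # 4 output neurons, 16 inputs
spikes = rng.random(16) < 0.1          # sparse binary input spikes

mac = W @ spikes.astype(float)         # dense multiply-and-accumulate
acc = W[:, spikes].sum(axis=1)         # additions only: sum active columns

print(np.allclose(mac, acc))           # True: same result without multiplies
```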
arXiv Detail & Related papers (2020-06-03T15:55:53Z)