BSNN: Towards Faster and Better Conversion of Artificial Neural Networks
to Spiking Neural Networks with Bistable Neurons
- URL: http://arxiv.org/abs/2105.12917v1
- Date: Thu, 27 May 2021 02:38:02 GMT
- Title: BSNN: Towards Faster and Better Conversion of Artificial Neural Networks
to Spiking Neural Networks with Bistable Neurons
- Authors: Yang Li, Yi Zeng, Dongcheng Zhao
- Abstract summary: The spiking neural network (SNN) computes and communicates information through discrete binary events.
Recent work has made substantial progress toward high performance by converting artificial neural networks (ANNs) to SNNs.
We propose a novel bistable spiking neural network (BSNN) that addresses the problem of spikes of inactivated neurons (SIN) caused by phase lead and phase lag.
- Score: 8.555786938446133
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: The spiking neural network (SNN) computes and communicates information
through discrete binary events. It is considered more biologically plausible
and more energy-efficient than artificial neural networks (ANNs) on emerging
neuromorphic hardware. However, due to its discontinuous and non-differentiable
characteristics, training an SNN is a relatively challenging task. Recent work
has made substantial progress toward high performance by converting ANNs to
SNNs. Yet, because the two models process information differently, the
converted deep SNN usually suffers from serious performance loss and large
time delays. In this paper, we analyze the reasons for the performance loss
and propose a novel bistable spiking neural network (BSNN) that addresses the
problem of spikes of inactivated neurons (SIN) caused by phase lead and phase
lag. In addition, when ResNet-based ANNs are converted, the information
received by output neurons is incomplete because of the rapid transmission
along the shortcut path. We design synchronous neurons (SN) to address this
issue and efficiently improve performance. Experimental results show that the
proposed method needs only 1/4-1/10 of the time steps required by previous
work to achieve nearly lossless conversion. We demonstrate state-of-the-art
ANN-SNN conversion for VGG16, ResNet20, and ResNet34 on challenging datasets
including CIFAR-10 (95.16% top-1), CIFAR-100 (78.12% top-1), and ImageNet
(72.64% top-1).
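To make the conversion setting concrete, here is a minimal sketch of the standard rate-based ANN-SNN conversion baseline that papers like this one improve on: each ReLU activation is approximated by the firing rate of an integrate-and-fire (IF) neuron, with threshold balancing and a soft reset. This is not the paper's BSNN; the layer, sizes, and constants are illustrative assumptions.

```python
# Minimal sketch (NOT the paper's BSNN) of rate-based ANN-SNN conversion:
# a ReLU activation is approximated by an IF neuron's firing rate.
import numpy as np

rng = np.random.default_rng(0)

# A tiny "pretrained ANN": one linear layer followed by ReLU.
W = rng.normal(scale=0.5, size=(4, 8))  # assumed given weights
x = rng.uniform(size=8)                 # one input sample
ann_out = np.maximum(W @ x, 0.0)        # ReLU activations to approximate

# Threshold balancing: set the firing threshold to the maximum activation
# so that firing rates stay within [0, 1].
v_thresh = ann_out.max()

T = 256                      # number of simulation time steps
v = np.zeros(4)              # membrane potentials
spikes = np.zeros(4)         # spike counts

for _ in range(T):
    v += W @ x               # constant input current at every step
    fired = v >= v_thresh
    spikes += fired
    v[fired] -= v_thresh     # soft reset: keep the residual potential

snn_out = spikes / T * v_thresh  # decoded rate approximates the ReLU output
print("ANN:", np.round(ann_out, 3))
print("SNN:", np.round(snn_out, 3))  # gap shrinks as T grows
```

The gap between the two outputs only closes as T grows, which is exactly the latency problem the abstract describes: BSNN's contribution is reaching near-lossless accuracy with 1/4-1/10 of the time steps.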
Related papers
- Towards Low-latency Event-based Visual Recognition with Hybrid Step-wise Distillation Spiking Neural Networks [50.32980443749865]
Spiking neural networks (SNNs) have garnered significant attention for their low power consumption and high biological plausibility.
Current SNNs struggle to balance accuracy and latency on neuromorphic datasets.
We propose the Hybrid Step-wise Distillation (HSD) method, tailored for neuromorphic datasets.
arXiv Detail & Related papers (2024-09-19T06:52:34Z) - Optimal ANN-SNN Conversion with Group Neurons [39.14228133571838]
Spiking Neural Networks (SNNs) have emerged as a promising third generation of neural networks.
The lack of effective learning algorithms remains a challenge for SNNs.
We introduce a novel type of neuron called Group Neurons (GNs).
arXiv Detail & Related papers (2024-02-29T11:41:12Z) - LC-TTFS: Towards Lossless Network Conversion for Spiking Neural Networks
with TTFS Coding [55.64533786293656]
We show that our algorithm can achieve a near-perfect mapping between the activation values of an ANN and the spike times of an SNN on a number of challenging AI tasks.
The study paves the way for deploying ultra-low-power TTFS-based SNNs on power-constrained edge computing platforms (a toy illustration of TTFS coding appears after this list).
arXiv Detail & Related papers (2023-10-23T14:26:16Z) - Reducing ANN-SNN Conversion Error through Residual Membrane Potential [19.85338979292052]
Spiking Neural Networks (SNNs) have received extensive academic attention due to the unique properties of low power consumption and high-speed computing on neuromorphic chips.
In this paper, we present a detailed analysis of the unevenness error and divide it into four categories.
We propose an optimization strategy based on residual membrane potential to reduce the unevenness error.
arXiv Detail & Related papers (2023-02-04T04:44:31Z) - Training High-Performance Low-Latency Spiking Neural Networks by
Differentiation on Spike Representation [70.75043144299168]
Spiking Neural Network (SNN) is a promising energy-efficient AI model when implemented on neuromorphic hardware.
It is a challenge to efficiently train SNNs due to their non-differentiability.
We propose the Differentiation on Spike Representation (DSR) method, which achieves high performance.
arXiv Detail & Related papers (2022-05-01T12:44:49Z) - Training Feedback Spiking Neural Networks by Implicit Differentiation on
the Equilibrium State [66.2457134675891]
Spiking neural networks (SNNs) are brain-inspired models that enable energy-efficient implementation on neuromorphic hardware.
Most existing methods imitate the backpropagation framework and feedforward architectures of artificial neural networks.
We propose a novel training method that does not rely on the exact reverse of the forward computation.
arXiv Detail & Related papers (2021-09-29T07:46:54Z) - Optimal Conversion of Conventional Artificial Neural Networks to Spiking
Neural Networks [0.0]
Spiking neural networks (SNNs) are biology-inspired artificial neural networks (ANNs).
We propose a novel strategic pipeline that transfers the weights to the target SNN by combining threshold balancing and soft-reset mechanisms (a toy comparison of soft and hard resets appears after this list).
Our method is promising for deployment on embedded platforms with limited energy and memory that offer better support for SNNs.
arXiv Detail & Related papers (2021-02-28T12:04:22Z) - Progressive Tandem Learning for Pattern Recognition with Deep Spiking
Neural Networks [80.15411508088522]
Spiking neural networks (SNNs) have shown advantages over traditional artificial neural networks (ANNs) for low latency and high computational efficiency.
We propose a novel ANN-to-SNN conversion and layer-wise learning framework for rapid and efficient pattern recognition.
arXiv Detail & Related papers (2020-07-02T15:38:44Z) - Training Deep Spiking Neural Networks [0.0]
Brain-inspired spiking neural networks (SNNs) with neuromorphic hardware may offer orders of magnitude higher energy efficiency.
We show that it is possible to train SNNs with a ResNet50 architecture on the CIFAR100 and Imagenette object recognition datasets.
The trained SNN falls behind the analogous ANN in accuracy but requires several orders of magnitude fewer inference time steps.
arXiv Detail & Related papers (2020-06-08T09:47:05Z) - You Only Spike Once: Improving Energy-Efficient Neuromorphic Inference
to ANN-Level Accuracy [51.861168222799186]
Spiking Neural Networks (SNNs) are a type of neuromorphic, or brain-inspired network.
SNNs are sparse, accessing very few weights, and typically only use addition operations instead of the more power-intensive multiply-and-accumulate operations.
In this work, we aim to overcome the limitations of TTFS-encoded neuromorphic systems.
arXiv Detail & Related papers (2020-06-03T15:55:53Z) - RMP-SNN: Residual Membrane Potential Neuron for Enabling Deeper
High-Accuracy and Low-Latency Spiking Neural Network [11.447730771403464]
Spiking Neural Networks (SNNs) have attracted significant research interest as the third generation of artificial neural networks.
We show near loss-less ANN-SNN conversion using RMP neurons for VGG-16, ResNet-20, and ResNet-34 SNNs on challenging datasets.
arXiv Detail & Related papers (2020-02-25T18:19:12Z)
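For the TTFS coding used by the LC-TTFS entry above, here is a toy, hedged illustration of the general idea: each neuron emits a single spike whose timing encodes the value. The linear encode/decode rule below is an assumption for illustration, not the LC-TTFS algorithm itself.

```python
# Toy illustration of time-to-first-spike (TTFS) coding: each analog
# activation is carried by ONE spike, and larger values spike earlier.
# The linear encode/decode rule is an illustrative assumption.
import numpy as np

T = 64                                  # coding window in time steps
acts = np.array([0.9, 0.5, 0.1, 0.0])   # toy ReLU activations in [0, 1]

spike_t = np.round((1.0 - acts) * T).astype(int)  # encode: bigger -> earlier
decoded = 1.0 - spike_t / T                       # decode spike time -> value

print("spike times:", spike_t)   # one spike per neuron -> very low power
print("decoded    :", decoded)   # matches acts up to rounding error
```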
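And for the soft-reset (reset-by-subtraction) mechanism mentioned in the "Optimal Conversion" and "Residual Membrane Potential" entries, a toy single-neuron comparison against a hard reset; the constants are arbitrary assumptions.

```python
# Toy single-neuron comparison of hard reset (to zero) vs. soft reset
# (subtract the threshold). The soft reset keeps the residual membrane
# potential, so the rate-decoded value tracks the input more closely
# at small T.

def if_rate(a: float, v_thresh: float, T: int, soft: bool) -> float:
    """Rate-decoded output of one IF neuron driven by constant current a."""
    v, spikes = 0.0, 0
    for _ in range(T):
        v += a
        if v >= v_thresh:
            spikes += 1
            v = v - v_thresh if soft else 0.0  # soft vs. hard reset
    return spikes / T * v_thresh

a, v_thresh, T = 0.7, 1.0, 32  # target value, threshold, time steps
for soft in (False, True):
    est = if_rate(a, v_thresh, T, soft)
    print(f"{'soft' if soft else 'hard'} reset: "
          f"estimate={est:.3f}, |error|={abs(est - a):.3f}")
# Hard reset discards the residual after each spike (error ~0.2 here);
# soft reset keeps it (error ~0.01).
```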