RMP-SNN: Residual Membrane Potential Neuron for Enabling Deeper
High-Accuracy and Low-Latency Spiking Neural Network
- URL: http://arxiv.org/abs/2003.01811v2
- Date: Wed, 1 Apr 2020 17:27:05 GMT
- Title: RMP-SNN: Residual Membrane Potential Neuron for Enabling Deeper
High-Accuracy and Low-Latency Spiking Neural Network
- Authors: Bing Han, Gopalakrishnan Srinivasan, and Kaushik Roy
- Abstract summary: Spiking Neural Networks (SNNs) have attracted significant research interest as the third generation of artificial neural networks.
We show near loss-less ANN-SNN conversion using RMP neurons for VGG-16, ResNet-20, and ResNet-34 SNNs on challenging datasets.
- Score: 11.447730771403464
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Spiking Neural Networks (SNNs) have recently attracted significant research
interest as the third generation of artificial neural networks that can enable
low-power event-driven data analytics. The best performing SNNs for image
recognition tasks are obtained by converting a trained Analog Neural Network
(ANN), consisting of Rectified Linear Units (ReLU), to an SNN composed of
integrate-and-fire neurons with "proper" firing thresholds. The converted SNNs
typically incur a loss in accuracy compared to the original ANN and require a
sizable number of inference time-steps to achieve the best accuracy. We find
that the performance degradation in the converted SNN stems from using a "hard
reset" spiking neuron, which is driven to a fixed reset potential once its
membrane potential exceeds the firing threshold, leading to information loss
during SNN inference. We propose ANN-SNN conversion using a "soft reset"
spiking neuron model, referred to as the Residual Membrane Potential (RMP)
spiking neuron, which retains the "residual" membrane potential above threshold
at the firing instants. We demonstrate near loss-less ANN-SNN conversion using
RMP neurons for VGG-16, ResNet-20, and ResNet-34 SNNs on challenging datasets
including CIFAR-10 (93.63% top-1), CIFAR-100 (70.93% top-1), and ImageNet
(73.09% top-1 accuracy). Our results also show that RMP-SNN surpasses the best
inference accuracy provided by the converted SNN with "hard reset" spiking
neurons while using 2-8 times fewer inference time-steps across network
architectures and datasets.
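To make the hard-reset versus soft-reset distinction concrete, below is a minimal sketch of both integrate-and-fire update rules. The threshold, input distribution, and simulation length are illustrative assumptions, not values from the paper.

```python
import numpy as np

def simulate_if_neuron(inputs, v_th=1.0, reset="hard"):
    """Integrate-and-fire neuron simulated over len(inputs) time-steps.

    reset="hard": membrane potential is forced to 0 after a spike,
    discarding any charge above threshold (the information loss the
    abstract describes).
    reset="soft": the threshold is subtracted, so the residual
    membrane potential above v_th is retained (RMP-style behavior).
    """
    v, spikes = 0.0, []
    for x in inputs:
        v += x                          # integrate input current
        if v >= v_th:
            spikes.append(1)
            v = 0.0 if reset == "hard" else v - v_th
        else:
            spikes.append(0)
    return np.array(spikes)

rng = np.random.default_rng(0)
inputs = rng.uniform(0.0, 0.7, size=1000)   # toy input current
for mode in ("hard", "soft"):
    rate = simulate_if_neuron(inputs, reset=mode).mean()
    print(f"{mode} reset firing rate: {rate:.3f}")
```

With soft reset, the firing rate closely tracks the total input charge divided by the threshold; with hard reset, the surplus charge at each firing instant is thrown away, so the output rate systematically under-reports the input.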
Related papers
- NAS-BNN: Neural Architecture Search for Binary Neural Networks [55.058512316210056]
We propose a novel neural architecture search scheme for binary neural networks, named NAS-BNN.
Our discovered binary model family outperforms previous BNNs across a wide range of operation counts (OPs), from 20M to 200M.
In addition, we validate the transferability of these searched BNNs on the object detection task; our binary detectors achieve a new state-of-the-art result, e.g., 31.6% mAP with 370M OPs, on the MS COCO dataset.
arXiv Detail & Related papers (2024-08-28T02:17:58Z)
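NAS-BNN's search space is specific to that paper, but the binary-weight building block such networks share can be illustrated generically. The sketch below shows sign-based weight binarization with a straight-through estimator, a standard BNN training ingredient; the tensor shapes and names are illustrative assumptions, not NAS-BNN's implementation.

```python
import torch

class BinarizeSTE(torch.autograd.Function):
    """Forward: binarize weights to {-1, +1} with sign().
    Backward: pass gradients straight through, clipped to |w| <= 1,
    since sign() itself has zero gradient almost everywhere."""
    @staticmethod
    def forward(ctx, w):
        ctx.save_for_backward(w)
        return torch.sign(w)

    @staticmethod
    def backward(ctx, grad_out):
        (w,) = ctx.saved_tensors
        return grad_out * (w.abs() <= 1).float()

w = torch.randn(8, 8, requires_grad=True)
w_bin = BinarizeSTE.apply(w)        # entries are -1 or +1
loss = w_bin.sum() ** 2
loss.backward()                     # gradient reaches w via the STE
print(w_bin.unique(), w.grad.shape)
```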
- Optimal ANN-SNN Conversion with Group Neurons [39.14228133571838]
Spiking Neural Networks (SNNs) have emerged as a promising third generation of neural networks.
The lack of effective learning algorithms remains a challenge for SNNs.
We introduce a novel type of neuron called Group Neurons (GNs).
arXiv Detail & Related papers (2024-02-29T11:41:12Z)
- Low Latency Conversion of Artificial Neural Network Models to Rate-encoded Spiking Neural Networks [11.300257721586432]
Spiking neural networks (SNNs) are well suited for resource-constrained applications.
In a typical rate-encoded SNN, a series of binary spikes within a globally fixed time window is used to fire the neurons.
The aim of this paper is to reduce this time window (and hence the inference latency) while maintaining accuracy when converting ANNs to their equivalent SNNs.
arXiv Detail & Related papers (2022-10-27T08:13:20Z)
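The "series of binary spikes within a globally fixed time window" that this entry describes is rate coding. Below is a minimal Poisson-style rate encoder of the kind commonly used in such conversions; the window length T and the pixel values are illustrative assumptions, not the paper's exact scheme.

```python
import numpy as np

def poisson_rate_encode(pixels, T=100, rng=None):
    """Turn normalized intensities in [0, 1] into a (T, N) binary
    spike train: at each time-step a pixel fires with probability
    equal to its intensity, so the mean firing rate over the window
    encodes the value."""
    rng = rng or np.random.default_rng(0)
    pixels = np.asarray(pixels, dtype=float)
    return (rng.random((T, pixels.size)) < pixels).astype(np.uint8)

spikes = poisson_rate_encode([0.1, 0.5, 0.9], T=200)
print(spikes.mean(axis=0))  # approx [0.1, 0.5, 0.9]
```

Shrinking T reduces latency but makes the observed rates noisier, which is exactly the accuracy/latency trade-off this entry targets.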
- Training High-Performance Low-Latency Spiking Neural Networks by Differentiation on Spike Representation [70.75043144299168]
Spiking Neural Network (SNN) is a promising energy-efficient AI model when implemented on neuromorphic hardware.
Efficiently training SNNs is challenging due to their non-differentiability.
We propose the Differentiation on Spike Representation (DSR) method, which can achieve high performance.
arXiv Detail & Related papers (2022-05-01T12:44:49Z)
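The non-differentiability mentioned here comes from the spike function's zero-almost-everywhere derivative. DSR's spike-representation formulation is specific to that paper; purely as a generic illustration of the problem (not the DSR method), the sketch below shows the common surrogate-gradient workaround, where a smooth derivative stands in for the Heaviside step on the backward pass.

```python
import torch

class SpikeSurrogate(torch.autograd.Function):
    """Forward: Heaviside step on v (membrane potential minus threshold).
    Backward: derivative of a fast sigmoid as a smooth surrogate,
    since the true derivative of the step is zero almost everywhere."""
    @staticmethod
    def forward(ctx, v):
        ctx.save_for_backward(v)
        return (v >= 0).float()

    @staticmethod
    def backward(ctx, grad_out):
        (v,) = ctx.saved_tensors
        k = 10.0                                  # surrogate sharpness
        surrogate = k / (1.0 + k * v.abs()) ** 2
        return grad_out * surrogate

v = torch.randn(5, requires_grad=True)
s = SpikeSurrogate.apply(v)    # binary spikes in the forward pass
s.sum().backward()             # smooth gradients in the backward pass
print(s, v.grad)
```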
- Event-based Video Reconstruction via Potential-assisted Spiking Neural Network [48.88510552931186]
Bio-inspired neural networks can potentially lead to greater computational efficiency on event-driven hardware.
We propose a novel Event-based Video reconstruction framework based on a fully Spiking Neural Network (EVSNN).
We find that the spiking neurons have the potential to store useful temporal information (memory) to complete such time-dependent tasks.
arXiv Detail & Related papers (2022-01-25T02:05:20Z)
- BSNN: Towards Faster and Better Conversion of Artificial Neural Networks to Spiking Neural Networks with Bistable Neurons [8.555786938446133]
A spiking neural network (SNN) computes and communicates information through discrete binary events.
Recent work has made substantial progress toward excellent performance by converting artificial neural networks (ANNs) to SNNs.
We propose a novel bistable spiking neural network (BSNN) that addresses the problem of spikes of inactivated neurons (SIN) caused by phase lead and phase lag.
arXiv Detail & Related papers (2021-05-27T02:38:02Z)
- Pruning of Deep Spiking Neural Networks through Gradient Rewiring [41.64961999525415]
Spiking Neural Networks (SNNs) have attracted great attention due to their biological plausibility and high energy efficiency on neuromorphic chips.
Most existing methods directly apply pruning approaches developed for artificial neural networks (ANNs) to SNNs, ignoring the differences between ANNs and SNNs.
We propose gradient rewiring (Grad R), a joint learning algorithm of connectivity and weight for SNNs, that enables us to seamlessly optimize the network structure without retraining.
arXiv Detail & Related papers (2021-05-11T10:05:53Z)
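Grad R's joint connectivity-and-weight formulation is specific to that paper. To illustrate what "directly applying ANN pruning approaches to SNNs" looks like, the sketch below applies plain global magnitude pruning to a weight tensor; it is not the Grad R algorithm, and the sparsity level is an illustrative assumption.

```python
import numpy as np

def magnitude_prune(weights, sparsity=0.9):
    """Generic ANN-style pruning: zero out the smallest-magnitude
    weights until the requested fraction is removed. Applied as-is
    to an SNN, this ignores spike dynamics, which is the mismatch
    the Grad R paper points out."""
    flat = np.abs(weights).ravel()
    k = int(sparsity * flat.size)
    threshold = np.partition(flat, k)[k]       # k-th smallest magnitude
    mask = np.abs(weights) >= threshold
    return weights * mask, mask

rng = np.random.default_rng(0)
w = rng.normal(size=(64, 64))
w_pruned, mask = magnitude_prune(w, sparsity=0.9)
print(f"kept {mask.mean():.2%} of weights")
```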
- Optimal Conversion of Conventional Artificial Neural Networks to Spiking Neural Networks [0.0]
Spiking neural networks (SNNs) are biology-inspired artificial neural networks (ANNs).
We propose a novel strategic pipeline that transfers the weights to the target SNN by combining threshold balancing and soft-reset mechanisms.
Our method is promising for deployment on embedded platforms with limited energy and memory, where SNNs are better supported.
arXiv Detail & Related papers (2021-02-28T12:04:22Z)
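Threshold balancing, one of the two mechanisms this entry names, is a standard ingredient of ANN-SNN conversion pipelines: each layer's firing threshold is set from the maximum ReLU activation observed on calibration data, so rate-coded spike counts stay in range. Below is a minimal sketch, assuming a simple fully connected ReLU stack; the percentile choice and shapes are illustrative assumptions, not the paper's exact procedure.

```python
import numpy as np

def balanced_thresholds(weights, calib_x, percentile=99.9):
    """For each layer, run the trained ANN on calibration data and
    record a robust maximum of the ReLU activations; that value
    becomes the layer's firing threshold in the converted SNN."""
    thresholds, a = [], calib_x
    for W in weights:
        a = np.maximum(a @ W, 0.0)               # ANN forward pass (ReLU)
        thresholds.append(np.percentile(a, percentile))
    return thresholds

rng = np.random.default_rng(0)
weights = [rng.normal(scale=0.5, size=(16, 32)),
           rng.normal(scale=0.5, size=(32, 10))]
calib_x = rng.random((256, 16))                  # calibration batch
print(balanced_thresholds(weights, calib_x))
```

Using a high percentile rather than the absolute maximum is a common robustness tweak, trading a little clipping for much lower thresholds and faster firing.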
- Progressive Tandem Learning for Pattern Recognition with Deep Spiking Neural Networks [80.15411508088522]
Spiking neural networks (SNNs) have shown advantages over traditional artificial neural networks (ANNs) in terms of low latency and high computational efficiency.
We propose a novel ANN-to-SNN conversion and layer-wise learning framework for rapid and efficient pattern recognition.
arXiv Detail & Related papers (2020-07-02T15:38:44Z)
- You Only Spike Once: Improving Energy-Efficient Neuromorphic Inference to ANN-Level Accuracy [51.861168222799186]
Spiking Neural Networks (SNNs) are a type of neuromorphic, or brain-inspired, network.
SNNs are sparse, accessing very few weights, and typically only use addition operations instead of the more power-intensive multiply-and-accumulate operations.
In this work, we aim to overcome the limitations of time-to-first-spike (TTFS) encoded neuromorphic systems.
arXiv Detail & Related papers (2020-06-03T15:55:53Z)
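TTFS coding is what makes the systems in this last entry so sparse: unlike rate coding, each neuron fires at most once, and a stronger input simply fires earlier. A minimal encoder sketch follows; the linear latency mapping and window length T are illustrative assumptions, not this paper's scheme.

```python
import numpy as np

def ttfs_encode(pixels, T=100):
    """Map normalized intensities in [0, 1] to first-spike times:
    larger values fire earlier, and each input spikes exactly once,
    so an image costs at most one spike per pixel."""
    pixels = np.asarray(pixels, dtype=float)
    times = np.round((1.0 - pixels) * (T - 1)).astype(int)
    spikes = np.zeros((T, pixels.size), dtype=np.uint8)
    spikes[times, np.arange(pixels.size)] = 1
    return spikes

spikes = ttfs_encode([0.9, 0.5, 0.1], T=10)
print(spikes.argmax(axis=0))  # earliest spike for the brightest input
```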