Shrinking Your TimeStep: Towards Low-Latency Neuromorphic Object
Recognition with Spiking Neural Network
- URL: http://arxiv.org/abs/2401.01912v1
- Date: Tue, 2 Jan 2024 02:05:05 GMT
- Title: Shrinking Your TimeStep: Towards Low-Latency Neuromorphic Object
Recognition with Spiking Neural Network
- Authors: Yongqi Ding, Lin Zuo, Mengmeng Jing, Pei He and Yongjun Xiao
- Abstract summary: Neuromorphic object recognition with spiking neural networks (SNNs) is the cornerstone of low-power neuromorphic computing.
Existing SNNs suffer from significant latency, requiring 10 to 40 or more timesteps to recognize neuromorphic objects.
In this work, we propose the Shrinking SNN (SSNN) to achieve low-latency neuromorphic object recognition without reducing performance.
- Score: 5.174808367448261
- License: http://creativecommons.org/licenses/by-nc-nd/4.0/
- Abstract: Neuromorphic object recognition with spiking neural networks (SNNs) is the
cornerstone of low-power neuromorphic computing. However, existing SNNs suffer
from significant latency, requiring 10 to 40 or more timesteps to recognize
neuromorphic objects. At low latencies, the performance of existing SNNs is
drastically degraded. In this work, we propose the Shrinking SNN (SSNN) to
achieve low-latency neuromorphic object recognition without reducing
performance. Concretely, we alleviate the temporal redundancy in SNNs by
dividing SNNs into multiple stages with progressively shrinking timesteps,
which significantly reduces the inference latency. During timestep shrinkage,
the temporal transformer smoothly transforms the temporal scale and maximally
preserves information. Moreover, we add multiple early classifiers to the
SNN during training to mitigate the mismatch between the surrogate gradient and
the true gradient, as well as vanishing/exploding gradients, thus
eliminating the performance degradation at low latency. Extensive experiments
on neuromorphic datasets, CIFAR10-DVS, N-Caltech101, and DVS-Gesture have
revealed that SSNN improves the baseline accuracy by 6.55% to 21.41%. With an
average of only 5 timesteps and without any data augmentation, SSNN achieves
an accuracy of 73.63% on CIFAR10-DVS. This work presents a
heterogeneous temporal scale SNN and provides valuable insights into the
development of high-performance, low-latency SNNs.
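The abstract describes the architecture only at a high level. The following PyTorch-style sketch illustrates the timestep-shrinkage idea under stated assumptions: the stage layout, channel sizes, timestep schedule (8, 4, 2), and the TemporalShrink module are illustrative stand-ins, not the authors' implementation, and the spiking neurons are replaced by a placeholder activation for brevity.
```python
# Illustrative sketch only: module names, the timestep schedule, and the
# soft-weighting "shrink" are assumptions, not the paper's actual code.
import torch
import torch.nn as nn

class TemporalShrink(nn.Module):
    """Stand-in for the paper's temporal transformer: compresses T_in
    timesteps into T_out with a learned soft weighting over timesteps."""
    def __init__(self, t_in, t_out):
        super().__init__()
        self.proj = nn.Parameter(torch.randn(t_out, t_in) * 0.1)

    def forward(self, x):                               # x: [T_in, B, ...]
        w = torch.softmax(self.proj, dim=1)             # [T_out, T_in]
        return torch.einsum('ot,tb...->ob...', w, x)    # [T_out, B, ...]

class SSNNSketch(nn.Module):
    def __init__(self, timesteps=(8, 4, 2), num_classes=10):
        super().__init__()
        chans = (2, 32, 64, 128)                        # assumed channel schedule
        self.stages = nn.ModuleList()
        self.shrinks = nn.ModuleList()
        self.heads = nn.ModuleList()
        for i, _ in enumerate(timesteps):
            self.stages.append(nn.Sequential(
                nn.Conv2d(chans[i], chans[i + 1], 3, stride=2, padding=1),
                nn.BatchNorm2d(chans[i + 1]),
                nn.ReLU()))                             # placeholder for a spiking neuron
            if i + 1 < len(timesteps):
                self.shrinks.append(TemporalShrink(timesteps[i], timesteps[i + 1]))
            self.heads.append(nn.Sequential(            # early classifier
                nn.AdaptiveAvgPool2d(1), nn.Flatten(),
                nn.LazyLinear(num_classes)))

    def forward(self, x):                               # x: [T0, B, 2, H, W] event frames
        logits = []
        for i, stage in enumerate(self.stages):
            x = torch.stack([stage(x[t]) for t in range(x.shape[0])])
            logits.append(self.heads[i](x.mean(0)))     # rate-coded readout
            if i < len(self.shrinks):
                x = self.shrinks[i](x)                  # shrink the temporal dimension
        return logits
```
During training, the cross-entropy losses of all heads would be combined so that the early classifiers shape the surrogate-gradient updates; at inference only the final head is needed. The assumed (8, 4, 2) schedule averages about 4.7 timesteps, in the spirit of the paper's 5 average timesteps.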
Related papers
- Towards Low-latency Event-based Visual Recognition with Hybrid Step-wise Distillation Spiking Neural Networks [50.32980443749865]
Spiking neural networks (SNNs) have garnered significant attention for their low power consumption and high biological plausibility.
Current SNNs struggle to balance accuracy and latency on neuromorphic datasets.
We propose the Hybrid Step-wise Distillation (HSD) method, tailored for neuromorphic datasets.
arXiv Detail & Related papers (2024-09-19T06:52:34Z)
- Enhancing Adversarial Robustness in SNNs with Sparse Gradients [46.15229142258264]
Spiking Neural Networks (SNNs) have attracted great attention for their energy-efficient operations and biologically inspired structures.
Existing techniques, whether adapted from ANNs or specifically designed for SNNs, exhibit limitations in training SNNs or defending against strong attacks.
We propose a novel approach to enhance the robustness of SNNs through gradient sparsity regularization (a generic sketch of such a regularizer appears after this list).
arXiv Detail & Related papers (2024-05-30T05:39:27Z)
- When Bio-Inspired Computing meets Deep Learning: Low-Latency, Accurate, & Energy-Efficient Spiking Neural Networks from Artificial Neural Networks [22.721987637571306]
Spiking Neural Networks (SNNs) are demonstrating accuracy comparable to convolutional neural networks (CNNs).
ANN-to-SNN conversion has recently gained significant traction in developing deep SNNs with close to state-of-the-art (SOTA) test accuracy on complex image recognition tasks.
We propose a novel ANN-to-SNN conversion framework that requires exponentially fewer timesteps than SOTA conversion approaches.
arXiv Detail & Related papers (2023-12-12T00:10:45Z)
- STCSNN: High energy efficiency spike-train level spiking neural networks with spatio-temporal conversion [4.892303151981707]
Brain-inspired spiking neural networks (SNNs) have attracted widespread research interest due to their low-power features, high biological plausibility, and strong information processing capability.
Although adopting a surrogate gradient (SG) makes the non-differentiable SNN trainable, simultaneously achieving accuracy comparable to ANNs and keeping low-power features is still tricky (a minimal surrogate-gradient neuron is sketched after this list).
In this paper, we propose an energy-efficient spiking neural network with spatio-temporal conversion, which has low computational cost and high accuracy.
arXiv Detail & Related papers (2023-07-14T03:27:34Z)
- Training High-Performance Low-Latency Spiking Neural Networks by Differentiation on Spike Representation [70.75043144299168]
Spiking Neural Network (SNN) is a promising energy-efficient AI model when implemented on neuromorphic hardware.
It is a challenge to efficiently train SNNs due to their non-differentiability.
We propose the Differentiation on Spike Representation (DSR) method, which achieves high performance.
arXiv Detail & Related papers (2022-05-01T12:44:49Z)
- Spatial-Temporal-Fusion BNN: Variational Bayesian Feature Layer [77.78479877473899]
We design a spatial-temporal-fusion BNN for efficiently scaling BNNs to large models.
Compared to vanilla BNNs, our approach greatly reduces the training time and the number of parameters, which helps scale BNNs efficiently.
arXiv Detail & Related papers (2021-12-12T17:13:14Z)
- BSNN: Towards Faster and Better Conversion of Artificial Neural Networks to Spiking Neural Networks with Bistable Neurons [8.555786938446133]
A spiking neural network (SNN) computes and communicates information through discrete binary events.
Recent work has achieved substantial progress in performance by converting artificial neural networks (ANNs) to SNNs.
We propose a novel bistable spiking neural network (BSNN) that addresses the problem of spikes of inactivated neurons (SIN) caused by the phase lead and phase lag.
arXiv Detail & Related papers (2021-05-27T02:38:02Z)
- Sparse Spiking Gradient Descent [2.741266294612776]
We present the first sparse SNN backpropagation algorithm, which achieves the same or better accuracy than current state-of-the-art methods.
We show the effectiveness of our method on real datasets of varying complexity.
arXiv Detail & Related papers (2021-05-18T20:00:55Z)
- Progressive Tandem Learning for Pattern Recognition with Deep Spiking Neural Networks [80.15411508088522]
Spiking neural networks (SNNs) have shown advantages over traditional artificial neural networks (ANNs) in low latency and high computational efficiency.
We propose a novel ANN-to-SNN conversion and layer-wise learning framework for rapid and efficient pattern recognition.
arXiv Detail & Related papers (2020-07-02T15:38:44Z)
- You Only Spike Once: Improving Energy-Efficient Neuromorphic Inference to ANN-Level Accuracy [51.861168222799186]
Spiking Neural Networks (SNNs) are a type of neuromorphic, or brain-inspired, network.
SNNs are sparse, accessing very few weights, and typically only use addition operations instead of the more power-intensive multiply-and-accumulate operations.
In this work, we aim to overcome the limitations of TTFS-encoded neuromorphic systems (a toy time-to-first-spike encoder is sketched after this list).
arXiv Detail & Related papers (2020-06-03T15:55:53Z)
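For the "Enhancing Adversarial Robustness in SNNs with Sparse Gradients" entry above, a gradient-sparsity regularizer can be sketched as a double-backpropagation penalty on the input gradient. This is an assumed, textbook-style formulation, not that paper's exact loss:
```python
# Generic gradient-sparsity penalty (assumed form, not the paper's exact loss):
# an L1 term on the input gradient encourages sparse input sensitivity.
import torch
import torch.nn.functional as F

def sparse_gradient_loss(model, x, y, lam=1e-3):
    x = x.clone().requires_grad_(True)
    ce = F.cross_entropy(model(x), y)
    # create_graph=True keeps the penalty differentiable (double backprop)
    g, = torch.autograd.grad(ce, x, create_graph=True)
    return ce + lam * g.abs().mean()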
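For the surrogate gradient (SG) mentioned in the STCSNN entry and in the main abstract, a minimal textbook leaky integrate-and-fire neuron looks like this; the rectangular surrogate window and the constants are illustrative choices, not taken from any of the listed papers:
```python
# Minimal surrogate-gradient LIF step (textbook form, illustrative constants).
import torch

class SpikeFn(torch.autograd.Function):
    @staticmethod
    def forward(ctx, v):
        ctx.save_for_backward(v)
        return (v > 0).float()                 # Heaviside spike: gradient is zero a.e.

    @staticmethod
    def backward(ctx, grad_out):
        (v,) = ctx.saved_tensors
        # rectangular surrogate: pass gradient only near the firing threshold
        return grad_out * (v.abs() < 0.5).float()

def lif_step(x, v, tau=2.0, v_th=1.0):
    """One LIF update: leak toward input x, spike, then soft reset."""
    v = v + (x - v) / tau
    s = SpikeFn.apply(v - v_th)
    return s, v - s * v_th
```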
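Finally, the "You Only Spike Once" entry refers to time-to-first-spike (TTFS) encoding. A toy encoder, again an illustration of the general idea rather than that paper's scheme, maps each intensity to a single early-or-late spike, which is what keeps TTFS activity sparse:
```python
# Toy TTFS encoder: brighter inputs spike earlier; one spike per neuron.
import torch

def ttfs_encode(x, num_steps=16):
    """Map intensities in [0, 1] to one-hot spike trains of length num_steps."""
    t = ((1.0 - x.clamp(0, 1)) * (num_steps - 1)).round().long()
    spikes = torch.zeros(num_steps, *x.shape)
    spikes.scatter_(0, t.unsqueeze(0), 1.0)    # place the single spike at time t
    return spikes
```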