Low Latency of object detection for spiking neural network
- URL: http://arxiv.org/abs/2309.15555v1
- Date: Wed, 27 Sep 2023 10:26:19 GMT
- Title: Low Latency of object detection for spiking neural network
- Authors: Nemin Qiu and Chuang Zhu
- Abstract summary: Spiking Neural Networks are well-suited for edge AI applications due to their binary spike nature.
In this paper, we focus on generating highly accurate and low-latency SNNs specifically for object detection.
- Score: 3.404826786562694
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Spiking Neural Networks, as third-generation neural networks, are
well-suited for edge AI applications due to their binary spike nature. However,
when it comes to complex tasks like object detection, SNNs often require a
substantial number of time steps to achieve high performance. This limitation
significantly hampers the widespread adoption of SNNs in latency-sensitive edge
devices. In this paper, our focus is on generating highly accurate and
low-latency SNNs specifically for object detection. Firstly, we systematically
derive the conversion between SNNs and ANNs and analyze how to improve the
consistency between them: improving the spike firing rate and reducing the
quantization error. Then we propose a structural replacement, quantization of
ANN activation, and a residual fix to alleviate the disparity. We evaluate our
method on the challenging MS COCO and PASCAL VOC datasets and on our spike dataset. The
experimental results show that the proposed method achieves higher accuracy and
lower latency compared to the previous work Spiking-YOLO. The advantages of SNNs in
processing spike signals are also demonstrated.
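
The consistency argument in the abstract can be made concrete with a minimal sketch of rate-coded ANN-to-SNN conversion: an integrate-and-fire neuron with soft reset, driven by a constant input for T time steps, reproduces a clipped and quantized ReLU activation. The function names, the threshold theta, and the soft-reset dynamics below are illustrative assumptions, not code from the paper.

```python
import numpy as np

def quantized_relu(a, theta, T):
    """ANN side: ReLU clipped at threshold theta and quantized to T levels.
    This is the activation the converted SNN is expected to reproduce."""
    a = np.clip(a, 0.0, theta)
    return np.floor(a * T / theta) * (theta / T)

def if_neuron_rate(a, theta, T):
    """SNN side: integrate-and-fire neuron driven by a constant input `a`
    for T time steps with firing threshold theta and soft reset.
    Returns the spike count scaled back to activation units."""
    v, spikes = 0.0, 0
    for _ in range(T):
        v += a                      # integrate the constant input current
        if v >= theta:              # fire when the membrane potential crosses the threshold
            spikes += 1
            v -= theta              # soft reset keeps the residual charge
    return spikes * theta / T

# For constant input and zero initial potential the two sides match exactly;
# relative to the real-valued ReLU both carry a quantization error of at most
# theta/T, so more time steps (a higher attainable firing rate) shrink the
# ANN-SNN disparity the abstract targets.
a, theta, T = 0.37, 1.0, 8
print(quantized_relu(a, theta, T), if_neuron_rate(a, theta, T))
```

In a real network the layer inputs vary across time steps and residual membrane charge accumulates, which is the kind of ANN-SNN disparity that conversion fixes such as those listed in the abstract try to alleviate.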
Related papers
- Towards Low-latency Event-based Visual Recognition with Hybrid Step-wise Distillation Spiking Neural Networks [50.32980443749865]
Spiking neural networks (SNNs) have garnered significant attention for their low power consumption and high biological plausibility.
Current SNNs struggle to balance accuracy and latency on neuromorphic datasets.
We propose the Hybrid Step-wise Distillation (HSD) method, tailored for neuromorphic datasets.
arXiv Detail & Related papers (2024-09-19T06:52:34Z)
- Toward End-to-End Bearing Fault Diagnosis for Industrial Scenarios with Spiking Neural Networks [6.686258023516048]
Spiking neural networks (SNNs) transmit information via low-power binary spikes.
We propose a Multi-scale Residual Attention SNN (MRA-SNN) to improve efficiency, performance, and robustness of SNN methods.
MRA-SNN significantly outperforms existing methods in terms of accuracy, energy consumption, and noise robustness, and is more feasible for deployment in real-world industrial scenarios.
arXiv Detail & Related papers (2024-08-17T06:41:58Z)
- Efficient and Effective Time-Series Forecasting with Spiking Neural Networks [47.371024581669516]
Spiking neural networks (SNNs) provide a unique pathway for capturing the intricacies of temporal data.
Applying SNNs to time-series forecasting is challenging due to difficulties in effective temporal alignment, complexities in encoding processes, and the absence of standardized guidelines for model selection.
We propose a framework for SNNs in time-series forecasting tasks, leveraging the efficiency of spiking neurons in processing temporal information.
arXiv Detail & Related papers (2024-02-02T16:23:50Z)
- Highly Efficient SNNs for High-speed Object Detection [7.3074002563489024]
Experimental results show that our efficient SNN achieves a 118X speedup on GPU with only 1.5 MB of parameters for object detection tasks.
We further verify our SNN on an FPGA platform, where the proposed model achieves 800+ FPS object detection with extremely low latency.
arXiv Detail & Related papers (2023-09-27T10:31:12Z)
- Enabling energy-efficient object detection with surrogate gradient descent in spiking neural networks [0.40054215937601956]
Spiking Neural Networks (SNNs) are a biologically plausible neural network model with significant advantages in both event-driven processing and temporal information processing.
In this study, we introduce the Current Mean Decoding (CMD) method, which solves the regression problem to facilitate the training of deep SNNs for object detection tasks.
Based on the surrogate gradient and CMD, we propose the SNN-YOLOv3 model for object detection; a minimal surrogate-gradient sketch appears after this list.
arXiv Detail & Related papers (2023-09-07T15:48:00Z)
- Inherent Redundancy in Spiking Neural Networks [24.114844269113746]
Spiking Neural Networks (SNNs) are a promising energy-efficient alternative to conventional artificial neural networks.
In this work, we focus on three key questions regarding inherent redundancy in SNNs.
We propose an Advance Spatial Attention (ASA) module to harness SNNs' inherent redundancy.
arXiv Detail & Related papers (2023-08-16T08:58:25Z)
- Spiking Neural Network for Ultra-low-latency and High-accurate Object Detection [18.037802439500858]
Spiking Neural Networks (SNNs) have garnered widespread interest for their energy efficiency and brain-inspired event-driven properties.
Recent methods like Spiking-YOLO have expanded the SNNs to more challenging object detection tasks.
They often suffer from high latency and low detection accuracy, making them difficult to deploy on latency-sensitive mobile platforms.
arXiv Detail & Related papers (2023-06-21T04:21:40Z)
- Training High-Performance Low-Latency Spiking Neural Networks by Differentiation on Spike Representation [70.75043144299168]
Spiking Neural Network (SNN) is a promising energy-efficient AI model when implemented on neuromorphic hardware.
It is a challenge to efficiently train SNNs due to their non-differentiability.
We propose the Differentiation on Spike Representation (DSR) method, which could achieve high performance.
arXiv Detail & Related papers (2022-05-01T12:44:49Z)
- Comparative Analysis of Interval Reachability for Robust Implicit and Feedforward Neural Networks [64.23331120621118]
We use interval reachability analysis to obtain robustness guarantees for implicit neural networks (INNs).
INNs are a class of implicit learning models that use implicit equations as layers.
We show that our approach performs at least as well as, and generally better than, applying state-of-the-art interval bound propagation methods to INNs.
arXiv Detail & Related papers (2022-04-01T03:31:27Z)
- Progressive Tandem Learning for Pattern Recognition with Deep Spiking Neural Networks [80.15411508088522]
Spiking neural networks (SNNs) have shown advantages over traditional artificial neural networks (ANNs) for low latency and high computational efficiency.
We propose a novel ANN-to-SNN conversion and layer-wise learning framework for rapid and efficient pattern recognition.
arXiv Detail & Related papers (2020-07-02T15:38:44Z)
- Rectified Linear Postsynaptic Potential Function for Backpropagation in Deep Spiking Neural Networks [55.0627904986664]
Spiking Neural Networks (SNNs) use temporal spike patterns to represent and transmit information, which is not only biologically realistic but also suitable for ultra-low-power, event-driven neuromorphic implementation.
This paper investigates the contribution of spike timing dynamics to information encoding, synaptic plasticity, and decision making, providing a new perspective on the design of future deep SNNs and neuromorphic hardware systems.
arXiv Detail & Related papers (2020-03-26T11:13:07Z)
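
As a companion to the surrogate-gradient entry above (SNN-YOLOv3 with Current Mean Decoding), the sketch below shows the generic surrogate-gradient trick that such direct-training methods rely on: a Heaviside spike in the forward pass and a rectangular pseudo-derivative in the backward pass, unrolled over a few steps of a leaky integrate-and-fire neuron. The threshold, surrogate window, leak constant, and the mean-rate readout are illustrative assumptions; this is not the CMD method or the SNN-YOLOv3 architecture.

```python
import torch

THRESHOLD, WIDTH = 1.0, 0.5  # illustrative firing threshold and surrogate window

class SurrogateSpike(torch.autograd.Function):
    """Heaviside spike in the forward pass, rectangular surrogate gradient in the backward pass."""
    @staticmethod
    def forward(ctx, v):
        ctx.save_for_backward(v)
        return (v >= THRESHOLD).float()

    @staticmethod
    def backward(ctx, grad_output):
        (v,) = ctx.saved_tensors
        # Gradient is passed through only near the threshold; elsewhere it is zero.
        surrogate = ((v - THRESHOLD).abs() < WIDTH).float() / (2 * WIDTH)
        return grad_output * surrogate

def lif_step(v, x, tau=2.0):
    """One leaky integrate-and-fire step: leak, integrate input, fire, reset."""
    v = v + (x - v) / tau           # leaky integration of the input current
    s = SurrogateSpike.apply(v)     # non-differentiable spike with surrogate gradient
    v = v * (1.0 - s)               # reset the membrane potential where a spike occurred
    return v, s

# Toy usage: unroll T steps and use the mean spike rate as a crude regression
# readout (a stand-in for a decoding scheme, not the paper's CMD method).
T = 8
x = torch.tensor([0.3, 0.8, 1.2, 1.8], requires_grad=True)  # constant input currents
v, outputs = torch.zeros(4), []
for _ in range(T):
    v, s = lif_step(v, x)
    outputs.append(s)
rate = torch.stack(outputs).mean(0)
rate.sum().backward()               # gradients reach x through the surrogate
print(rate, x.grad)
```

The rectangular surrogate means only neurons whose membrane potential sits near the threshold contribute gradients, which is what lets backpropagation train through the otherwise non-differentiable spike.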
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented here and is not responsible for any consequences of its use.