Is Conventional SNN Really Efficient? A Perspective from Network Quantization
- URL: http://arxiv.org/abs/2311.10802v1
- Date: Fri, 17 Nov 2023 09:48:22 GMT
- Title: Is Conventional SNN Really Efficient? A Perspective from Network Quantization
- Authors: Guobin Shen, Dongcheng Zhao, Tenglong Li, Jindong Li, Yi Zeng
- Abstract summary: Spiking Neural Networks (SNNs) have been widely praised for their high energy efficiency and immense potential.
However, comprehensive research that critically contrasts and correlates SNNs with quantized Artificial Neural Networks (ANNs) remains scant.
This paper introduces a unified perspective, illustrating that the time steps in SNNs and quantized bit-widths of activation values present analogous representations.
- Score: 7.04833025737147
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Spiking Neural Networks (SNNs) have been widely praised for their high energy
efficiency and immense potential. However, comprehensive research that
critically contrasts and correlates SNNs with quantized Artificial Neural
Networks (ANNs) remains scant, often leading to skewed comparisons lacking
fairness towards ANNs. This paper introduces a unified perspective,
illustrating that the time steps in SNNs and quantized bit-widths of activation
values present analogous representations. Building on this, we present a more
pragmatic and rational approach to estimating the energy consumption of SNNs.
Diverging from the conventional Synaptic Operations (SynOps), we champion the
"Bit Budget" concept. This notion permits an intricate discourse on
strategically allocating computational and storage resources between weights,
activation values, and temporal steps under stringent hardware constraints.
Guided by the Bit Budget paradigm, we discern that pivoting efforts towards
spike patterns and weight quantization, rather than temporal attributes,
elicits profound implications for model performance. Utilizing the Bit Budget
for holistic design consideration of SNNs elevates model performance across
diverse data types, encompassing static imagery and neuromorphic datasets. Our
revelations bridge the theoretical chasm between SNNs and quantized ANNs and
illuminate a pragmatic trajectory for future endeavors in energy-efficient
neural computations.
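To make the abstract's central analogy concrete, here is a minimal Python sketch. It is an illustration under assumptions of our own (the abstract does not give the paper's exact Bit Budget formula; `bit_budget` below is a hypothetical accounting): summing binary spikes over T time steps yields an integer in [0, T], i.e. roughly log2(T + 1) bits of activation precision, so a fixed budget of bits can be traded between weights, activations, and time steps.

```python
import math

def activation_levels(time_steps: int) -> int:
    """Summing binary spikes over T time steps yields an integer in
    [0, T], i.e. T + 1 distinct levels -- roughly log2(T + 1) bits of
    precision. This mirrors the abstract's claim that SNN time steps
    and quantized activation bit-widths are analogous representations."""
    return time_steps + 1

def bit_budget(w_bits: int, a_bits: int, time_steps: int,
               n_weights: int, n_activations: int) -> int:
    """Hypothetical bit accounting: bits stored for weights plus bits
    moved for activations across all time steps. The paper's actual
    Bit Budget definition may differ from this sketch."""
    return n_weights * w_bits + n_activations * a_bits * time_steps

# Two toy-layer configurations that spend the same total budget:
n_w, n_a = 1_000, 100
snn_like = bit_budget(w_bits=8, a_bits=1, time_steps=4,
                      n_weights=n_w, n_activations=n_a)  # binary spikes, 4 steps
ann_like = bit_budget(w_bits=8, a_bits=4, time_steps=1,
                      n_weights=n_w, n_activations=n_a)  # 4-bit activations, 1 step
assert snn_like == ann_like == 8_400
print(f"effective activation precision at T=4: "
      f"{math.log2(activation_levels(4)):.2f} bits")
```

Under this toy accounting, a binary-spike network run for four time steps and a 4-bit single-step network spend the same budget; this is the kind of weight/activation/time-step trade-off the Bit Budget perspective makes explicit.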
Related papers
- Temporal Spiking Neural Networks with Synaptic Delay for Graph Reasoning [91.29876772547348]
Spiking neural networks (SNNs) are investigated as biologically inspired models of neural computation.
This paper reveals that SNNs, when amalgamated with synaptic delay and temporal coding, are proficient in executing (knowledge) graph reasoning.
arXiv Detail & Related papers (2024-05-27T05:53:30Z)
- Efficient and Effective Time-Series Forecasting with Spiking Neural Networks [47.371024581669516]
Spiking neural networks (SNNs) provide a unique pathway for capturing the intricacies of temporal data.
Applying SNNs to time-series forecasting is challenging due to difficulties in effective temporal alignment, complexities in encoding processes, and the absence of standardized guidelines for model selection.
We propose a framework for SNNs in time-series forecasting tasks, leveraging the efficiency of spiking neurons in processing temporal information.
arXiv Detail & Related papers (2024-02-02T16:23:50Z)
- Are SNNs Truly Energy-efficient? $-$ A Hardware Perspective [7.539212567508529]
Spiking Neural Networks (SNNs) have gained attention for their energy-efficient machine learning capabilities.
This work studies two hardware benchmarking platforms for large-scale SNN inference, namely SATA and SpikeSim.
arXiv Detail & Related papers (2023-09-06T22:23:22Z)
- An Automata-Theoretic Approach to Synthesizing Binarized Neural Networks [13.271286153792058]
Quantized neural networks (QNNs) have been developed, with binarized neural networks (BNNs), whose values are restricted to binary, as a special case.
This paper presents an automata-theoretic approach to synthesizing BNNs that meet designated properties.
arXiv Detail & Related papers (2023-07-29T06:27:28Z)
- Adaptive-SpikeNet: Event-based Optical Flow Estimation using Spiking Neural Networks with Learnable Neuronal Dynamics [6.309365332210523]
Spiking Neural Networks (SNNs) with their neuro-inspired event-driven processing can efficiently handle asynchronous data.
We propose an adaptive fully-spiking framework with learnable neuronal dynamics to alleviate the spike vanishing problem; a generic sketch of such learnable dynamics appears after this list.
Our experiments show an average reduction of 13% in average endpoint error (AEE) compared to state-of-the-art ANNs.
arXiv Detail & Related papers (2022-09-21T21:17:56Z)
- On the Intrinsic Structures of Spiking Neural Networks [66.57589494713515]
Recent years have seen a surge of interest in SNNs owing to their remarkable potential to handle time-dependent and event-driven data.
However, there has been a dearth of comprehensive studies examining the impact of intrinsic structures within spiking computations.
This work delves into the intrinsic structures of SNNs, elucidating their influence on the expressivity of SNNs.
arXiv Detail & Related papers (2022-06-21T09:42:30Z)
- SATA: Sparsity-Aware Training Accelerator for Spiking Neural Networks [4.44525458129903]
Spiking Neural Networks (SNNs) have gained substantial attention as a potential energy-efficient alternative to conventional Artificial Neural Networks (ANNs).
We introduce SATA (Sparsity-Aware Training Accelerator), a BPTT-based training accelerator for SNNs.
By exploiting this sparsity, SATA improves its computation energy efficiency by $5.58\times$ compared to the same design without sparsity support (see the sparsity sketch after this list).
arXiv Detail & Related papers (2022-04-11T21:49:45Z)
- Comparative Analysis of Interval Reachability for Robust Implicit and Feedforward Neural Networks [64.23331120621118]
We use interval reachability analysis to obtain robustness guarantees for implicit neural networks (INNs).
INNs are a class of implicit learning models that use implicit equations as layers.
We show that our approach performs at least as well as, and generally better than, applying state-of-the-art interval bound propagation methods to INNs (a generic interval-propagation sketch appears after this list).
arXiv Detail & Related papers (2022-04-01T03:31:27Z)
- Hybrid SNN-ANN: Energy-Efficient Classification and Object Detection for Event-Based Vision [64.71260357476602]
Event-based vision sensors encode local pixel-wise brightness changes in streams of events rather than image frames.
Recent progress in object recognition from event-based sensors has come from conversions of deep neural networks.
We propose a hybrid architecture for end-to-end training of deep neural networks for event-based pattern recognition and object detection.
arXiv Detail & Related papers (2021-12-06T23:45:58Z)
- Progressive Tandem Learning for Pattern Recognition with Deep Spiking Neural Networks [80.15411508088522]
Spiking neural networks (SNNs) have shown advantages over traditional artificial neural networks (ANNs) in terms of low latency and high computational efficiency.
We propose a novel ANN-to-SNN conversion and layer-wise learning framework for rapid and efficient pattern recognition.
arXiv Detail & Related papers (2020-07-02T15:38:44Z)
- SiamSNN: Siamese Spiking Neural Networks for Energy-Efficient Object Tracking [20.595208488431766]
SiamSNN is the first deep SNN tracker that achieves short latency and low precision loss on the visual object tracking benchmarks OTB2013, VOT2016, and GOT-10k.
SiamSNN notably achieves low energy consumption and real-time performance on the neuromorphic chip TrueNorth.
arXiv Detail & Related papers (2020-03-17T08:49:51Z)
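As background for the Adaptive-SpikeNet entry above: "learnable neuronal dynamics" typically means making neuron parameters such as the membrane decay trainable. The following is a generic leaky integrate-and-fire (LIF) sketch with a per-neuron learnable decay; it is an illustrative assumption, not the paper's actual model (full training would also need a surrogate gradient through the spike nonlinearity).

```python
import torch

class LIFNeuron(torch.nn.Module):
    """Generic leaky integrate-and-fire layer with a learnable membrane
    decay -- an illustration of 'learnable neuronal dynamics', not the
    Adaptive-SpikeNet architecture itself."""

    def __init__(self, num_neurons: int, threshold: float = 1.0):
        super().__init__()
        # One trainable decay per neuron, kept in (0, 1) via a sigmoid.
        self.decay_logit = torch.nn.Parameter(torch.zeros(num_neurons))
        self.threshold = threshold

    def forward(self, inputs: torch.Tensor) -> torch.Tensor:
        # inputs: (time_steps, batch, num_neurons) presynaptic currents.
        decay = torch.sigmoid(self.decay_logit)
        v = torch.zeros_like(inputs[0])
        spikes = []
        for x_t in inputs:
            v = decay * v + x_t                   # leaky integration
            s_t = (v >= self.threshold).float()   # fire on threshold crossing
            # (real training would replace this hard step with a surrogate gradient)
            v = v - s_t * self.threshold          # soft reset by subtraction
            spikes.append(s_t)
        return torch.stack(spikes)

# Usage: 4 time steps, batch of 2, 8 neurons.
out = LIFNeuron(8)(torch.rand(4, 2, 8))
print(out.shape)  # torch.Size([4, 2, 8])
```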
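As background for the SATA entry: the reported gain comes from skipping computation on zero-valued inputs. The toy energy model below is our own assumption, not SATA's actual accounting; it simply shows how accumulate energy scales inversely with spike density.

```python
import numpy as np

def accumulation_energy(spikes: np.ndarray, e_ac: float = 1.0) -> float:
    """Toy model (assumed, not SATA's): each nonzero spike triggers one
    accumulate (AC) operation; zero entries are skipped entirely.
    e_ac is the per-AC energy in arbitrary units."""
    return float(np.count_nonzero(spikes)) * e_ac

rng = np.random.default_rng(0)
dense = np.ones(100_000)                             # every input fires
sparse = (rng.random(100_000) < 0.18).astype(float)  # ~82% of spikes absent

gain = accumulation_energy(dense) / accumulation_energy(sparse)
print(f"efficiency gain from skipping zeros: {gain:.2f}x")
# At this assumed density the gain is of the same order as SATA's
# reported 5.58x; the actual figure depends on workload and hardware.
```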
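As background for the interval reachability entry: the baseline it compares against is interval bound propagation (IBP). Below is the standard IBP step through a single affine layer in center/radius form; this is the generic textbook technique, not the paper's INN-specific method.

```python
import numpy as np

def ibp_affine(W, b, lower, upper):
    """Propagate the box [lower, upper] through y = W @ x + b using the
    standard identity: center' = W @ center + b, radius' = |W| @ radius."""
    center = (upper + lower) / 2.0
    radius = (upper - lower) / 2.0
    new_center = W @ center + b
    new_radius = np.abs(W) @ radius
    return new_center - new_radius, new_center + new_radius

# Toy 2->2 layer; every input within +/-0.1 of (1, 1) is covered.
W = np.array([[1.0, -2.0], [0.5, 3.0]])
b = np.array([0.0, 1.0])
lo, hi = ibp_affine(W, b, np.array([0.9, 0.9]), np.array([1.1, 1.1]))
print(lo, hi)  # sound output bounds for any input in the box
```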