Attention Spiking Neural Networks
- URL: http://arxiv.org/abs/2209.13929v1
- Date: Wed, 28 Sep 2022 09:00:45 GMT
- Title: Attention Spiking Neural Networks
- Authors: Man Yao, Guangshe Zhao, Hengyu Zhang, Yifan Hu, Lei Deng, Yonghong
Tian, Bo Xu, and Guoqi Li
- Abstract summary: We study the effect of attention mechanisms in spiking neural networks (SNNs)
A new attention SNN architecture with end-to-end training, termed "MA-SNN", is proposed.
Experiments are conducted in event-based DVS128 Gesture/Gait action recognition and ImageNet-1k image classification.
- Score: 32.591900260554326
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Benefiting from the event-driven and sparse spiking characteristics of the
brain, spiking neural networks (SNNs) are becoming an energy-efficient
alternative to artificial neural networks (ANNs). However, the performance gap
between SNNs and ANNs has long been a major obstacle to deploying SNNs
ubiquitously. To leverage the full potential of SNNs, we study the effect of
attention mechanisms in SNNs. We first present our idea of attention with a
plug-and-play kit, termed the Multi-dimensional Attention (MA). Then, a new
attention SNN architecture with end-to-end training called "MA-SNN" is
proposed, which infers attention weights along the temporal, channel, and
spatial dimensions, separately or simultaneously. Based on existing
neuroscience theories, we exploit the attention weights to optimize membrane
potentials, which in turn regulate the spiking response in a data-dependent
way. At the cost of negligible additional parameters, MA enables vanilla
SNNs to achieve sparser spiking activity, better performance, and higher
energy efficiency concurrently. Experiments are conducted on event-based
DVS128 Gesture/Gait action recognition and ImageNet-1K image classification. On
Gesture/Gait, the spike counts are reduced by 84.9%/81.6%, and the task
accuracy and energy efficiency are improved by 5.9%/4.7% and
3.4$\times$/3.2$\times$. On ImageNet-1K, we achieve top-1 accuracy of 75.92%
and 77.08% with single-step/4-step Res-SNN-104, which are state-of-the-art
results among SNNs. To the best of our knowledge, this is the first time the
SNN community has achieved performance comparable to or even better than its
ANN counterpart on a large-scale dataset. Our work highlights the potential of
SNNs as a general backbone for various applications, striking a strong balance
between effectiveness and efficiency.
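To make the mechanism concrete, the following is a minimal PyTorch-style sketch (not the authors' implementation) of attention weights rescaling membrane-potential updates inside a LIF layer. The `ChannelAttention` and `AttentionLIF` names, the `[T, B, C, H, W]` tensor layout, and the hard-reset dynamics are illustrative assumptions; the surrogate gradients required for end-to-end training are omitted for brevity.

```python
# Minimal sketch (not the authors' code): channel attention rescales the
# membrane-potential update of a LIF layer.
import torch
import torch.nn as nn

class ChannelAttention(nn.Module):
    """SE-style channel attention; MA-SNN combines temporal, channel,
    and spatial variants of this idea."""
    def __init__(self, channels: int, reduction: int = 4):
        super().__init__()
        self.fc = nn.Sequential(
            nn.Linear(channels, channels // reduction), nn.ReLU(),
            nn.Linear(channels // reduction, channels), nn.Sigmoid(),
        )

    def forward(self, u: torch.Tensor) -> torch.Tensor:
        # u: input current / potential update, shape [B, C, H, W]
        w = self.fc(u.mean(dim=(2, 3)))           # squeeze -> [B, C]
        return u * w.unsqueeze(-1).unsqueeze(-1)  # excite  -> [B, C, H, W]

class AttentionLIF(nn.Module):
    def __init__(self, channels: int, tau: float = 2.0, v_th: float = 1.0):
        super().__init__()
        self.att = ChannelAttention(channels)
        self.tau, self.v_th = tau, v_th

    def forward(self, x_seq: torch.Tensor) -> torch.Tensor:
        # x_seq: [T, B, C, H, W]; returns a binary spike train of same shape
        v = torch.zeros_like(x_seq[0])
        spikes = []
        for x in x_seq:
            v = v + (self.att(x) - v) / self.tau  # attention-refined update
            s = (v >= self.v_th).float()          # fire
            v = v * (1.0 - s)                     # hard reset
            spikes.append(s)
        return torch.stack(spikes)
```

A temporal or spatial variant would pool over different axes and broadcast its weights accordingly; applying several variants in sequence yields the multi-dimensional case.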
Related papers
- Membership Privacy Evaluation in Deep Spiking Neural Networks [32.42695393291052]
Unlike artificial neural networks (ANNs), whose neurons apply non-linear functions and output floating-point numbers, Spiking Neural Networks (SNNs) communicate through discrete binary spikes.
In this paper, we evaluate the membership privacy of SNNs by considering eight MIAs.
We show that SNNs are more vulnerable than ANNs (up to 10% higher balanced attack accuracy) when both are trained on neuromorphic datasets.
arXiv Detail & Related papers (2024-09-28T17:13:04Z)
- NAS-BNN: Neural Architecture Search for Binary Neural Networks [55.058512316210056]
We propose a novel neural architecture search scheme for binary neural networks, named NAS-BNN.
Our discovered binary model family outperforms previous BNNs for a wide range of operations (OPs) from 20M to 200M.
In addition, we validate the transferability of the searched BNNs on the object detection task; our binary detectors achieve a new state-of-the-art result, e.g., 31.6% mAP with 370M OPs, on the MS COCO dataset.
arXiv Detail & Related papers (2024-08-28T02:17:58Z)
- Enhancing Adversarial Robustness in SNNs with Sparse Gradients [46.15229142258264]
Spiking Neural Networks (SNNs) have attracted great attention for their energy-efficient operations and biologically inspired structures.
Existing techniques, whether adapted from ANNs or specifically designed for SNNs, exhibit limitations in training SNNs or defending against strong attacks.
We propose a novel approach to enhance the robustness of SNNs through gradient sparsity regularization (a minimal sketch follows this entry).
arXiv Detail & Related papers (2024-05-30T05:39:27Z)
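As a rough illustration of what gradient sparsity regularization can look like in practice, here is a hedged sketch: an L1 penalty on the gradient of the task loss with respect to the input is added to the objective. The function name, the `lambda_sparse` coefficient, and the double-backward pattern are assumptions for illustration, not the paper's exact formulation; for an SNN, the model must be differentiable (e.g., via surrogate gradients) for this to apply.

```python
# Hedged sketch of gradient sparsity regularization: an L1 penalty on the
# input gradient is added to the task loss.
import torch
import torch.nn.functional as F

def loss_with_gradient_sparsity(model, x, y, lambda_sparse=1e-3):
    x = x.clone().requires_grad_(True)
    task_loss = F.cross_entropy(model(x), y)
    # Gradient of the loss w.r.t. the input; keep the graph so the penalty
    # itself is differentiable (double backward)
    (input_grad,) = torch.autograd.grad(task_loss, x, create_graph=True)
    penalty = input_grad.abs().mean()  # encourages sparse input gradients
    return task_loss + lambda_sparse * penalty
```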
- Spikeformer: A Novel Architecture for Training High-Performance Low-Latency Spiking Neural Network [6.8125324121155275]
We propose a novel Transformer-based SNN, termed "Spikeformer", which outperforms its ANN counterpart on both static and neuromorphic datasets.
Remarkably, our Spikeformer outperforms other SNNs on ImageNet by a large margin (i.e., more than 5%) and even outperforms its ANN counterpart by 3.1% and 2.2% on DVS-Gesture and ImageNet, respectively.
arXiv Detail & Related papers (2022-11-19T12:49:22Z)
- Adaptive-SpikeNet: Event-based Optical Flow Estimation using Spiking Neural Networks with Learnable Neuronal Dynamics [6.309365332210523]
Spiking Neural Networks (SNNs), with their neuro-inspired event-driven processing, can efficiently handle asynchronous data.
We propose an adaptive fully-spiking framework with learnable neuronal dynamics to alleviate the spike vanishing problem (see the sketch after this entry).
Our experiments show an average 13% reduction in average endpoint error (AEE) compared to state-of-the-art ANNs.
arXiv Detail & Related papers (2022-09-21T21:17:56Z)
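As an illustration of learnable neuronal dynamics, the sketch below shows a LIF neuron whose per-channel leak and firing threshold are trained jointly with the network weights. The parameter names, initializations, and soft-reset rule are assumptions, not the paper's exact model.

```python
# Illustrative sketch of learnable neuronal dynamics: a LIF neuron whose
# per-channel leak and firing threshold are trained with the weights.
import torch
import torch.nn as nn

class LearnableLIF(nn.Module):
    def __init__(self, num_channels: int):
        super().__init__()
        self.leak_logit = nn.Parameter(torch.zeros(num_channels))  # leak in (0, 1)
        self.v_th = nn.Parameter(torch.ones(num_channels))         # firing threshold

    def forward(self, x_seq: torch.Tensor) -> torch.Tensor:
        # x_seq: [T, B, C]; learned leak/threshold let deep layers keep
        # enough activity to counter vanishing spikes
        leak = torch.sigmoid(self.leak_logit)
        v = torch.zeros_like(x_seq[0])
        out = []
        for x in x_seq:
            v = leak * v + x
            s = (v >= self.v_th).float()
            v = v - s * self.v_th  # soft reset preserves residual potential
            out.append(s)
        return torch.stack(out)
```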
- SNN2ANN: A Fast and Memory-Efficient Training Framework for Spiking Neural Networks [117.56823277328803]
Spiking neural networks are efficient computation models for low-power environments.
We propose an SNN-to-ANN (SNN2ANN) framework to train SNNs in a fast and memory-efficient way.
Experiment results show that our SNN2ANN-based models perform well on the benchmark datasets.
arXiv Detail & Related papers (2022-06-19T16:52:56Z)
- Training High-Performance Low-Latency Spiking Neural Networks by Differentiation on Spike Representation [70.75043144299168]
Spiking Neural Network (SNN) is a promising energy-efficient AI model when implemented on neuromorphic hardware.
It is a challenge to efficiently train SNNs due to their non-differentiability.
We propose the Differentiation on Spike Representation (DSR) method, which achieves high performance.
arXiv Detail & Related papers (2022-05-01T12:44:49Z)
- Advancing Residual Learning towards Powerful Deep Spiking Neural Networks [16.559670769601038]
Residual learning and shortcuts have proven to be an important approach for training deep neural networks.
MS-ResNet significantly extends the depth of directly trained SNNs (a block sketch follows this entry).
MS-ResNet-104 achieves 76.02% accuracy on ImageNet, a first in the domain of directly trained SNNs.
arXiv Detail & Related papers (2021-12-15T05:47:21Z)
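The sketch below illustrates one way such a residual block can be arranged, in the spirit of MS-ResNet: the spiking activation precedes each convolution, so the identity shortcut carries real-valued (membrane-like) features. Layer sizes and the `spike_fn` placeholder are illustrative assumptions, not the paper's exact design.

```python
# Rough sketch of an activation-before-addition ("membrane shortcut")
# residual block in the spirit of MS-ResNet.
import torch
import torch.nn as nn

def spike_fn(v: torch.Tensor) -> torch.Tensor:
    # Placeholder Heaviside activation; training would use a surrogate gradient
    return (v >= 1.0).float()

class MSResidualBlock(nn.Module):
    def __init__(self, channels: int):
        super().__init__()
        self.conv1 = nn.Conv2d(channels, channels, 3, padding=1, bias=False)
        self.bn1 = nn.BatchNorm2d(channels)
        self.conv2 = nn.Conv2d(channels, channels, 3, padding=1, bias=False)
        self.bn2 = nn.BatchNorm2d(channels)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Spike -> Conv -> BN, twice; the identity shortcut stays real-valued,
        # which helps gradients flow through very deep stacks
        out = self.bn1(self.conv1(spike_fn(x)))
        out = self.bn2(self.conv2(spike_fn(out)))
        return x + out
```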
- Spiking Neural Networks for Visual Place Recognition via Weighted Neuronal Assignments [24.754429120321365]
Spiking neural networks (SNNs) offer compelling potential advantages, including energy efficiency and low latency.
One promising area for high-performance SNNs is template matching and image recognition.
This research introduces the first high-performance SNN for the Visual Place Recognition (VPR) task.
arXiv Detail & Related papers (2021-09-14T05:40:40Z)
- Progressive Tandem Learning for Pattern Recognition with Deep Spiking Neural Networks [80.15411508088522]
Spiking neural networks (SNNs) have shown advantages over traditional artificial neural networks (ANNs) in low latency and high computational efficiency.
We propose a novel ANN-to-SNN conversion and layer-wise learning framework for rapid and efficient pattern recognition.
arXiv Detail & Related papers (2020-07-02T15:38:44Z)
- You Only Spike Once: Improving Energy-Efficient Neuromorphic Inference to ANN-Level Accuracy [51.861168222799186]
Spiking Neural Networks (SNNs) are a type of neuromorphic, or brain-inspired network.
SNNs are sparse, accessing very few weights, and typically use only accumulate (AC) operations instead of the more power-hungry multiply-and-accumulate (MAC) operations (see the energy estimate after this entry).
In this work, we aim to overcome the limitations of time-to-first-spike (TTFS)-encoded neuromorphic systems.
arXiv Detail & Related papers (2020-06-03T15:55:53Z)
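To see why addition-only inference matters, here is a back-of-the-envelope estimate using commonly cited 45 nm energy figures (about 4.6 pJ per 32-bit MAC versus about 0.9 pJ per accumulate); the operation counts and the 20% firing rate are made-up illustrative numbers, not results from any paper.

```python
# Back-of-the-envelope energy comparison with commonly cited 45 nm figures.
E_MAC = 4.6e-12  # joules per multiply-and-accumulate
E_AC = 0.9e-12   # joules per accumulate

ann_ops = 2.0e9              # hypothetical ANN forward pass: 2 GMACs
snn_ops = ann_ops * 0.2      # same connectivity, but only ~20% of spikes fire

ann_energy = ann_ops * E_MAC  # ~9.2 mJ
snn_energy = snn_ops * E_AC   # ~0.36 mJ
print(f"ANN ~{ann_energy * 1e3:.2f} mJ, SNN ~{snn_energy * 1e3:.2f} mJ, "
      f"saving ~{ann_energy / snn_energy:.0f}x")
```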
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the generated content (including all information) and is not responsible for any consequences of its use.