Spikformer: When Spiking Neural Network Meets Transformer
- URL: http://arxiv.org/abs/2209.15425v1
- Date: Thu, 29 Sep 2022 14:16:49 GMT
- Title: Spikformer: When Spiking Neural Network Meets Transformer
- Authors: Zhaokun Zhou, Yuesheng Zhu, Chao He, Yaowei Wang, Shuicheng Yan,
Yonghong Tian, Li Yuan
- Abstract summary: We consider two biologically plausible structures, the Spiking Neural Network (SNN) and the self-attention mechanism.
We propose a novel Spiking Self Attention (SSA) as well as a powerful framework, named Spiking Transformer (Spikformer).
- Score: 102.91330530210037
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: We consider two biologically plausible structures, the Spiking Neural Network
(SNN) and the self-attention mechanism. The former offers an energy-efficient
and event-driven paradigm for deep learning, while the latter has the ability
to capture feature dependencies, enabling Transformer to achieve good
performance. It is intuitively promising to explore the marriage between them.
In this paper, we consider leveraging both self-attention capability and
biological properties of SNNs, and propose a novel Spiking Self Attention (SSA)
as well as a powerful framework, named Spiking Transformer (Spikformer). The
SSA mechanism in Spikformer models sparse visual features by using
spike-form Query, Key, and Value without softmax. Since its computation is
sparse and avoids multiplication, SSA is efficient and has low computational
energy consumption. It is shown that Spikformer with SSA can outperform the
state-of-the-art SNN frameworks in image classification on both
neuromorphic and static datasets. Spikformer (66.3M parameters), comparable in
size to SEW-ResNet-152 (60.2M parameters, 69.26% accuracy), achieves 74.81% top-1
accuracy on ImageNet using 4 time steps, which is state-of-the-art among
directly trained SNN models.
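
As a rough illustration of the SSA described in the abstract, the sketch below applies spike-form Query, Key, and Value without softmax: because Q, K, and V contain only 0s and 1s, the attention map is naturally non-negative and the matrix products reduce to sparse additions. The tensor shapes, scale factor, and the hard threshold standing in for the paper's spiking neuron are assumptions for illustration, not the authors' exact implementation.

```python
import torch

def spiking_self_attention(Q, K, V, scale=0.125):
    # Q, K, V: binary spike tensors of shape (T, N, d) --
    # T time steps, N tokens, d feature dimension.
    # No softmax is needed: every entry of Q and K is 0 or 1,
    # so the attention map is already non-negative.
    attn = torch.matmul(Q, K.transpose(-2, -1)) * scale   # (T, N, N)
    out = torch.matmul(attn, V)                            # (T, N, d)
    # A simple threshold stands in for the LIF spiking neuron that
    # would convert the result back to spike form in Spikformer.
    return (out >= 1.0).float()

# Toy usage with random spike inputs (hypothetical shapes).
T, N, d = 4, 16, 32
Q = (torch.rand(T, N, d) < 0.2).float()
K = (torch.rand(T, N, d) < 0.2).float()
V = (torch.rand(T, N, d) < 0.2).float()
print(spiking_self_attention(Q, K, V).shape)  # torch.Size([4, 16, 32])
```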