Rethinking Pretraining as a Bridge from ANNs to SNNs
- URL: http://arxiv.org/abs/2203.01158v3
- Date: Fri, 4 Mar 2022 03:07:22 GMT
- Title: Rethinking Pretraining as a Bridge from ANNs to SNNs
- Authors: Yihan Lin, Yifan Hu, Shijie Ma, Guoqi Li, Dongjie Yu
- Abstract summary: Spiking neural networks (SNNs) are a class of brain-inspired models with unique features.
Obtaining a high-accuracy model has long been the main challenge in the SNN field.
- License: http://creativecommons.org/licenses/by-nc-nd/4.0/
- Abstract: Spiking neural networks (SNNs) are a class of brain-inspired
models characterized by rich neuronal dynamics, diverse coding schemes, and
low power consumption. Obtaining a high-accuracy model has long been the main
challenge in the SNN field. Currently there are two mainstream methods:
converting a well-trained Artificial Neural Network (ANN) into its SNN
counterpart, or training an SNN directly. However, the inference time of a
converted SNN is too long, while direct SNN training is generally costly and
inefficient. In this work, a new SNN training paradigm is proposed that
combines the two methods with the help of a pretraining technique and a
BP-based deep SNN training mechanism. We believe the proposed paradigm is a
more efficient pipeline for training SNNs. The pipeline includes pipeS for
static data transfer tasks and pipeD for dynamic data transfer tasks. SOTA
results are obtained on the large-scale event-driven dataset ES-ImageNet. For
training acceleration, we achieve the same (or higher) best accuracy as
similar LIF-SNNs using 1/10 of the training time on ImageNet-1K and 2/5 of
the training time on ES-ImageNet, and we also provide a time-accuracy
benchmark for a new dataset, ES-UCF101. These results reveal the similarity
of the functions of parameters between ANNs and SNNs and demonstrate the
various potential applications of this SNN training pipeline.
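To make the paradigm concrete, below is a minimal PyTorch sketch of the general pretrain-transfer-finetune idea: pretrain an ANN, copy its weights into a structurally matched LIF-SNN, then fine-tune with surrogate-gradient BP. This is an illustration under assumed layer shapes and a generic rectangular surrogate, not the paper's pipeS/pipeD implementation.

```python
# Sketch: pretrain an ANN, transfer weights to a matched LIF-SNN,
# fine-tune with surrogate-gradient BP. All names are illustrative.
import torch
import torch.nn as nn

class SurrogateSpike(torch.autograd.Function):
    """Heaviside spike with a rectangular surrogate gradient."""
    @staticmethod
    def forward(ctx, v_mem, threshold=1.0):
        ctx.save_for_backward(v_mem)
        ctx.threshold = threshold
        return (v_mem >= threshold).float()

    @staticmethod
    def backward(ctx, grad_out):
        (v_mem,) = ctx.saved_tensors
        # Pass gradients only near the threshold (width-1 rectangle).
        surrogate = (torch.abs(v_mem - ctx.threshold) < 0.5).float()
        return grad_out * surrogate, None

class LIFLayer(nn.Module):
    """Wraps a linear layer with leaky integrate-and-fire dynamics."""
    def __init__(self, in_f, out_f, tau=2.0):
        super().__init__()
        self.fc = nn.Linear(in_f, out_f)
        self.tau = tau

    def forward(self, x_seq):                # x_seq: (T, B, in_f)
        v = torch.zeros(x_seq.shape[1], self.fc.out_features)
        spikes = []
        for x_t in x_seq:
            v = v / self.tau + self.fc(x_t)  # leak, then integrate
            s = SurrogateSpike.apply(v)
            v = v * (1.0 - s)                # hard reset after a spike
            spikes.append(s)
        return torch.stack(spikes)

# 1) Pretrain an ANN with the same layer shapes (training loop omitted).
ann = nn.Sequential(nn.Linear(784, 256), nn.ReLU(), nn.Linear(256, 10))

# 2) Transfer the pretrained weights into the matched SNN layers.
snn_hidden, snn_out = LIFLayer(784, 256), LIFLayer(256, 10)
snn_hidden.fc.load_state_dict(ann[0].state_dict())
snn_out.fc.load_state_dict(ann[2].state_dict())

# 3) Fine-tune the SNN with surrogate-gradient BP over T time steps.
T, batch = 4, 32
x_seq = torch.rand(T, batch, 784)            # rate-coded input sketch
rates = snn_out(snn_hidden(x_seq)).mean(0)   # firing rates as logits
loss = nn.functional.cross_entropy(rates, torch.randint(0, 10, (batch,)))
loss.backward()                              # gradients flow via the surrogate
```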
Related papers
- Optimising Event-Driven Spiking Neural Network with Regularisation and Cutoff (arXiv, 2023-01-23)
Spiking neural networks (SNNs) offer promising improvements in computational efficiency.
Current SNN training methodologies predominantly employ a fixed-timestep approach.
We propose a cutoff mechanism that can terminate the SNN at any point during inference to achieve efficient inference (sketched below).
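As an illustration of the cutoff idea, here is a minimal sketch using an assumed confidence-gap criterion (the paper's regularisation and exact cutoff rule differ): output spikes are accumulated step by step, and inference stops as soon as the top class leads the runner-up by a fixed margin. `snn_step`, `max_T`, and `margin` are hypothetical names.

```python
# Anytime "cutoff" inference sketch with an assumed confidence-gap rule.
import torch

def cutoff_inference(snn_step, x, max_T=32, margin=3.0):
    """snn_step(x, t) -> per-step output spike counts, shape (num_classes,)."""
    acc = None
    for t in range(max_T):
        out_t = snn_step(x, t)
        acc = out_t if acc is None else acc + out_t
        top2 = torch.topk(acc, 2).values
        if top2[0] - top2[1] >= margin:       # confident enough: stop early
            return acc.argmax().item(), t + 1
    return acc.argmax().item(), max_T         # fall back to the full window

# Toy stand-in for a real SNN step function.
dummy = lambda x, t: torch.nn.functional.one_hot(torch.tensor(3), 10).float()
label, steps = cutoff_inference(dummy, x=None)
print(label, steps)   # -> 3, 3 (stops once the margin is reached)
```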
- Spikeformer: A Novel Architecture for Training High-Performance Low-Latency Spiking Neural Network (arXiv, 2022-11-19)
We propose a novel Transformer-based SNN, termed "Spikeformer", which outperforms its ANN counterpart on both static and neuromorphic datasets.
Remarkably, our Spikeformer outperforms other SNNs on ImageNet by a large margin (more than 5%) and even outperforms its ANN counterpart by 3.1% on DVS-Gesture and 2.2% on ImageNet.
- SNN2ANN: A Fast and Memory-Efficient Training Framework for Spiking Neural Networks (arXiv, 2022-06-19)
Spiking neural networks are efficient computation models for low-power environments.
We propose an SNN-to-ANN (SNN2ANN) framework to train SNNs in a fast and memory-efficient way.
Experimental results show that our SNN2ANN-based models perform well on the benchmark datasets.
- Training High-Performance Low-Latency Spiking Neural Networks by Differentiation on Spike Representation (arXiv, 2022-05-01)
Spiking Neural Networks (SNNs) are promising energy-efficient AI models when implemented on neuromorphic hardware.
Efficiently training SNNs is challenging because spike generation is non-differentiable.
We propose the Differentiation on Spike Representation (DSR) method, which achieves high performance (a simplified sketch of the idea follows).
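The spike-representation idea can be sketched as follows: run the non-differentiable integrate-and-fire dynamics forward, but back-propagate through a differentiable analog proxy of the firing rate via a straight-through construction. This is a simplified stand-in under assumed LIF-with-soft-reset dynamics, not the paper's exact DSR estimator.

```python
# Straight-through sketch: forward uses the real spike rate, backward
# flows through a differentiable analog proxy of that rate.
import torch

def spikes_to_rate(x, T=8, threshold=1.0):
    # Forward: integrate-and-fire over T steps, count spikes -> rate in [0, 1].
    v, count = torch.zeros_like(x), torch.zeros_like(x)
    for _ in range(T):
        v = v + x
        s = (v >= threshold).float()
        v = v - s * threshold            # soft reset keeps residual charge
        count = count + s
    rate = count / T
    # Straight-through trick: forward value is the spike rate, but the
    # backward pass sees the clamped analog input instead.
    proxy = torch.clamp(x, 0.0, 1.0)
    return proxy + (rate - proxy).detach()

x = torch.tensor([0.3, 0.7], requires_grad=True)
r = spikes_to_rate(x)
r.sum().backward()
print(r, x.grad)   # gradient of the clamp proxy: 1 where 0 < x < 1
```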
- Can Deep Neural Networks be Converted to Ultra Low-Latency Spiking Neural Networks? (arXiv, 2021-12-22)
Spiking neural networks (SNNs) operate via binary spikes distributed over time.
SOTA training strategies for SNNs involve conversion from a non-spiking deep neural network (DNN).
We propose a new training algorithm that accurately captures the activation distributions, minimizing the error between the DNN and the converted SNN (see the calibration sketch below).
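Distribution-aware conversion is commonly grounded in calibrating each layer's firing threshold from the observed ReLU activation distribution; the percentile rule below is an assumed baseline for illustration, not the paper's algorithm.

```python
# Sketch of percentile-based threshold calibration for ANN-to-SNN conversion.
import torch
import torch.nn as nn

@torch.no_grad()
def calibrate_thresholds(ann, sample_batch, percentile=99.9):
    """Return one firing threshold per ReLU, taken from activation percentiles."""
    thresholds, hooks = [], []
    def grab(_, __, out):
        thresholds.append(torch.quantile(out.flatten(), percentile / 100.0))
    for m in ann.modules():
        if isinstance(m, nn.ReLU):
            hooks.append(m.register_forward_hook(grab))
    ann(sample_batch)                      # one calibration pass
    for h in hooks:
        h.remove()
    return thresholds

ann = nn.Sequential(nn.Linear(784, 256), nn.ReLU(), nn.Linear(256, 10), nn.ReLU())
ths = calibrate_thresholds(ann, torch.rand(64, 784))
print([round(t.item(), 3) for t in ths])   # per-layer firing thresholds
```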
- Spatial-Temporal-Fusion BNN: Variational Bayesian Feature Layer (arXiv, 2021-12-12)
We design a spatial-temporal-fusion BNN to efficiently scale Bayesian neural networks (BNNs) to large models.
Compared to vanilla BNNs, our approach greatly reduces training time and the number of parameters, which helps scale BNNs efficiently.
- Hybrid SNN-ANN: Energy-Efficient Classification and Object Detection for Event-Based Vision (arXiv, 2021-12-06)
Event-based vision sensors encode local pixel-wise brightness changes in streams of events rather than image frames.
Recent progress in object recognition from event-based sensors has come from converting deep neural networks.
We propose a hybrid architecture for end-to-end training of deep neural networks for event-based pattern recognition and object detection.
- Sub-bit Neural Networks: Learning to Compress and Accelerate Binary Neural Networks (arXiv, 2021-10-18)
Sub-bit Neural Networks (SNNs) are a new binary quantization design tailored to compress and accelerate BNNs.
They are trained with a kernel-aware optimization framework that exploits binary quantization in the fine-grained convolutional kernel space.
Experiments on visual recognition benchmarks and hardware deployment on FPGA validate the great potential of SNNs (a baseline binarization sketch follows).
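For context, the sketch below shows plain per-filter binary weight quantization with a straight-through estimator, i.e., the 1-bit baseline that sub-bit codebook schemes compress further; it is not the paper's kernel-aware optimization.

```python
# Baseline binary-weight convolution with a straight-through estimator.
import torch
import torch.nn as nn

class BinaryConv2d(nn.Conv2d):
    def forward(self, x):
        w = self.weight
        scale = w.abs().mean(dim=(1, 2, 3), keepdim=True)  # per-filter scale
        w_bin = scale * torch.sign(w)
        # Straight-through: binary weights forward, full-precision gradients.
        w_q = w + (w_bin - w).detach()
        return self._conv_forward(x, w_q, self.bias)

conv = BinaryConv2d(3, 8, kernel_size=3, padding=1)
y = conv(torch.rand(1, 3, 32, 32))
print(y.shape)   # torch.Size([1, 8, 32, 32])
```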
- Progressive Tandem Learning for Pattern Recognition with Deep Spiking Neural Networks (arXiv, 2020-07-02)
Spiking neural networks (SNNs) have shown advantages over traditional artificial neural networks (ANNs) for low latency and high computational efficiency.
We propose a novel ANN-to-SNN conversion and layer-wise learning framework for rapid and efficient pattern recognition.
- Training Deep Spiking Neural Networks (arXiv, 2020-06-08)
Brain-inspired spiking neural networks (SNNs) on neuromorphic hardware may offer orders-of-magnitude higher energy efficiency.
We show that it is possible to train an SNN with a ResNet50 architecture on the CIFAR100 and Imagenette object recognition datasets.
The trained SNN falls behind the analogous ANN in accuracy but requires several orders of magnitude fewer inference time steps.
- Enabling Deep Spiking Neural Networks with Hybrid Conversion and Spike Timing Dependent Backpropagation (arXiv, 2020-05-04)
Spiking Neural Networks (SNNs) operate with asynchronous discrete events (or spikes).
We present a computationally efficient training technique for deep SNNs.
We achieve a top-1 accuracy of 65.19% on the ImageNet dataset with an SNN using 250 time steps, which is 10X faster than converted SNNs with similar accuracy.
This list is automatically generated from the titles and abstracts of the papers on this site.