Constructing Deep Spiking Neural Networks from Artificial Neural
Networks with Knowledge Distillation
- URL: http://arxiv.org/abs/2304.05627v2
- Date: Mon, 17 Apr 2023 03:07:53 GMT
- Title: Constructing Deep Spiking Neural Networks from Artificial Neural
Networks with Knowledge Distillation
- Authors: Qi Xu, Yaxin Li, Jiangrong Shen, Jian K Liu, Huajin Tang, Gang Pan
- Abstract summary: Spiking neural networks (SNNs) are well known as brain-inspired models with high computing efficiency.
We propose a novel method of constructing deep SNN models with knowledge distillation (KD).
- Score: 20.487853773309563
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Spiking neural networks (SNNs) are well known as brain-inspired models
with high computing efficiency; their key property is that they use spikes as
information units, much as biological neural systems do. Although spiking-based
models are energy efficient because they operate on discrete spike signals,
their performance is limited by current network structures and training methods.
Because spikes are discrete, typical SNNs cannot apply gradient descent directly
to parameter adjustment the way artificial neural networks (ANNs) do. To address
this limitation, we propose a novel method for constructing deep SNN models with
knowledge distillation (KD), using an ANN as the teacher model and an SNN as the
student model. Through an ANN-SNN joint training algorithm, the student SNN
learns rich feature information from the teacher ANN via KD, while avoiding
training the SNN from scratch through non-differentiable spikes. Our method not
only builds more efficient deep spiking structures feasibly and reasonably, but
also needs only a few time steps to train the whole model, compared with direct
training or ANN-to-SNN conversion methods. More importantly, it exhibits strong
noise immunity against various types of artificial noise and natural signals.
The proposed method offers an efficient way to improve SNN performance by
constructing deeper structures in a high-throughput fashion, with potential use
in lightweight, efficient brain-inspired computing for practical scenarios.
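
The abstract's central obstacle, that discrete spikes block direct gradient descent, is commonly worked around with a surrogate gradient: a hard threshold in the forward pass and a smooth stand-in derivative in the backward pass. Below is a minimal PyTorch sketch of that idea; the fast-sigmoid surrogate and its sharpness constant are illustrative assumptions, not the neuron model this paper specifies.

```python
# Hedged sketch of a surrogate-gradient spike function. The forward pass is a
# hard Heaviside threshold; the backward pass substitutes a smooth derivative
# (a fast sigmoid here) so gradients can flow through the spikes.
import torch

class SurrogateSpike(torch.autograd.Function):
    @staticmethod
    def forward(ctx, v):
        # Emit a binary spike when the membrane potential crosses threshold 0.
        ctx.save_for_backward(v)
        return (v > 0.0).float()

    @staticmethod
    def backward(ctx, grad_output):
        # Pretend the step function was a fast sigmoid; the factor 10.0 is an
        # assumed sharpness constant, not a value taken from the paper.
        (v,) = ctx.saved_tensors
        return grad_output / (1.0 + 10.0 * v.abs()) ** 2

spike = SurrogateSpike.apply  # spike(v) can stand in for an ANN activation
```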
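Given such a differentiable spike, the ANN-teacher/SNN-student joint training the abstract describes can be sketched as a standard distillation step. Everything below — the Bernoulli rate encoding, the temperature, the loss weighting alpha, and the assumption that student_snn is stateless per forward call (or resets its membrane state internally) — is a hypothetical illustration of the KD recipe, not the authors' released algorithm.

```python
# Hedged sketch of one ANN->SNN knowledge-distillation training step.
import torch
import torch.nn.functional as F

def kd_train_step(teacher_ann, student_snn, optimizer, x, labels,
                  time_steps=4, temperature=4.0, alpha=0.5):
    with torch.no_grad():                  # teacher only provides soft targets
        teacher_logits = teacher_ann(x)

    acc = torch.zeros_like(teacher_logits)
    for _ in range(time_steps):            # few time steps, per the abstract
        # Rate-encode inputs in [0, 1] as Bernoulli spike trains.
        spikes_in = (torch.rand_like(x) < x).float()
        acc = acc + student_snn(spikes_in) # accumulate output spikes/logits
    student_logits = acc / time_steps      # firing rates as class scores

    # Hinton-style distillation: softened teacher distribution + hard labels.
    kd = F.kl_div(F.log_softmax(student_logits / temperature, dim=1),
                  F.softmax(teacher_logits / temperature, dim=1),
                  reduction="batchmean")
    ce = F.cross_entropy(student_logits, labels)
    loss = alpha * temperature ** 2 * kd + (1.0 - alpha) * ce

    optimizer.zero_grad()
    loss.backward()                        # flows through surrogate gradients
    optimizer.step()
    return loss.item()
```

Accumulating output spikes over only a handful of time steps is what lets the student train within the short temporal windows the abstract emphasizes.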
Related papers
- BKDSNN: Enhancing the Performance of Learning-based Spiking Neural Networks Training with Blurred Knowledge Distillation [20.34272550256856]
Spiking neural networks (SNNs) mimic the biological neural system to convey information via discrete spikes.
Our work achieves state-of-the-art performance for training SNNs on both static and neuromorphic datasets.
arXiv Detail & Related papers (2024-07-12T08:17:24Z)
- Direct Training High-Performance Deep Spiking Neural Networks: A Review of Theories and Methods [33.377770671553336]
Spiking neural networks (SNNs) offer a promising energy-efficient alternative to artificial neural networks (ANNs).
In this paper, we provide a new perspective to summarize the theories and methods for training deep SNNs with high performance.
arXiv Detail & Related papers (2024-05-06T09:58:54Z)
- Fully Spiking Denoising Diffusion Implicit Models [61.32076130121347]
Spiking neural networks (SNNs) have garnered considerable attention owing to their ability to run at very high speed on neuromorphic devices.
We propose a novel approach, the fully spiking denoising diffusion implicit model (FSDDIM), to construct a diffusion model within SNNs.
We demonstrate that the proposed method outperforms the state-of-the-art fully spiking generative model.
arXiv Detail & Related papers (2023-12-04T09:07:09Z)
- ESL-SNNs: An Evolutionary Structure Learning Strategy for Spiking Neural Networks [20.33499499020257]
Spiking neural networks (SNNs) have demonstrated remarkable advantages in power consumption and event-driven processing during inference.
We propose an efficient evolutionary structure learning framework for SNNs, named ESL-SNNs, to implement the sparse SNN training from scratch.
Our work presents a brand-new approach for sparse training of SNNs from scratch with biologically plausible evolutionary mechanisms.
arXiv Detail & Related papers (2023-06-06T14:06:11Z)
- A Hybrid Neural Coding Approach for Pattern Recognition with Spiking Neural Networks [53.31941519245432]
Brain-inspired spiking neural networks (SNNs) have demonstrated promising capabilities in solving pattern recognition tasks.
These SNNs are built on homogeneous neurons that use a uniform neural coding for information representation.
In this study, we argue that SNN architectures should be holistically designed to incorporate heterogeneous coding schemes.
arXiv Detail & Related papers (2023-05-26T02:52:12Z)
- Skip Connections in Spiking Neural Networks: An Analysis of Their Effect on Network Training [0.8602553195689513]
Spiking neural networks (SNNs) have gained attention as a promising alternative to traditional artificial neural networks (ANNs).
In this paper, we study the impact of skip connections on SNNs and propose a hyperparameter optimization technique that adapts models from ANNs to SNNs.
We demonstrate that optimizing the position, type, and number of skip connections can significantly improve the accuracy and efficiency of SNNs.
arXiv Detail & Related papers (2023-03-23T07:57:32Z)
- Training High-Performance Low-Latency Spiking Neural Networks by Differentiation on Spike Representation [70.75043144299168]
Spiking Neural Network (SNN) is a promising energy-efficient AI model when implemented on neuromorphic hardware.
It is a challenge to efficiently train SNNs due to their non-differentiability.
We propose the Differentiation on Spike Representation (DSR) method, which can achieve high performance.
arXiv Detail & Related papers (2022-05-01T12:44:49Z)
- Deep Time Delay Neural Network for Speech Enhancement with Full Data Learning [60.20150317299749]
This paper proposes a deep time delay neural network (TDNN) for speech enhancement with full data learning.
To make full use of the training data, we propose a full data learning method for speech enhancement.
arXiv Detail & Related papers (2020-11-11T06:32:37Z)
- Progressive Tandem Learning for Pattern Recognition with Deep Spiking Neural Networks [80.15411508088522]
Spiking neural networks (SNNs) have shown advantages over traditional artificial neural networks (ANNs) in latency and computational efficiency.
We propose a novel ANN-to-SNN conversion and layer-wise learning framework for rapid and efficient pattern recognition.
arXiv Detail & Related papers (2020-07-02T15:38:44Z)
- Distilling Spikes: Knowledge Distillation in Spiking Neural Networks [22.331135708302586]
Spiking Neural Networks (SNNs) are energy-efficient computing architectures that exchange spikes to process information.
We propose techniques for knowledge distillation in spiking neural networks for the task of image classification.
Our approach is expected to open up new avenues for deploying high-performing large SNN models on resource-constrained hardware platforms.
arXiv Detail & Related papers (2020-05-01T09:36:32Z)
- Rectified Linear Postsynaptic Potential Function for Backpropagation in Deep Spiking Neural Networks [55.0627904986664]
Spiking Neural Networks (SNNs) use temporal spike patterns to represent and transmit information, which is not only biologically realistic but also well suited to ultra-low-power, event-driven neuromorphic implementation.
This paper investigates the contribution of spike timing dynamics to information encoding, synaptic plasticity, and decision making, providing a new perspective on the design of future deep SNNs and neuromorphic hardware systems.
arXiv Detail & Related papers (2020-03-26T11:13:07Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the information shown (including all listed details) and is not responsible for any consequences of its use.