Pruning of Deep Spiking Neural Networks through Gradient Rewiring
- URL: http://arxiv.org/abs/2105.04916v1
- Date: Tue, 11 May 2021 10:05:53 GMT
- Title: Pruning of Deep Spiking Neural Networks through Gradient Rewiring
- Authors: Yanqi Chen, Zhaofei Yu, Wei Fang, Tiejun Huang and Yonghong Tian
- Abstract summary: Spiking Neural Networks (SNNs) have attracted great attention due to their biological plausibility and high energy efficiency on neuromorphic chips.
Most existing methods directly apply pruning approaches designed for artificial neural networks (ANNs) to SNNs, ignoring the differences between ANNs and SNNs.
We propose gradient rewiring (Grad R), a joint learning algorithm of connectivity and weight for SNNs, which enables us to seamlessly optimize the network structure without retraining.
- Score: 41.64961999525415
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Spiking Neural Networks (SNNs) have attracted great attention due to
their biological plausibility and high energy efficiency on neuromorphic chips.
As these chips are usually resource-constrained, compressing SNNs is crucial
for their practical use. Most existing methods directly apply pruning
approaches designed for artificial neural networks (ANNs) to SNNs, ignoring
the differences between ANNs and SNNs and thus limiting the performance of the
pruned SNNs. Besides, these methods are only suitable for shallow SNNs. In
this paper, inspired by synaptogenesis and synapse elimination in the nervous
system, we propose gradient rewiring (Grad R), a joint learning algorithm of
connectivity and weight for SNNs, which enables us to seamlessly optimize the
network structure without retraining. Our key innovation is to redefine the
gradient with respect to a new synaptic parameter, allowing better exploration
of network structures by taking full advantage of the competition between
pruning and regrowth of connections. The experimental results show that the
proposed method achieves the lowest performance loss reported so far for SNNs
on the MNIST and CIFAR-10 datasets. Moreover, it incurs only a $\sim$3.5%
accuracy loss at an unprecedented 0.73% connectivity, revealing a remarkable
structure-refining capability in SNNs. Our work suggests that there is
extremely high redundancy in deep SNNs. Our code is available at
\url{https://github.com/Yanqi-Chen/Gradient-Rewiring}.
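To make the abstract's key idea concrete, the following is a minimal PyTorch sketch of gradient rewiring: each weight is reparameterized through a hidden synaptic parameter, and the gradient is redefined so that pruned connections keep receiving updates and can regrow. The class name, the sign/theta reparameterization, and the straight-through trick are illustrative assumptions; the authors' actual implementation lives in the repository linked above.

```python
import torch
import torch.nn.functional as F

class GradRewireLinear(torch.nn.Module):
    """Minimal sketch of a gradient-rewiring linear layer (not the
    authors' code). Effective weight: w = sign * relu(theta), so
    theta <= 0 means the synapse is pruned. A straight-through trick
    lets gradients reach theta even while a synapse is pruned, so
    pruning and regrowth compete throughout training."""

    def __init__(self, in_features: int, out_features: int):
        super().__init__()
        w = torch.empty(out_features, in_features)
        torch.nn.init.kaiming_uniform_(w, a=5 ** 0.5)
        # Fixed synapse signs; theta is the learned hidden parameter.
        self.register_buffer(
            "sign", torch.where(w >= 0, torch.ones_like(w), -torch.ones_like(w))
        )
        self.theta = torch.nn.Parameter(w.abs())

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        w_active = self.sign * F.relu(self.theta)    # pruned weights are 0
        w_dense = self.sign * self.theta             # carries the gradient
        w = w_dense + (w_active - w_dense).detach()  # forward equals w_active
        return F.linear(x, w)

    @torch.no_grad()
    def connectivity(self) -> float:
        """Fraction of synapses currently alive."""
        return (self.theta > 0).float().mean().item()
```

Under this sketch a pruned synapse still receives the redefined gradient dL/dtheta = sign * dL/dw, so a connection the loss favors simply climbs back above zero and regrows; driving connectivity down to levels like 0.73% additionally requires a sparsity-inducing term on the synaptic parameter, for which see the repository.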
Related papers
- LC-TTFS: Towards Lossless Network Conversion for Spiking Neural Networks
with TTFS Coding [55.64533786293656]
We show that our algorithm can achieve a near-perfect mapping between the activation values of an ANN and the spike times of an SNN on a number of challenging AI tasks.
The study paves the way for deploying ultra-low-power TTFS-based SNNs on power-constrained edge computing platforms.
arXiv Detail & Related papers (2023-10-23T14:26:16Z)
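The LC-TTFS entry above rests on mapping ANN activation values to SNN spike times. The function below is a generic time-to-first-spike encoder, not the paper's algorithm: larger activations fire earlier and silent units never fire. The window length `t_max` is an assumed parameter.

```python
import torch

def ttfs_encode(a: torch.Tensor, t_max: float = 100.0) -> torch.Tensor:
    """Generic time-to-first-spike encoding (illustrative only):
    larger activations map to earlier spike times within a window
    of length t_max; zero activations never fire."""
    a = a.clamp(min=0.0)
    a_max = a.max()
    if a_max == 0:
        return torch.full_like(a, float("inf"))
    t = t_max * (1.0 - a / a_max)  # a_max -> t = 0, small a -> late spike
    t[a == 0] = float("inf")       # silent neurons never spike
    return t
```

A lossless conversion in the paper's sense would additionally constrain the network so that downstream spiking dynamics reproduce the ANN's activations exactly; the sketch only shows the encoding direction.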
- High-performance deep spiking neural networks with 0.3 spikes per neuron [9.01407445068455]
It is harder to train biologically inspired spiking neural networks (SNNs) than artificial neural networks (ANNs).
We show that trained deep SNN models achieve the same performance as ANNs.
Our SNN accomplishes high-performance classification with less than 0.3 spikes per neuron, lending itself to an energy-efficient implementation.
arXiv Detail & Related papers (2023-06-14T21:01:35Z)
- ESL-SNNs: An Evolutionary Structure Learning Strategy for Spiking Neural Networks [20.33499499020257]
Spiking neural networks (SNNs) have remarkable advantages in power consumption and event-driven operation during inference.
We propose an efficient evolutionary structure learning framework for SNNs, named ESL-SNNs, to implement sparse SNN training from scratch.
Our work presents a brand-new approach for sparse training of SNNs from scratch with biologically plausible evolutionary mechanisms.
arXiv Detail & Related papers (2023-06-06T14:06:11Z)
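ESL-SNNs evolves connectivity while training, so the sparse structure is learned from scratch rather than carved out of a dense model. The step below is a generic prune-and-regrow sketch in that spirit (magnitude pruning plus random regrowth at constant sparsity); the fraction, criteria, and schedule are assumptions, not the paper's exact rules.

```python
import torch

@torch.no_grad()
def evolve_mask(weight: torch.Tensor, mask: torch.Tensor, frac: float = 0.1):
    """One connectivity-evolution step at constant sparsity (generic
    sketch). `mask` is a 0/1 float tensor with the shape of `weight`."""
    flat_w, flat_m = weight.view(-1), mask.view(-1)
    active = flat_m.nonzero(as_tuple=True)[0]
    inactive = (flat_m == 0).nonzero(as_tuple=True)[0]
    n = min(max(1, int(frac * active.numel())), inactive.numel())

    # Prune: deactivate the weakest active synapses by magnitude.
    weakest = flat_w[active].abs().topk(n, largest=False).indices
    flat_m[active[weakest]] = 0.0

    # Regrow: activate the same number of previously inactive synapses.
    grow = inactive[torch.randperm(inactive.numel())[:n]]
    flat_m[grow] = 1.0
    flat_w[grow] = 0.0  # fresh synapses start from zero
    return mask
```

In a training loop the forward pass would use `weight * mask`, with `evolve_mask` called every few hundred iterations, so the structure is optimized jointly with the weights.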
- A Hybrid Neural Coding Approach for Pattern Recognition with Spiking Neural Networks [53.31941519245432]
Brain-inspired spiking neural networks (SNNs) have demonstrated promising capabilities in solving pattern recognition tasks.
These SNNs are built on homogeneous neurons that use a uniform neural coding for information representation.
In this study, we argue that SNN architectures should be holistically designed to incorporate heterogeneous coding schemes.
arXiv Detail & Related papers (2023-05-26T02:52:12Z)
- Training High-Performance Low-Latency Spiking Neural Networks by Differentiation on Spike Representation [70.75043144299168]
The Spiking Neural Network (SNN) is a promising energy-efficient AI model when implemented on neuromorphic hardware.
It is a challenge to efficiently train SNNs due to their non-differentiability.
We propose the Differentiation on Spike Representation (DSR) method, which achieves high performance.
arXiv Detail & Related papers (2022-05-01T12:44:49Z)
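DSR addresses the non-differentiability mentioned above by differentiating a spike representation rather than individual spike events; reproducing its details here would be speculative. As a generic illustration of the obstacle itself, the snippet below shows the widely used surrogate-gradient workaround, which is a different technique: a hard threshold on the forward pass and a smooth pseudo-derivative on the backward pass.

```python
import torch

class SurrogateSpike(torch.autograd.Function):
    """Heaviside spike with a surrogate derivative (generic technique,
    not the DSR method). Forward: spike = 1 if v >= threshold.
    Backward: rectangular pseudo-derivative around the threshold."""

    @staticmethod
    def forward(ctx, v: torch.Tensor, threshold: float = 1.0):
        ctx.save_for_backward(v)
        ctx.threshold = threshold
        return (v >= threshold).float()

    @staticmethod
    def backward(ctx, grad_out: torch.Tensor):
        (v,) = ctx.saved_tensors
        # Pass gradient only in a window of width 1 around the threshold.
        window = ((v - ctx.threshold).abs() < 0.5).float()
        return grad_out * window, None
```

Calling `SurrogateSpike.apply(v)` inside the neuron update makes the whole network trainable end to end with standard backpropagation.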
- Beyond Classification: Directly Training Spiking Neural Networks for Semantic Segmentation [5.800785186389827]
Spiking Neural Networks (SNNs) have emerged as a low-power alternative to Artificial Neural Networks (ANNs).
In this paper, we explore SNN applications beyond classification and present semantic segmentation networks configured with spiking neurons.
arXiv Detail & Related papers (2021-10-14T21:53:03Z)
- Explore the Knowledge contained in Network Weights to Obtain Sparse Neural Networks [2.649890751459017]
This paper proposes a novel learning approach to obtain sparse fully connected layers in neural networks (NNs) automatically.
We design a switcher neural network (SNN) to optimize the structure of the task neural network (TNN).
arXiv Detail & Related papers (2021-03-26T11:29:40Z)
- Optimal Conversion of Conventional Artificial Neural Networks to Spiking Neural Networks [0.0]
Spiking neural networks (SNNs) are biologically inspired artificial neural networks (ANNs).
We propose a novel strategic pipeline that transfers the weights to the target SNN by combining threshold balancing and soft-reset mechanisms.
Our method is promising for deployment on embedded platforms, enabling better support of SNNs under limited energy and memory.
arXiv Detail & Related papers (2021-02-28T12:04:22Z)
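Threshold balancing and soft reset are the two mechanisms this conversion entry names. Below is a minimal integrate-and-fire simulation with soft reset, where a spike subtracts the threshold instead of zeroing the membrane so residual charge is preserved; this is generic conversion machinery, and the threshold-setting heuristic mentioned afterwards is an assumption rather than the paper's exact procedure.

```python
import torch

def if_soft_reset(currents: torch.Tensor, threshold: float) -> torch.Tensor:
    """Integrate-and-fire neurons with soft reset (illustrative).
    currents: [T, N] input current per time step and neuron.
    Returns the [T, N] binary spike train."""
    T, N = currents.shape
    v = torch.zeros(N)
    spikes = torch.zeros(T, N)
    for t in range(T):
        v = v + currents[t]
        fired = (v >= threshold).float()
        spikes[t] = fired
        v = v - fired * threshold  # soft reset keeps the residual charge
    return spikes
```

Threshold balancing would then set `threshold` per layer from the maximum (or a high percentile) of pre-activations observed on training data, keeping firing rates in a range where they closely track the ANN's activations.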
- Progressive Tandem Learning for Pattern Recognition with Deep Spiking Neural Networks [80.15411508088522]
Spiking neural networks (SNNs) have shown advantages over traditional artificial neural networks (ANNs) in terms of low latency and high computational efficiency.
We propose a novel ANN-to-SNN conversion and layer-wise learning framework for rapid and efficient pattern recognition.
arXiv Detail & Related papers (2020-07-02T15:38:44Z)
- You Only Spike Once: Improving Energy-Efficient Neuromorphic Inference to ANN-Level Accuracy [51.861168222799186]
Spiking Neural Networks (SNNs) are a type of neuromorphic, or brain-inspired, network.
SNNs are sparse, accessing very few weights, and typically only use addition operations instead of the more power-intensive multiply-and-accumulate operations.
In this work, we aim to overcome the limitations of TTFS-encoded neuromorphic systems.
arXiv Detail & Related papers (2020-06-03T15:55:53Z)
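The energy argument in the last entry follows directly from binary spikes: multiplying a weight by a 0/1 spike degenerates into conditionally adding the weight, so no multiply-and-accumulate units are needed. A small illustrative sketch, not the paper's system:

```python
import torch

def spike_matmul(weights: torch.Tensor, spikes: torch.Tensor) -> torch.Tensor:
    """Event-driven alternative to a dense matrix-vector product
    (illustrative). Because spikes are binary, weights @ spikes
    needs no multiplications: each input spike just adds one
    weight column to the output."""
    out = torch.zeros(weights.shape[0])
    for j in spikes.nonzero(as_tuple=True)[0]:  # iterate active inputs only
        out += weights[:, j]                    # addition, no multiply
    return out
```

On neuromorphic hardware the equivalent of this loop runs only for neurons that actually fire, which is where the sparsity and addition-only savings come from.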
This list is automatically generated from the titles and abstracts of the papers on this site.