Adaptive Sparse Structure Development with Pruning and Regeneration for
Spiking Neural Networks
- URL: http://arxiv.org/abs/2211.12219v1
- Date: Tue, 22 Nov 2022 12:23:30 GMT
- Title: Adaptive Sparse Structure Development with Pruning and Regeneration for
Spiking Neural Networks
- Authors: Bing Han, Feifei Zhao, Yi Zeng, Wenxuan Pan
- Abstract summary: Spiking Neural Networks (SNNs) have the natural advantage of drawing on the sparse structural plasticity of brain development to alleviate the energy problems of deep neural networks.
This paper proposes a novel method for the adaptive structural development of SNNs, introducing dendritic spine plasticity-based synaptic constraint, neuronal pruning, and synaptic regeneration.
- Score: 6.760855795263126
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Spiking Neural Networks (SNNs) are more biologically plausible and
computationally efficient than conventional deep neural networks. SNNs
therefore have the natural advantage of drawing on the sparse structural
plasticity of brain development to alleviate the energy problems that deep
neural networks incur through their complex and fixed structures. However,
previous SNN compression works lack in-depth inspiration from the brain's
developmental plasticity mechanisms. This paper proposes a novel method for
the adaptive structural development of SNNs (SD-SNN), introducing dendritic
spine plasticity-based synaptic constraint, neuronal pruning, and synaptic
regeneration. We found that synaptic constraint and neuronal pruning can
detect and remove a large amount of redundancy in SNNs, while synaptic
regeneration can effectively prevent and repair over-pruning. Moreover,
inspired by the neurotrophic hypothesis, the neuronal pruning rate and
synaptic regeneration rate are adaptively adjusted during the
learning-while-pruning process, which eventually leads to structural
stability of the SNN. Experimental results on spatial (MNIST, CIFAR-10) and
temporal neuromorphic (N-MNIST, DVS-Gesture) datasets demonstrate that our
method can flexibly learn an appropriate compression rate for various tasks
and achieve superior performance while massively reducing network energy
consumption. Specifically, on the spatial MNIST dataset, SD-SNN achieves
99.51% accuracy at a pruning rate of 49.83%, a 0.05% accuracy improvement
over the uncompressed baseline. On the neuromorphic DVS-Gesture dataset, our
method achieves 98.20% accuracy, a 1.09% improvement, at a compression rate
of 55.50%.
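As a rough illustration of the learning-while-pruning loop described in the
abstract, here is a minimal Python sketch. It is not the authors'
implementation: using |weight| as a proxy for dendritic spine strength, the
quantile pruning threshold, and the geometric decay of the two rates are all
simplifying assumptions, and whole-neuron pruning is omitted.

```python
import numpy as np

rng = np.random.default_rng(0)

def prune_and_regenerate(weights, mask, prune_rate, regen_rate):
    """One structural-plasticity step on a weight matrix.

    Dendritic spine strength is approximated by |weight|; the weakest
    active synapses are pruned, and a fraction of pruned positions is
    regenerated with small random weights to repair over-pruning.
    (The paper additionally prunes whole neurons; that is omitted here.)
    """
    strength = np.abs(weights) * mask
    active = strength[mask.astype(bool)]
    if active.size:
        threshold = np.quantile(active, prune_rate)
        mask = mask * (strength > threshold)
    pruned = np.argwhere(mask == 0)
    n_regen = int(regen_rate * len(pruned))
    if n_regen:
        for i, j in pruned[rng.choice(len(pruned), n_regen, replace=False)]:
            mask[i, j] = 1.0
            weights[i, j] = rng.normal(scale=0.01)
    return weights, mask

# Learning-while-pruning loop with decaying rates, loosely mimicking the
# neurotrophic-hypothesis-inspired adaptive adjustment described above.
W = rng.normal(size=(128, 64))
M = np.ones_like(W)
prune_rate, regen_rate = 0.20, 0.10
for epoch in range(10):
    # ... one epoch of SNN training on the masked weights (W * M) goes here ...
    W, M = prune_and_regenerate(W, M, prune_rate, regen_rate)
    prune_rate *= 0.9  # assumed decay; the paper adapts these rates online
    regen_rate *= 0.9
print(f"final connection density: {M.mean():.2f}")
```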
Related papers
- Fractional-order spike-timing-dependent gradient descent for multi-layer spiking neural networks [18.142378139047977]
This paper proposes a fractional-order spike-timing-dependent gradient descent (FOSTDGD) learning model.
It is tested on the MNIST and DVS128 Gesture datasets, and its accuracy under different network structures and fractional orders is analyzed (a generic fractional-order update is sketched after this list).
arXiv Detail & Related papers (2024-10-20T05:31:34Z)
- Scalable Mechanistic Neural Networks [52.28945097811129]
We propose an enhanced neural network framework designed for scientific machine learning applications involving long temporal sequences.
By reformulating the original Mechanistic Neural Network (MNN), we reduce the computational time and space complexities from cubic and quadratic in the sequence length, respectively, to linear.
Extensive experiments demonstrate that the resulting Scalable MNN (S-MNN) matches the original MNN in precision while substantially reducing computational resources.
arXiv Detail & Related papers (2024-10-08T14:27:28Z)
- Shrinking Your TimeStep: Towards Low-Latency Neuromorphic Object Recognition with Spiking Neural Network [5.174808367448261]
Neuromorphic object recognition with spiking neural networks (SNNs) is the cornerstone of low-power neuromorphic computing.
Existing SNNs suffer from significant latency, requiring 10 to 40 timesteps or more to recognize neuromorphic objects.
In this work, we propose the Shrinking SNN (SSNN) to achieve low-latency neuromorphic object recognition without reducing performance.
arXiv Detail & Related papers (2024-01-02T02:05:05Z)
- A Hybrid Neural Coding Approach for Pattern Recognition with Spiking Neural Networks [53.31941519245432]
Brain-inspired spiking neural networks (SNNs) have demonstrated promising capabilities in solving pattern recognition tasks.
These SNNs are grounded in homogeneous neurons that use a uniform neural coding for information representation.
In this study, we argue that SNN architectures should be holistically designed to incorporate heterogeneous coding schemes.
arXiv Detail & Related papers (2023-05-26T02:52:12Z)
- PC-SNN: Supervised Learning with Local Hebbian Synaptic Plasticity based on Predictive Coding in Spiking Neural Networks [1.6172800007896282]
We propose a novel learning algorithm inspired by predictive coding theory.
We show that it can perform supervised learning fully autonomously, and as successfully as backpropagation.
This method achieves favorable performance compared to state-of-the-art multi-layer SNNs.
arXiv Detail & Related papers (2022-11-24T09:56:02Z)
- On the Intrinsic Structures of Spiking Neural Networks [66.57589494713515]
Recent years have seen a surge of interest in SNNs owing to their remarkable potential to handle time-dependent and event-driven data.
However, there has been a dearth of comprehensive studies examining the impact of intrinsic structures on spiking computations.
This work delves into the intrinsic structures of SNNs, elucidating their influence on the expressivity of SNNs.
arXiv Detail & Related papers (2022-06-21T09:42:30Z)
- Training High-Performance Low-Latency Spiking Neural Networks by Differentiation on Spike Representation [70.75043144299168]
Spiking Neural Networks (SNNs) are promising energy-efficient AI models when implemented on neuromorphic hardware.
However, SNNs are challenging to train efficiently due to their non-differentiability.
We propose the Differentiation on Spike Representation (DSR) method, which achieves high performance.
arXiv Detail & Related papers (2022-05-01T12:44:49Z)
- BackEISNN: A Deep Spiking Neural Network with Adaptive Self-Feedback and Balanced Excitatory-Inhibitory Neurons [8.956708722109415]
Spiking neural networks (SNNs) transmit information through discrete spikes, which makes them well suited to processing spatio-temporal information.
We propose a deep spiking neural network with adaptive self-feedback and balanced excitatory and inhibitory neurons (BackEISNN).
On the MNIST, FashionMNIST, and N-MNIST datasets, our model achieves state-of-the-art performance.
arXiv Detail & Related papers (2021-05-27T08:38:31Z)
- Pruning of Deep Spiking Neural Networks through Gradient Rewiring [41.64961999525415]
Spiking Neural Networks (SNNs) have attracted great attention due to their biological plausibility and high energy efficiency on neuromorphic chips.
Most existing methods directly apply pruning approaches from artificial neural networks (ANNs) to SNNs, ignoring the differences between ANNs and SNNs.
We propose gradient rewiring (Grad R), a joint learning algorithm of connectivity and weight for SNNs, which enables seamless optimization of network structure without retraining (a simplified reparameterization is sketched after this list).
arXiv Detail & Related papers (2021-05-11T10:05:53Z)
- Progressive Tandem Learning for Pattern Recognition with Deep Spiking Neural Networks [80.15411508088522]
Spiking neural networks (SNNs) have shown advantages over traditional artificial neural networks (ANNs) in terms of low latency and high computational efficiency.
We propose a novel ANN-to-SNN conversion and layer-wise learning framework for rapid and efficient pattern recognition.
arXiv Detail & Related papers (2020-07-02T15:38:44Z)
- Rectified Linear Postsynaptic Potential Function for Backpropagation in Deep Spiking Neural Networks [55.0627904986664]
Spiking Neural Networks (SNNs) use temporal spike patterns to represent and transmit information, which is not only biologically realistic but also suitable for ultra-low-power, event-driven neuromorphic implementation.
This paper investigates the contribution of spike timing dynamics to information encoding, synaptic plasticity, and decision making, providing a new perspective on the design of future deep SNNs and neuromorphic hardware systems.
arXiv Detail & Related papers (2020-03-26T11:13:07Z)
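The fractional-order gradient descent named in the FOSTDGD entry above
replaces the ordinary gradient step with one driven by a fractional
derivative. Below is a minimal sketch of a Caputo-style update on a scalar
function; the |w - w_prev|^(1 - alpha) / Gamma(2 - alpha) scaling is an
assumed first-order truncation from the fractional-gradient literature, not
the exact FOSTDGD rule, and the spike-timing-dependent coupling is not
reproduced.

```python
from math import gamma

def fractional_step(w, w_prev, grad, lr=0.1, alpha=0.9):
    """One Caputo-style fractional-order gradient step, order 0 < alpha < 1.

    The ordinary gradient is scaled by |w - w_prev|^(1 - alpha) divided by
    Gamma(2 - alpha); as alpha -> 1 this recovers plain gradient descent.
    """
    scale = abs(w - w_prev) ** (1.0 - alpha) / gamma(2.0 - alpha)
    return w - lr * grad * scale

# Minimise f(w) = (w - 3)^2 starting from w = 0.5.
w_prev, w = 0.0, 0.5
for _ in range(200):
    grad = 2.0 * (w - 3.0)  # df/dw
    w_prev, w = w, fractional_step(w, w_prev, grad)
print(round(w, 3))  # converges toward 3.0
```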
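And as referenced in the Grad R entry, connectivity can be learned jointly
with weights by reparameterising each synapse. The sign-preserving
w = sign * relu(theta) form below is a simplified assumption for
illustration; the paper's exact formulation and gradient treatment may
differ.

```python
import numpy as np

rng = np.random.default_rng(1)

# Each synapse gets a fixed random sign; the learned parameter theta carries
# both its magnitude and its alive/dormant state.
sign = np.where(rng.random((64, 32)) < 0.5, -1.0, 1.0)
theta = rng.normal(scale=0.1, size=(64, 32))

def effective_weight(theta, sign):
    # A connection is active when theta > 0 and dormant (weight exactly 0)
    # otherwise, so pruning needs no separate masking step.
    return sign * np.maximum(theta, 0.0)

def theta_gradient(grad_w, sign):
    # Chain rule through w = sign * relu(theta). A straight-through pass
    # (assumed here) keeps gradient flowing to dormant synapses, letting
    # pruned connections regrow during ordinary training.
    return grad_w * sign

# One illustrative update with a made-up upstream gradient dL/dW.
grad_w = rng.normal(scale=0.05, size=theta.shape)
theta -= 0.5 * theta_gradient(grad_w, sign)
print(f"active connections: {(theta > 0).mean():.2f}")
```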
This list is automatically generated from the titles and abstracts of the papers on this site.