ESL-SNNs: An Evolutionary Structure Learning Strategy for Spiking Neural
Networks
- URL: http://arxiv.org/abs/2306.03693v1
- Date: Tue, 6 Jun 2023 14:06:11 GMT
- Title: ESL-SNNs: An Evolutionary Structure Learning Strategy for Spiking Neural
Networks
- Authors: Jiangrong Shen, Qi Xu, Jian K. Liu, Yueming Wang, Gang Pan, Huajin
Tang
- Abstract summary: Spiking neural networks (SNNs) have demonstrated remarkable advantages in power consumption and event-driven operation during inference.
We propose an efficient evolutionary structure learning framework for SNNs, named ESL-SNNs, to implement sparse SNN training from scratch.
Our work presents a brand-new approach for sparse training of SNNs from scratch with biologically plausible evolutionary mechanisms.
- Score: 20.33499499020257
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Spiking neural networks (SNNs) have demonstrated remarkable advantages in power
consumption and event-driven operation during inference. To take
full advantage of low power consumption and improve the efficiency of these
models further, pruning methods have been explored to find sparse SNNs
without redundant connections after training. However, parameter redundancy
still hinders the efficiency of SNNs during training. In the human brain, the
rewiring process of neural networks is highly dynamic, while synaptic
connections remain relatively sparse during brain development. Inspired by
this, here we propose an efficient evolutionary structure learning (ESL)
framework for SNNs, named ESL-SNNs, to implement sparse SNN training from
scratch. The pruning and regeneration of synaptic connections in SNNs evolve
dynamically during learning, yet keep the structural sparsity at a certain
level. As a result, the ESL-SNNs can search for optimal sparse connectivity by
exploring all possible parameters across time. Our experiments show that the
proposed ESL-SNNs framework is able to learn SNNs with sparse structures
effectively with only a limited loss in accuracy. The ESL-SNNs incur merely
0.28% accuracy loss with 10% connection density on the DVS-Cifar10 dataset. Our
work presents a brand-new approach for sparse training of SNNs from scratch
with biologically plausible evolutionary mechanisms, closing the gap in the
expressibility between sparse training and dense training. Hence, it has great
potential for SNN lightweight training and inference with low power consumption
and small memory usage.
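The abstract describes prune-and-regenerate dynamics on a fixed sparsity budget but does not spell out the concrete criteria. Below is a minimal PyTorch sketch of one such rewiring step, assuming SET-style magnitude pruning and random regrowth; the function name `evolve_mask` and its parameters are illustrative, not the authors' API.

```python
import torch

def evolve_mask(weight: torch.Tensor, mask: torch.Tensor,
                density: float = 0.10, regrow_frac: float = 0.2) -> torch.Tensor:
    """One prune-and-regenerate step on a binary connection mask (SET-style sketch)."""
    n_active = int(density * mask.numel())    # connections kept at the fixed sparsity level
    n_evolve = int(regrow_frac * n_active)    # connections rewired in this step

    # Prune: deactivate the weakest currently active connections.
    scores = weight.detach().abs().masked_fill(mask == 0, float("inf"))
    drop = torch.topk(scores.flatten(), n_evolve, largest=False).indices
    mask.view(-1)[drop] = 0.0

    # Regenerate: activate the same number of currently inactive connections at random.
    inactive = (mask.view(-1) == 0).nonzero(as_tuple=True)[0]
    grow = inactive[torch.randperm(inactive.numel())[:n_evolve]]
    mask.view(-1)[grow] = 1.0
    with torch.no_grad():
        weight.view(-1)[grow] = 0.0           # regrown synapses start from zero weight

    return mask
```

In such a setup, a layer's effective weight is `weight * mask` in the forward pass, and the step is invoked every few epochs so connectivity evolves while the overall sparsity level stays fixed.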
Related papers
- High-performance deep spiking neural networks with 0.3 spikes per neuron [9.01407445068455]
It is harder to train biologically-inspired spiking neural networks (SNNs) than artificial neural networks (ANNs).
We show that training deep SNN models achieves the exact same performance as that of ANNs.
Our SNN accomplishes high-performance classification with less than 0.3 spikes per neuron, lending itself to an energy-efficient implementation.
arXiv Detail & Related papers (2023-06-14T21:01:35Z) - A Hybrid Neural Coding Approach for Pattern Recognition with Spiking
Neural Networks [53.31941519245432]
Brain-inspired spiking neural networks (SNNs) have demonstrated promising capabilities in solving pattern recognition tasks.
These SNNs are built on homogeneous neurons that utilize a uniform neural coding for information representation.
In this study, we argue that SNN architectures should be holistically designed to incorporate heterogeneous coding schemes.
arXiv Detail & Related papers (2023-05-26T02:52:12Z) - SPIDE: A Purely Spike-based Method for Training Feedback Spiking Neural
Networks [56.35403810762512]
Spiking neural networks (SNNs) with event-based computation are promising brain-inspired models for energy-efficient applications on neuromorphic hardware.
We study spike-based implicit differentiation on the equilibrium state (SPIDE) that extends the recently proposed training method.
arXiv Detail & Related papers (2023-02-01T04:22:59Z) - Training Spiking Neural Networks with Local Tandem Learning [96.32026780517097]
Spiking neural networks (SNNs) are shown to be more biologically plausible and energy efficient than their predecessors.
In this paper, we put forward a generalized learning rule, termed Local Tandem Learning (LTL).
We demonstrate rapid network convergence within five training epochs on the CIFAR-10 dataset while having low computational complexity.
arXiv Detail & Related papers (2022-10-10T10:05:00Z) - Online Training Through Time for Spiking Neural Networks [66.7744060103562]
Spiking neural networks (SNNs) are promising brain-inspired energy-efficient models.
Recent progress in training methods has enabled successful deep SNNs on large-scale tasks with low latency.
We propose online training through time (OTTT) for SNNs, which is derived from BPTT to enable forward-in-time learning.
arXiv Detail & Related papers (2022-10-09T07:47:56Z) - Training High-Performance Low-Latency Spiking Neural Networks by
Differentiation on Spike Representation [70.75043144299168]
Spiking Neural Network (SNN) is a promising energy-efficient AI model when implemented on neuromorphic hardware.
It is a challenge to efficiently train SNNs due to their non-differentiability.
We propose the Differentiation on Spike Representation (DSR) method, which could achieve high performance.
arXiv Detail & Related papers (2022-05-01T12:44:49Z) - Pruning of Deep Spiking Neural Networks through Gradient Rewiring [41.64961999525415]
Spiking Neural Networks (SNNs) have attracted great attention due to their biological plausibility and high energy efficiency on neuromorphic chips.
Most existing methods directly apply pruning approaches in artificial neural networks (ANNs) to SNNs, which ignore the difference between ANNs and SNNs.
We propose gradient rewiring (Grad R), a joint learning algorithm of connectivity and weight for SNNs, that enables us to seamlessly optimize network structure without retraining.
arXiv Detail & Related papers (2021-05-11T10:05:53Z) - Optimal Conversion of Conventional Artificial Neural Networks to Spiking
Neural Networks [0.0]
Spiking neural networks (SNNs) are biologically inspired artificial neural networks (ANNs).
We propose a novel strategic pipeline that transfers the weights to the target SNN by combining threshold balance and soft-reset mechanisms.
Our method is promising for deployment on embedded platforms that better support SNNs with limited energy and memory.
arXiv Detail & Related papers (2021-02-28T12:04:22Z) - Long Short-Term Memory Spiking Networks and Their Applications [10.071615423169902]
We present a novel framework for training recurrent spiking neural networks (SNNs).
We show that LSTM spiking networks learn the timing of the spikes and temporal dependencies.
We also develop a methodology for error backpropagation within LSTM-based SNNs.
arXiv Detail & Related papers (2020-07-09T13:22:27Z) - Progressive Tandem Learning for Pattern Recognition with Deep Spiking
Neural Networks [80.15411508088522]
Spiking neural networks (SNNs) have shown advantages over traditional artificial neural networks (ANNs) for low latency and high computational efficiency.
We propose a novel ANN-to-SNN conversion and layer-wise learning framework for rapid and efficient pattern recognition.
arXiv Detail & Related papers (2020-07-02T15:38:44Z)