Advancing Residual Learning towards Powerful Deep Spiking Neural
Networks
- URL: http://arxiv.org/abs/2112.08954v1
- Date: Wed, 15 Dec 2021 05:47:21 GMT
- Title: Advancing Residual Learning towards Powerful Deep Spiking Neural
Networks
- Authors: Yifan Hu, Yujie Wu, Lei Deng, Guoqi Li
- Abstract summary: Residual learning and shortcuts have proven to be an important approach for training deep neural networks.
MS-ResNet is able to significantly extend the depth of directly trained SNNs.
MS-ResNet104 achieves 76.02% accuracy on ImageNet, a first for directly trained SNNs.
- Score: 16.559670769601038
- License: http://creativecommons.org/licenses/by-nc-nd/4.0/
- Abstract: Despite the rapid progress of neuromorphic computing, the inadequate
capacity and insufficient representational power of spiking neural networks
(SNNs) severely restrict their application scope in practice. Residual learning
and shortcuts have proven to be an important approach for training deep neural
networks, but previous work has rarely assessed their applicability to the
characteristics of spike-based communication and spatiotemporal dynamics. In
this paper, we first identify that this oversight leads to impeded information
flow and an accompanying degradation problem in previous residual SNNs. We then
propose a novel SNN-oriented residual block, MS-ResNet, which significantly
extends the depth of directly trained SNNs, e.g., up to 482 layers on CIFAR-10
and 104 layers on ImageNet, without observing any degradation problem. We
validate the effectiveness of MS-ResNet on both frame-based and neuromorphic
datasets, and MS-ResNet104 achieves 76.02% accuracy on ImageNet, a first for
directly trained SNNs. Great energy efficiency is also observed: on average,
only one spike per neuron is needed to classify an input sample. We believe our
powerful and scalable models will provide strong support for further
exploration of SNNs.
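The key idea described in the abstract is a residual block whose identity shortcut carries real-valued (membrane-potential-like) features rather than spikes, so information can flow through very deep stacks without being gated by a spiking nonlinearity. The sketch below is a minimal PyTorch illustration of such a membrane-shortcut block with a surrogate-gradient LIF neuron; the LIF hyperparameters, the arctan surrogate, and the exact block layout are assumptions made for illustration, not the authors' released implementation.

```python
# A minimal sketch (not the authors' released code) of a membrane-shortcut
# residual block for a directly trained SNN, written in PyTorch. The LIF
# hyperparameters (tau, v_threshold) and the arctan surrogate gradient are
# illustrative assumptions; the paper's exact block layout may differ.
import torch
import torch.nn as nn


class ATanSpike(torch.autograd.Function):
    """Heaviside spike in the forward pass, arctan surrogate in the backward."""

    @staticmethod
    def forward(ctx, v_minus_thresh):
        ctx.save_for_backward(v_minus_thresh)
        return (v_minus_thresh >= 0).float()

    @staticmethod
    def backward(ctx, grad_output):
        (v_minus_thresh,) = ctx.saved_tensors
        # Smooth stand-in for the Dirac derivative of the Heaviside step.
        surrogate = 1.0 / (1.0 + (torch.pi * v_minus_thresh) ** 2)
        return grad_output * surrogate


class LIFNeuron(nn.Module):
    """Leaky integrate-and-fire neuron with hard reset, one time step per call."""

    def __init__(self, tau: float = 2.0, v_threshold: float = 1.0):
        super().__init__()
        self.tau, self.v_threshold = tau, v_threshold
        self.v = None  # membrane potential, kept across time steps

    def reset(self):
        self.v = None  # call between input samples

    def forward(self, x):
        if self.v is None:
            self.v = torch.zeros_like(x)
        self.v = self.v + (x - self.v) / self.tau  # leaky integration
        spike = ATanSpike.apply(self.v - self.v_threshold)
        self.v = self.v * (1.0 - spike)  # hard reset where spikes fired
        return spike


class MSResidualBlock(nn.Module):
    """LIF-Conv-BN twice, with the identity shortcut added on the real-valued
    pathway so no spiking nonlinearity sits on the shortcut itself."""

    def __init__(self, channels: int):
        super().__init__()
        self.lif1 = LIFNeuron()
        self.conv1 = nn.Conv2d(channels, channels, 3, padding=1, bias=False)
        self.bn1 = nn.BatchNorm2d(channels)
        self.lif2 = LIFNeuron()
        self.conv2 = nn.Conv2d(channels, channels, 3, padding=1, bias=False)
        self.bn2 = nn.BatchNorm2d(channels)

    def forward(self, x):
        out = self.bn1(self.conv1(self.lif1(x)))
        out = self.bn2(self.conv2(self.lif2(out)))
        return x + out  # shortcut carries membrane-like features, not spikes


if __name__ == "__main__":
    block = MSResidualBlock(channels=16)
    x = torch.randn(2, 16, 8, 8)
    for _ in range(4):  # unroll a few simulated time steps
        y = block(x)
    print(y.shape)  # torch.Size([2, 16, 8, 8])
```

Because the addition happens on real-valued features, identity mappings survive arbitrarily deep stacking, which is the property the abstract credits for training 100+ layer SNNs without degradation.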
Related papers
- Scalable Mechanistic Neural Networks [52.28945097811129]
We propose an enhanced neural network framework designed for scientific machine learning applications involving long temporal sequences.
By reformulating the original Mechanistic Neural Network (MNN), we reduce the time and space complexities from cubic and quadratic in the sequence length, respectively, to linear.
Extensive experiments demonstrate that S-MNN matches the original MNN in precision while substantially reducing computational resources.
arXiv Detail & Related papers (2024-10-08T14:27:28Z) - PC-SNN: Supervised Learning with Local Hebbian Synaptic Plasticity based
on Predictive Coding in Spiking Neural Networks [1.6172800007896282]
We propose a novel learning algorithm inspired by predictive coding theory.
We show that it can perform supervised learning fully autonomously and as successfully as backprop.
This method achieves favorable performance compared to state-of-the-art multi-layer SNNs.
arXiv Detail & Related papers (2022-11-24T09:56:02Z) - Multi-Level Firing with Spiking DS-ResNet: Enabling Better and Deeper
Directly-Trained Spiking Neural Networks [19.490903216456758]
Spiking neural networks (SNNs) are neural networks with asynchronous, discrete, and sparse characteristics.
We propose a multi-level firing (MLF) method based on the existing spiking-suppressed residual network (spiking DS-ResNet).
arXiv Detail & Related papers (2022-10-12T16:39:46Z) - Attention Spiking Neural Networks [32.591900260554326]
We study the effect of attention mechanisms in spiking neural networks (SNNs).
A new attention SNN architecture with end-to-end training, called "MA-SNN", is proposed.
Experiments are conducted in event-based DVS128 Gesture/Gait action recognition and ImageNet-1k image classification.
arXiv Detail & Related papers (2022-09-28T09:00:45Z) - Adaptive-SpikeNet: Event-based Optical Flow Estimation using Spiking
Neural Networks with Learnable Neuronal Dynamics [6.309365332210523]
Spiking Neural Networks (SNNs) with their neuro-inspired event-driven processing can efficiently handle asynchronous data.
We propose an adaptive fully-spiking framework with learnable neuronal dynamics to alleviate the spike vanishing problem.
Our experiments show an average reduction of 13% in average endpoint error (AEE) compared to state-of-the-art ANNs.
arXiv Detail & Related papers (2022-09-21T21:17:56Z) - Training High-Performance Low-Latency Spiking Neural Networks by
Differentiation on Spike Representation [70.75043144299168]
Spiking Neural Networks (SNNs) are promising energy-efficient AI models when implemented on neuromorphic hardware.
Efficiently training SNNs is challenging due to their non-differentiability.
We propose the Differentiation on Spike Representation (DSR) method, which achieves high performance.
arXiv Detail & Related papers (2022-05-01T12:44:49Z) - Comparative Analysis of Interval Reachability for Robust Implicit and
Feedforward Neural Networks [64.23331120621118]
We use interval reachability analysis to obtain robustness guarantees for implicit neural networks (INNs).
INNs are a class of implicit learning models that use implicit equations as layers.
We show that our approach performs at least as well as, and generally better than, applying state-of-the-art interval bound propagation methods to INNs.
arXiv Detail & Related papers (2022-04-01T03:31:27Z) - Advancing Deep Residual Learning by Solving the Crux of Degradation in
Spiking Neural Networks [21.26300397341615]
Residual learning and shortcuts have proven to be an important approach for training deep neural networks.
This paper proposes a novel residual block for SNNs, which is able to significantly extend the depth of directly trained SNNs.
arXiv Detail & Related papers (2021-12-09T06:29:00Z) - Modeling from Features: a Mean-field Framework for Over-parameterized
Deep Neural Networks [54.27962244835622]
This paper proposes a new mean-field framework for over-parameterized deep neural networks (DNNs).
In this framework, a DNN is represented by probability measures and functions over its features in the continuous limit.
We illustrate the framework via the standard DNN and the Residual Network (ResNet) architectures.
arXiv Detail & Related papers (2020-07-03T01:37:16Z) - Progressive Tandem Learning for Pattern Recognition with Deep Spiking
Neural Networks [80.15411508088522]
Spiking neural networks (SNNs) have shown advantages over traditional artificial neural networks (ANNs) in low latency and high computational efficiency.
We propose a novel ANN-to-SNN conversion and layer-wise learning framework for rapid and efficient pattern recognition.
arXiv Detail & Related papers (2020-07-02T15:38:44Z) - Rectified Linear Postsynaptic Potential Function for Backpropagation in
Deep Spiking Neural Networks [55.0627904986664]
Spiking Neural Networks (SNNs) use temporal spike patterns to represent and transmit information, which is not only biologically realistic but also suitable for ultra-low-power event-driven neuromorphic implementation.
This paper investigates the contribution of spike timing dynamics to information encoding, synaptic plasticity, and decision making, providing a new perspective on the design of future deep SNNs and neuromorphic hardware systems.
arXiv Detail & Related papers (2020-03-26T11:13:07Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
The site does not guarantee the quality of this content (including all information) and is not responsible for any consequences of its use.