Improving Stability and Performance of Spiking Neural Networks through
Enhancing Temporal Consistency
- URL: http://arxiv.org/abs/2305.14174v1
- Date: Tue, 23 May 2023 15:50:07 GMT
- Title: Improving Stability and Performance of Spiking Neural Networks through
Enhancing Temporal Consistency
- Authors: Dongcheng Zhao, Guobin Shen, Yiting Dong, Yang Li, Yi Zeng
- Abstract summary: Spiking neural networks have gained significant attention due to their brain-like information processing capabilities.
Current training algorithms tend to overlook the differences in output distribution at various timesteps.
We have designed a method to enhance the temporal consistency of outputs at different timesteps.
- Score: 9.545711665562715
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Spiking neural networks have gained significant attention due to their
brain-like information processing capabilities. The use of surrogate gradients
has made it possible to train spiking neural networks with backpropagation,
leading to impressive performance in various tasks. However, spiking neural
networks trained with backpropagation typically approximate actual labels using
the average output, often necessitating a larger simulation timestep to enhance
the network's performance. This delay constraint poses a challenge to the
further advancement of SNNs. Current training algorithms tend to overlook the
differences in output distribution at various timesteps. Particularly for
neuromorphic datasets, inputs at different timesteps can cause inconsistencies
in output distribution, leading to a significant deviation from the optimal
direction when combining optimization directions from different moments. To
tackle this issue, we have designed a method to enhance the temporal
consistency of outputs at different timesteps. We have conducted experiments on
static datasets such as CIFAR10, CIFAR100, and ImageNet. The results
demonstrate that our algorithm achieves performance comparable to other
state-of-the-art SNN algorithms. Notably, our algorithm has achieved state-of-the-art
performance on neuromorphic datasets DVS-CIFAR10 and N-Caltech101, and can
achieve superior performance in the test phase with timestep T=1.
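The abstract does not give the loss function itself, so the following PyTorch sketch is only one plausible reading of "enhancing the temporal consistency of outputs at different timesteps": each timestep's class distribution is pulled toward the distribution of the time-averaged output via a KL term, on top of the usual cross-entropy on the averaged logits. The function name `temporal_consistency_loss`, the weight `alpha`, and the KL-to-mean formulation are illustrative assumptions, not the paper's exact method.

```python
# Hypothetical sketch of a temporal-consistency objective for an SNN whose
# classifier emits logits at every simulation timestep. Not the paper's code.
import torch
import torch.nn.functional as F

def temporal_consistency_loss(outputs: torch.Tensor,
                              labels: torch.Tensor,
                              alpha: float = 1.0) -> torch.Tensor:
    """outputs: [T, batch, num_classes] per-timestep logits; labels: [batch]."""
    mean_logits = outputs.mean(dim=0)              # time-averaged output
    ce = F.cross_entropy(mean_logits, labels)      # standard task loss on the average

    # Consistency term: KL divergence from each timestep's distribution to the
    # (detached) distribution of the averaged output.
    target = F.softmax(mean_logits.detach(), dim=-1)
    consistency = outputs.new_zeros(())
    for t in range(outputs.shape[0]):
        log_p_t = F.log_softmax(outputs[t], dim=-1)
        consistency = consistency + F.kl_div(log_p_t, target, reduction="batchmean")
    consistency = consistency / outputs.shape[0]

    return ce + alpha * consistency
```

Setting `alpha=0` recovers the usual average-output training objective; the consistency term is what would let a single timestep's output (e.g., testing with T=1) stay close to the time-averaged prediction.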
Related papers
- Towards Low-latency Event-based Visual Recognition with Hybrid Step-wise Distillation Spiking Neural Networks [50.32980443749865]
Spiking neural networks (SNNs) have garnered significant attention for their low power consumption and high biological plausibility.
Current SNNs struggle to balance accuracy and latency in neuromorphic datasets.
We propose a Hybrid Step-wise Distillation (HSD) method, tailored for neuromorphic datasets.
arXiv Detail & Related papers (2024-09-19T06:52:34Z) - Direct Training Needs Regularisation: Anytime Optimal Inference Spiking Neural Network [23.434563009813218]
The Spiking Neural Network (SNN) is acknowledged as the next generation of the Artificial Neural Network (ANN).
We introduce a novel regularisation technique, namely Spatial-Temporal Regulariser (STR)
STR regulates the ratio between the strength of spikes and membrane potential at each timestep.
This effectively balances spatial and temporal performance during training, ultimately resulting in an Anytime Optimal Inference (AOI) SNN.
arXiv Detail & Related papers (2024-04-15T15:57:01Z) - Accelerating SNN Training with Stochastic Parallelizable Spiking Neurons [1.7056768055368383]
Spiking neural networks (SNN) are able to learn features while using less energy, especially on neuromorphic hardware.
The most widely used spiking neuron in deep learning is the Leaky Integrate-and-Fire (LIF) neuron (a minimal sketch appears after the list below).
arXiv Detail & Related papers (2023-06-22T04:25:27Z) - Implicit Stochastic Gradient Descent for Training Physics-informed
Neural Networks [51.92362217307946]
Physics-informed neural networks (PINNs) have effectively been demonstrated in solving forward and inverse differential equation problems.
However, PINNs can suffer training failures when the target functions to be approximated exhibit high-frequency or multi-scale features.
In this paper, we propose to employ the implicit stochastic gradient descent (ISGD) method to train PINNs, improving the stability of the training process.
arXiv Detail & Related papers (2023-03-03T08:17:47Z) - Intelligence Processing Units Accelerate Neuromorphic Learning [52.952192990802345]
Spiking neural networks (SNNs) have achieved orders of magnitude improvement in terms of energy consumption and latency.
We present an IPU-optimized release of our custom SNN Python package, snnTorch.
arXiv Detail & Related papers (2022-11-19T15:44:08Z) - Online Training Through Time for Spiking Neural Networks [66.7744060103562]
Spiking neural networks (SNNs) are promising brain-inspired energy-efficient models.
Recent progress in training methods has enabled successful deep SNNs on large-scale tasks with low latency.
We propose online training through time (OTTT) for SNNs, which is derived from BPTT to enable forward-in-time learning.
arXiv Detail & Related papers (2022-10-09T07:47:56Z) - Training High-Performance Low-Latency Spiking Neural Networks by
Differentiation on Spike Representation [70.75043144299168]
Spiking Neural Network (SNN) is a promising energy-efficient AI model when implemented on neuromorphic hardware.
It is a challenge to efficiently train SNNs due to their non-differentiability.
We propose the Differentiation on Spike Representation (DSR) method, which could achieve high performance.
arXiv Detail & Related papers (2022-05-01T12:44:49Z) - Backpropagation with Biologically Plausible Spatio-Temporal Adjustment
For Training Deep Spiking Neural Networks [5.484391472233163]
The success of deep learning is inseparable from backpropagation.
We first propose a biologically plausible spatial adjustment, which rethinks the relationship between membrane potential and spikes.
Secondly, we propose a biologically plausible temporal adjustment making the error propagate across the spikes in the temporal dimension.
arXiv Detail & Related papers (2021-10-17T15:55:51Z) - Spiking Neural Networks with Improved Inherent Recurrence Dynamics for
Sequential Learning [6.417011237981518]
Spiking neural networks (SNNs) with leaky integrate and fire (LIF) neurons can be operated in an event-driven manner.
We show that SNNs can be trained for sequential tasks and propose modifications to a network of LIF neurons.
We then develop a training scheme to train the proposed SNNs with improved inherent recurrence dynamics.
arXiv Detail & Related papers (2021-09-04T17:13:28Z) - Revisiting Batch Normalization for Training Low-latency Deep Spiking
Neural Networks from Scratch [5.511606249429581]
Spiking Neural Networks (SNNs) have emerged as an alternative to deep learning.
Training high-accuracy and low-latency SNNs from scratch suffers from the non-differentiable nature of the spiking neuron.
We propose a temporal Batch Normalization Through Time (BNTT) technique for training SNNs from scratch.
arXiv Detail & Related papers (2020-10-05T00:49:30Z) - Communication-Efficient Distributed Stochastic AUC Maximization with
Deep Neural Networks [50.42141893913188]
We study distributed algorithms for large-scale AUC maximization with a deep neural network as the predictive model.
Our method requires a much smaller number of communication rounds in theory.
Experiments on several benchmark datasets demonstrate the method's effectiveness and confirm the theory.
arXiv Detail & Related papers (2020-05-05T18:08:23Z)
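Several entries above, as well as the abstract's mention of surrogate gradients, revolve around training leaky integrate-and-fire (LIF) neurons with backpropagation despite the non-differentiable spike. A minimal sketch of that mechanism is given below, assuming a hard-threshold spike with a rectangular surrogate derivative; the class names, the rectangular window, and the decay parameterisation are illustrative choices, not taken from any specific paper listed here.

```python
# Hypothetical LIF layer with a surrogate gradient, written in plain PyTorch.
import torch


class SurrogateSpike(torch.autograd.Function):
    """Heaviside spike in the forward pass; rectangular surrogate gradient backward."""

    @staticmethod
    def forward(ctx, v_minus_thresh):
        ctx.save_for_backward(v_minus_thresh)
        return (v_minus_thresh > 0).float()

    @staticmethod
    def backward(ctx, grad_output):
        (v_minus_thresh,) = ctx.saved_tensors
        # Let gradients pass only near the firing threshold; zero elsewhere.
        surrogate = (v_minus_thresh.abs() < 0.5).float()
        return grad_output * surrogate


class LIFNeuron(torch.nn.Module):
    """Leaky integrate-and-fire state update, applied once per simulation timestep."""

    def __init__(self, tau: float = 2.0, threshold: float = 1.0):
        super().__init__()
        self.decay = 1.0 - 1.0 / tau     # membrane leak factor
        self.threshold = threshold

    def forward(self, current, v):
        v = self.decay * v + current                         # leaky integration
        spike = SurrogateSpike.apply(v - self.threshold)     # non-differentiable firing
        v = v * (1.0 - spike)                                # hard reset after a spike
        return spike, v
```

Unrolling such a layer over T timesteps and collecting the per-timestep outputs is the setting in which both the temporal-consistency objective sketched above and regularisers such as STR or BNTT are applied.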
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the information it provides and is not responsible for any consequences of its use.