Accurate online training of dynamical spiking neural networks through Forward Propagation Through Time
- URL: http://arxiv.org/abs/2112.11231v1
- Date: Mon, 20 Dec 2021 13:44:20 GMT
- Title: Accurate online training of dynamical spiking neural networks through Forward Propagation Through Time
- Authors: Bojian Yin, Federico Corradi, Sander M. Bohte
- Abstract summary: We show how a recently developed alternative to BPTT can be applied in spiking neural networks.
FPTT attempts to minimize an ongoing dynamically regularized risk on the loss.
We show that SNNs trained with FPTT outperform online BPTT approximations, and approach or exceed offline BPTT accuracy on temporal classification tasks.
- Score: 1.8515971640245998
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: The event-driven and sparse nature of communication between spiking neurons
in the brain holds great promise for flexible and energy-efficient AI. Recent
advances in learning algorithms have demonstrated that recurrent networks of
spiking neurons can be effectively trained to achieve competitive performance
compared to standard recurrent neural networks. Still, as these learning
algorithms use error-backpropagation through time (BPTT), they suffer from high
memory requirements, are slow to train, and are incompatible with online
learning. This limits the application of these learning algorithms to
relatively small networks and to limited temporal sequence lengths. Online
approximations to BPTT with lower computational and memory complexity have been
proposed (e-prop, OSTL), but in practice also suffer from memory limitations
and, as approximations, do not outperform standard BPTT training. Here, we show
how a recently developed alternative to BPTT, Forward Propagation Through Time
(FPTT), can be applied in spiking neural networks. Different from BPTT, FPTT
attempts to minimize an ongoing dynamically regularized risk on the loss. As a
result, FPTT can be computed in an online fashion and has fixed complexity with
respect to the sequence length. When combined with a novel dynamic spiking
neuron model, the Liquid-Time-Constant neuron, we show that SNNs trained with
FPTT outperform online BPTT approximations, and approach or exceed offline BPTT
accuracy on temporal classification tasks. This approach thus makes it feasible
to train SNNs in a memory-friendly online fashion on long sequences and scale
up SNNs to novel and complex neural architectures.
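The abstract names the two ingredients: a per-time-step loss with a dynamic regularizer (FPTT) and a spiking neuron whose time constants are learned, state-dependent functions (the Liquid-Time-Constant neuron). The PyTorch sketch below is a minimal illustration of how these might fit together, not the authors' released implementation: the names (LTCSpikingLayer, fptt_step, tau_net), the sigmoid surrogate gradient, and the single-backward simplification of the FPTT averaging step are all our assumptions.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class LTCSpikingLayer(nn.Module):
    """Leaky integrate-and-fire layer whose decay is a learned function of
    the current input and membrane state -- one plausible reading of the
    paper's Liquid-Time-Constant spiking neuron."""

    def __init__(self, n_in, n_out, threshold=1.0):
        super().__init__()
        self.syn = nn.Linear(n_in, n_out)              # input synapses
        self.tau_net = nn.Linear(n_in + n_out, n_out)  # dynamic time constant
        self.threshold = threshold

    def forward(self, x, u):
        # Per-neuron decay in (0, 1), driven by input and previous state.
        tau = torch.sigmoid(self.tau_net(torch.cat([x, u], dim=-1)))
        u = tau * u + (1.0 - tau) * self.syn(x)        # leaky integration
        surrogate = torch.sigmoid(5.0 * (u - self.threshold))
        # Hard spikes in the forward pass, sigmoid surrogate gradient backward.
        spikes = (u >= self.threshold).float() + surrogate - surrogate.detach()
        u = u - spikes.detach() * self.threshold       # soft reset (detached)
        return spikes, u


def fptt_step(params, opt, loss_t, w_bar, alpha=0.1):
    """One online FPTT update. Minimizes the instantaneous loss plus the
    dynamic regularizer (alpha/2)*||W - W_bar||^2, then updates the running
    average W_bar <- (W_bar + W)/2 - grad/(2*alpha). The published rule also
    shifts W_bar inside the regularizer by the previous step's gradient and
    uses the gradient at the *new* weights; both are simplified here so a
    single backward pass suffices."""
    reg = sum(0.5 * alpha * ((p - pb) ** 2).sum() for p, pb in zip(params, w_bar))
    opt.zero_grad()
    (loss_t + reg).backward()
    opt.step()
    with torch.no_grad():
        for p, pb in zip(params, w_bar):
            if p.grad is not None:
                pb.mul_(0.5).add_(0.5 * p).sub_(p.grad / (2.0 * alpha))


# Usage sketch on dummy data: a per-step loss drives a weight update at every
# time step, and the membrane state is detached between steps, so memory stays
# O(1) in sequence length -- the online property the abstract describes.
torch.manual_seed(0)
layer, readout = LTCSpikingLayer(16, 32), nn.Linear(32, 10)
params = list(layer.parameters()) + list(readout.parameters())
opt = torch.optim.SGD(params, lr=1e-2)
w_bar = [p.detach().clone() for p in params]         # running average W_bar

x_seq = torch.randn(100, 8, 16)                      # (time, batch, features)
target = torch.randint(0, 10, (8,))
u = torch.zeros(8, 32)
for x_t in x_seq:
    spikes, u = layer(x_t, u)
    loss_t = F.cross_entropy(readout(spikes), target)
    fptt_step(params, opt, loss_t, w_bar, alpha=0.1)
    u = u.detach()                                   # truncate graph each step
```

The structural point is visible in the loop: because the state is detached every step, each backward pass covers only a single time step, and it is the alpha-coupled running average w_bar that, per the paper's claim, lets these one-step updates approach or exceed offline BPTT accuracy.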
Related papers
- Asymmetrical estimator for training encapsulated deep photonic neural networks [10.709758849326061]
Asymmetrical training (AT) is a BP-based training method that can train an encapsulated deep network.
AT offers significantly improved time and energy efficiency compared to existing BP-PNN methods.
We demonstrate AT's error-tolerant and calibration-free training for encapsulated integrated photonic deep networks.
arXiv Detail & Related papers (2024-05-28T17:27:20Z)
- Speed Limits for Deep Learning [67.69149326107103]
Recent advancement in thermodynamics allows bounding the speed at which one can go from the initial weight distribution to the final distribution of the fully trained network.
We provide analytical expressions for these speed limits for linear and linearizable neural networks.
Remarkably, given some plausible scaling assumptions on the NTK spectra and the spectral decomposition of the labels, learning is optimal in a scaling sense.
arXiv Detail & Related papers (2023-07-27T06:59:46Z)
- Accelerating SNN Training with Stochastic Parallelizable Spiking Neurons [1.7056768055368383]
Spiking neural networks (SNN) are able to learn features while using less energy, especially on neuromorphic hardware.
The most widely used spiking neuron in deep learning is the Leaky Integrate-and-Fire (LIF) neuron.
arXiv Detail & Related papers (2023-06-22T04:25:27Z)
- Towards Memory- and Time-Efficient Backpropagation for Training Spiking Neural Networks [70.75043144299168]
Spiking Neural Networks (SNNs) are promising energy-efficient models for neuromorphic computing.
We propose the Spatial Learning Through Time (SLTT) method that can achieve high performance while greatly improving training efficiency.
Our method achieves state-of-the-art accuracy on ImageNet, while the memory cost and training time are reduced by more than 70% and 50%, respectively, compared with BPTT.
arXiv Detail & Related papers (2023-02-28T05:01:01Z)
- Intelligence Processing Units Accelerate Neuromorphic Learning [52.952192990802345]
Spiking neural networks (SNNs) have achieved orders of magnitude improvement in terms of energy consumption and latency.
We present an IPU-optimized release of our custom SNN Python package, snnTorch.
arXiv Detail & Related papers (2022-11-19T15:44:08Z)
- Online Training Through Time for Spiking Neural Networks [66.7744060103562]
Spiking neural networks (SNNs) are promising brain-inspired energy-efficient models.
Recent progress in training methods has enabled successful deep SNNs on large-scale tasks with low latency.
We propose online training through time (OTTT) for SNNs, which is derived from BPTT to enable forward-in-time learning.
arXiv Detail & Related papers (2022-10-09T07:47:56Z)
- Online Spatio-Temporal Learning in Deep Neural Networks [1.6624384368855523]
Online learning has recently regained the attention of the research community, focusing on approaches that approximate BPTT or on biologically-plausible schemes applied to SNNs.
Here we present an alternative perspective that is based on a clear separation of spatial and temporal gradient components.
We derive from first principles a novel online learning algorithm for deep SNNs, called online spatio-temporal learning (OSTL).
For shallow networks, OSTL is gradient-equivalent to BPTT, enabling for the first time online training of SNNs with BPTT-equivalent gradients. In addition, the proposed formulation unveils a class of SNN architectures
arXiv Detail & Related papers (2020-07-24T18:10:18Z)
- Progressive Tandem Learning for Pattern Recognition with Deep Spiking Neural Networks [80.15411508088522]
Spiking neural networks (SNNs) have shown advantages over traditional artificial neural networks (ANNs) for low latency and high computational efficiency.
We propose a novel ANN-to-SNN conversion and layer-wise learning framework for rapid and efficient pattern recognition.
arXiv Detail & Related papers (2020-07-02T15:38:44Z)
- Rectified Linear Postsynaptic Potential Function for Backpropagation in Deep Spiking Neural Networks [55.0627904986664]
Spiking Neural Networks (SNNs) use temporal spike patterns to represent and transmit information, which is not only biologically realistic but also suitable for ultra-low-power event-driven neuromorphic implementation.
This paper investigates the contribution of spike timing dynamics to information encoding, synaptic plasticity and decision making, providing a new perspective on the design of future deep SNNs and neuromorphic hardware systems.
arXiv Detail & Related papers (2020-03-26T11:13:07Z)
- Temporal Spike Sequence Learning via Backpropagation for Deep Spiking Neural Networks [14.992756670960008]
Spiking neural networks (SNNs) are well suited for computation and implementations on energy-efficient event-driven neuromorphic processors.
We present a novel Temporal Spike Sequence Learning Backpropagation (TSSL-BP) method for training deep SNNs.
arXiv Detail & Related papers (2020-02-24T05:49:37Z)