Estimating Post-Synaptic Effects for Online Training of Feed-Forward SNNs
- URL: http://arxiv.org/abs/2311.16151v1
- Date: Tue, 7 Nov 2023 16:53:39 GMT
- Title: Estimating Post-Synaptic Effects for Online Training of Feed-Forward SNNs
- Authors: Thomas Summe, Clemens JS Schaefer, Siddharth Joshi
- Abstract summary: Facilitating online learning in spiking neural networks (SNNs) is a key step in developing event-based models.
We propose Online Training with Postsynaptic Estimates (OTPE) for training feed-forward SNNs.
We show improved scaling for multi-layer networks using a novel approximation of temporal effects on the subsequent layer's activity.
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Facilitating online learning in spiking neural networks (SNNs) is a key step
in developing event-based models that can adapt to changing environments and
learn from continuous data streams in real-time. Although forward-mode
differentiation enables online learning, its computational requirements
restrict scalability. This is typically addressed through approximations that
limit learning in deep models. In this study, we propose Online Training with
Postsynaptic Estimates (OTPE) for training feed-forward SNNs, which
approximates Real-Time Recurrent Learning (RTRL) by incorporating temporal
dynamics not captured by current approximations, such as Online Training
Through Time (OTTT) and Online Spatio-Temporal Learning (OSTL). We show
improved scaling for multi-layer networks using a novel approximation of
temporal effects on the subsequent layer's activity. This approximation adds
minimal time and space overhead compared to similar algorithms, and the
calculation of temporal effects remains local to each layer. We characterize
the learning performance of our proposed algorithms on
multiple SNN model configurations for rate-based and time-based encoding. OTPE
exhibits the highest directional alignment to exact gradients, calculated with
backpropagation through time (BPTT), in deep networks and, on time-based
encoding, outperforms other approximate methods. We also observe sizeable gains
in average performance over similar algorithms in offline training of Spiking
Heidelberg Digits with equivalent hyper-parameters (OTTT/OSTL - 70.5%; OTPE -
75.2%; BPTT - 78.1%).
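For intuition, the sketch below illustrates the kind of online, trace-based update that OTTT-style methods use and that OTPE refines: OTTT keeps a leaky presynaptic trace a_t = lambda * a_{t-1} + x_t and updates each weight with the product of the instantaneous error and that trace, so no past network state has to be stored. This is a minimal NumPy sketch under illustrative assumptions (toy dimensions, a squared-error readout, a triangular surrogate gradient), not the paper's implementation; in particular, OTPE's additional estimate of a spike's temporal effect on the subsequent layer's activity is not shown.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy sizes (illustrative): 10 inputs, 20 hidden LIF neurons, 2 outputs, 50 steps.
n_in, n_hid, n_out, T = 10, 20, 2, 50
lam, v_th, lr = 0.9, 1.0, 1e-2              # leak factor, threshold, learning rate

W1 = rng.normal(0.0, 0.5, (n_hid, n_in))    # input -> hidden
W2 = rng.normal(0.0, 0.5, (n_out, n_hid))   # hidden -> readout

def surrogate(v):
    # Triangular surrogate derivative of the Heaviside spike nonlinearity.
    return np.maximum(0.0, 1.0 - np.abs(v - v_th))

x = (rng.random((T, n_in)) < 0.2).astype(float)  # random input spike train
y = np.array([1.0, 0.0])                          # one-hot target

v = np.zeros(n_hid)       # membrane potentials
a_hat = np.zeros(n_in)    # leaky presynaptic trace (the only stored history)

for t in range(T):
    v = lam * v + W1 @ x[t]             # LIF integration
    s = (v >= v_th).astype(float)       # spikes
    sg = surrogate(v)                   # surrogate gradient at pre-reset potential
    v = v - v_th * s                    # soft reset

    a_hat = lam * a_hat + x[t]          # update presynaptic trace online

    err = W2 @ s - y                    # instantaneous readout error
    g_hid = (W2.T @ err) * sg           # error reaching the hidden layer

    # Local, forward-in-time updates: instantaneous error x presynaptic trace.
    W2 -= lr * np.outer(err, s)
    W1 -= lr * np.outer(g_hid, a_hat)
```

Methods such as OSTL and OTPE differ mainly in how much temporal credit assignment beyond this presynaptic trace they recover, which is what the gradient-alignment comparison against BPTT in the abstract measures.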
Related papers
- Comprehensive Online Training and Deployment for Spiking Neural Networks
Spiking Neural Networks (SNNs) are considered to have enormous potential in the future development of Artificial Intelligence (AI).
Current online training methods cannot tackle the inseparability problem of temporally dependent gradients.
We propose the Efficient Multi-Precision Firing (EM-PF) model, a family of advanced spiking models based on floating-point spikes and binary synaptic weights.
arXiv Detail & Related papers (2024-10-10T02:39:22Z)
- S-TLLR: STDP-inspired Temporal Local Learning Rule for Spiking Neural Networks
Spiking Neural Networks (SNNs) are biologically plausible models that have been identified as potentially apt for deploying energy-efficient intelligence at the edge.
We propose S-TLLR, a novel three-factor temporal local learning rule inspired by the Spike-Timing Dependent Plasticity (STDP) mechanism.
S-TLLR is designed to have low memory and time complexities, independent of the number of time steps, rendering it suitable for online learning on low-power edge devices (a schematic sketch of this three-factor structure follows the list below).
arXiv Detail & Related papers (2023-06-27T05:44:56Z)
- Towards Memory- and Time-Efficient Backpropagation for Training Spiking Neural Networks
Spiking Neural Networks (SNNs) are promising energy-efficient models for neuromorphic computing.
We propose the Spatial Learning Through Time (SLTT) method that can achieve high performance while greatly improving training efficiency.
Our method achieves state-of-the-art accuracy on ImageNet, while the memory cost and training time are reduced by more than 70% and 50%, respectively, compared with BPTT.
arXiv Detail & Related papers (2023-02-28T05:01:01Z)
- Intelligence Processing Units Accelerate Neuromorphic Learning
Spiking neural networks (SNNs) have achieved orders-of-magnitude improvements in energy consumption and latency.
We present an IPU-optimized release of our custom SNN Python package, snnTorch.
arXiv Detail & Related papers (2022-11-19T15:44:08Z)
- Online Training Through Time for Spiking Neural Networks
Spiking neural networks (SNNs) are promising brain-inspired energy-efficient models.
Recent progress in training methods has enabled successful deep SNNs on large-scale tasks with low latency.
We propose online training through time (OTTT) for SNNs, which is derived from BPTT to enable forward-in-time learning.
arXiv Detail & Related papers (2022-10-09T07:47:56Z)
- Accurate online training of dynamical spiking neural networks through Forward Propagation Through Time
We show how a recently developed alternative to BPTT can be applied in spiking neural networks.
FPTT attempts to minimize an ongoing, dynamically regularized risk based on the loss.
We show that SNNs trained with FPTT outperform online BPTT approximations, and approach or exceed offline BPTT accuracy on temporal classification tasks.
arXiv Detail & Related papers (2021-12-20T13:44:20Z)
- Online Spatio-Temporal Learning in Deep Neural Networks
Online learning has recently regained the attention of the research community, focusing on approaches that approximate BPTT or on biologically-plausible schemes applied to SNNs.
Here we present an alternative perspective that is based on a clear separation of spatial and temporal gradient components.
We derive from first principles a novel online learning algorithm for deep SNNs, called online spatio-temporal learning (OSTL).
For shallow networks, OSTL is gradient-equivalent to BPTT, enabling for the first time online training of SNNs with BPTT-equivalent gradients. In addition, the proposed formulation unveils a class of SNN architectures trainable online at low time complexity.
arXiv Detail & Related papers (2020-07-24T18:10:18Z)
- Progressive Tandem Learning for Pattern Recognition with Deep Spiking Neural Networks
Spiking neural networks (SNNs) have shown advantages over traditional artificial neural networks (ANNs) for low latency and high computational efficiency.
We propose a novel ANN-to-SNN conversion and layer-wise learning framework for rapid and efficient pattern recognition.
arXiv Detail & Related papers (2020-07-02T15:38:44Z)
- Rectified Linear Postsynaptic Potential Function for Backpropagation in Deep Spiking Neural Networks
Spiking Neural Networks (SNNs) use temporal spike patterns to represent and transmit information, which is not only biologically realistic but also suitable for ultra-low-power, event-driven neuromorphic implementation.
This paper investigates the contribution of spike timing dynamics to information encoding, synaptic plasticity, and decision making, providing a new perspective on the design of future deep SNNs and neuromorphic hardware systems.
arXiv Detail & Related papers (2020-03-26T11:13:07Z)
- Understanding the Effects of Data Parallelism and Sparsity on Neural Network Training
We study two factors in neural network training: data parallelism and sparsity.
Despite their promising benefits, understanding of their effects on neural network training remains elusive.
arXiv Detail & Related papers (2020-03-25T10:49:22Z)
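As referenced in the S-TLLR entry above, the following is a schematic sketch of the generic three-factor structure that such local learning rules build on: two local factors derived from pre- and postsynaptic activity traces, gated by a third, top-down learning signal. It is an illustrative assumption-laden sketch, not the published S-TLLR rule; all names and shapes are hypothetical.

```python
import numpy as np

def three_factor_update(W, pre_trace, post_trace, learning_signal, lr=1e-3):
    """Schematic three-factor rule: dW_ij = -lr * L_i * post_i * pre_j.

    pre_trace and post_trace are low-pass-filtered spike activities (the two
    local, STDP-like factors); learning_signal is a per-neuron top-down error
    (the third factor). Memory is O(#neurons), independent of the number of
    time steps, which is what makes such rules attractive for online learning
    on low-power edge devices.
    """
    return W - lr * np.outer(learning_signal * post_trace, pre_trace)
```

Because the traces are updated forward in time, a rule of this shape can be applied at every step without unrolling the network, in contrast to BPTT.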