Online Spatio-Temporal Learning in Deep Neural Networks
- URL: http://arxiv.org/abs/2007.12723v2
- Date: Thu, 8 Oct 2020 12:54:20 GMT
- Title: Online Spatio-Temporal Learning in Deep Neural Networks
- Authors: Thomas Bohnstingl, Stanisław Woźniak, Wolfgang Maass, Angeliki Pantazi and Evangelos Eleftheriou
- Abstract summary: Online learning has recently regained the attention of the research community, focusing on approaches that approximate BPTT or on biologically-plausible schemes applied to SNNs.
Here we present an alternative perspective that is based on a clear separation of spatial and temporal gradient components.
We derive from first principles a novel online learning algorithm for deep SNNs, called online spatio-temporal learning (OSTL).
For shallow networks, OSTL is gradient-equivalent to BPTT, enabling for the first time online training of SNNs with BPTT-equivalent gradients. In addition, the proposed formulation unveils a class of SNN architectures trainable online at low time complexity.
- Score: 1.6624384368855523
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Biological neural networks are equipped with an inherent capability to
continuously adapt through online learning. This aspect remains in stark
contrast to learning with error backpropagation through time (BPTT) applied to
recurrent neural networks (RNNs), or recently to biologically-inspired spiking
neural networks (SNNs). BPTT involves offline computation of the gradients due
to the requirement to unroll the network through time. Online learning has
recently regained the attention of the research community, focusing either on
approaches that approximate BPTT or on biologically-plausible schemes applied
to SNNs. Here we present an alternative perspective that is based on a clear
separation of spatial and temporal gradient components. Combined with insights
from biology, we derive from first principles a novel online learning algorithm
for deep SNNs, called online spatio-temporal learning (OSTL). For shallow
networks, OSTL is gradient-equivalent to BPTT, enabling for the first time
online training of SNNs with BPTT-equivalent gradients. In addition, the
proposed formulation unveils a class of SNN architectures trainable online at
low time complexity. Moreover, we extend OSTL to a generic form, applicable to
a wide range of network architectures, including networks comprising long
short-term memory (LSTM) and gated recurrent units (GRU). We demonstrate the
operation of our algorithm on various tasks from language modelling to speech
recognition and obtain results on par with the BPTT baselines. The proposed
algorithm provides a framework for developing succinct and efficient online
training approaches for SNNs and in general deep RNNs.
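To make this separation concrete, here is a minimal NumPy sketch for a single leaky-integrator layer with a smooth surrogate nonlinearity: the temporal gradient component is carried forward in time as an eligibility trace, while the spatial component enters as an instantaneous learning signal, so no unrolling is needed. The leak factor, surrogate function, squared-error loss, and all dimensions are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative sizes and constants (assumptions, not from the paper).
n_in, n_out, T = 4, 3, 20
alpha = 0.9   # membrane leak of the leaky-integrator neuron
lr = 1e-2     # learning rate

W = rng.normal(scale=0.5, size=(n_out, n_in))

def f(s):   # smooth surrogate for the spiking nonlinearity
    return np.tanh(s)

def df(s):  # its derivative, needed for the eligibility trace
    return 1.0 - np.tanh(s) ** 2

s = np.zeros(n_out)            # membrane state
eps = np.zeros((n_out, n_in))  # eligibility trace d s_t / d W, kept forward in time

xs = rng.normal(size=(T, n_in))        # toy input sequence
targets = rng.normal(size=(T, n_out))  # toy per-step targets

for t in range(T):
    # Forward step: leaky integration of the input current.
    s = alpha * s + W @ xs[t]
    y = f(s)

    # Temporal component: the trace decays with the same leak as the
    # state and accumulates the instantaneous input contribution.
    eps = alpha * eps + xs[t][None, :]

    # Spatial component: instantaneous learning signal; for a per-step
    # squared error this is simply y_t - target_t.
    L = y - targets[t]

    # Online update: learning signal times eligibility trace, applied
    # at every time step without unrolling the network.
    W -= lr * (L * df(s))[:, None] * eps
```

For this simplified single-layer model the forward-accumulated trace yields the same per-step gradient BPTT would compute, but with memory and compute that stay constant in the sequence length.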
Related papers
- Topological Representations of Heterogeneous Learning Dynamics of Recurrent Spiking Neural Networks [16.60622265961373]
Spiking Neural Networks (SNNs) have become an essential paradigm in neuroscience and artificial intelligence.
Recent work in the literature has studied the network representations of deep neural networks.
arXiv Detail & Related papers (2024-03-19T05:37:26Z) - LC-TTFS: Towards Lossless Network Conversion for Spiking Neural Networks
with TTFS Coding [55.64533786293656]
We show that our algorithm can achieve a near-perfect mapping between the activation values of an ANN and the spike times of an SNN on a number of challenging AI tasks.
The study paves the way for deploying ultra-low-power TTFS-based SNNs on power-constrained edge computing platforms.
arXiv Detail & Related papers (2023-10-23T14:26:16Z) - A Hybrid Neural Coding Approach for Pattern Recognition with Spiking
Neural Networks [53.31941519245432]
Brain-inspired spiking neural networks (SNNs) have demonstrated promising capabilities in solving pattern recognition tasks.
These SNNs are grounded on homogeneous neurons that utilize a uniform neural coding for information representation.
In this study, we argue that SNN architectures should be holistically designed to incorporate heterogeneous coding schemes.
arXiv Detail & Related papers (2023-05-26T02:52:12Z) - Intelligence Processing Units Accelerate Neuromorphic Learning [52.952192990802345]
Spiking neural networks (SNNs) have achieved orders-of-magnitude improvements in energy consumption and latency.
We present an IPU-optimized release of our custom SNN Python package, snnTorch.
arXiv Detail & Related papers (2022-11-19T15:44:08Z) - Online Training Through Time for Spiking Neural Networks [66.7744060103562]
Spiking neural networks (SNNs) are promising brain-inspired energy-efficient models.
Recent progress in training methods has enabled successful deep SNNs on large-scale tasks with low latency.
We propose online training through time (OTTT) for SNNs, which is derived from BPTT to enable forward-in-time learning.
arXiv Detail & Related papers (2022-10-09T07:47:56Z) - Accurate online training of dynamical spiking neural networks through
Forward Propagation Through Time [1.8515971640245998]
We show how a recently developed alternative to BPTT can be applied in spiking neural networks.
FPTT attempts to minimize an ongoing dynamically regularized risk on the loss.
We show that SNNs trained with FPTT outperform online BPTT approximations, and approach or exceed offline BPTT accuracy on temporal classification tasks.
arXiv Detail & Related papers (2021-12-20T13:44:20Z) - Brain-Inspired Learning on Neuromorphic Substrates [5.279475826661643]
This article provides a mathematical framework for the design of practical online learning algorithms for neuromorphic substrates.
Specifically, we show a direct connection between Real-Time Recurrent Learning (RTRL) and biologically plausible learning rules for training Spiking Neural Networks (SNNs).
We motivate a sparse approximation based on block-diagonal Jacobians, which reduces the algorithm's computational complexity (see the sketch after this list).
arXiv Detail & Related papers (2020-10-22T17:56:59Z) - Progressive Tandem Learning for Pattern Recognition with Deep Spiking
Neural Networks [80.15411508088522]
Spiking neural networks (SNNs) have shown advantages over traditional artificial neural networks (ANNs) for low latency and high computational efficiency.
We propose a novel ANN-to-SNN conversion and layer-wise learning framework for rapid and efficient pattern recognition.
arXiv Detail & Related papers (2020-07-02T15:38:44Z) - Rectified Linear Postsynaptic Potential Function for Backpropagation in
Deep Spiking Neural Networks [55.0627904986664]
Spiking Neural Networks (SNNs) use temporal spike patterns to represent and transmit information, which is not only biologically realistic but also suitable for ultra-low-power event-driven neuromorphic implementation.
This paper investigates the contribution of spike timing dynamics to information encoding, synaptic plasticity and decision making, providing a new perspective on the design of future deep SNNs and neuromorphic hardware systems.
arXiv Detail & Related papers (2020-03-26T11:13:07Z)
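The block-diagonal Jacobian idea mentioned in the Brain-Inspired Learning entry above can be illustrated briefly. The hedged NumPy sketch below uses a vanilla tanh RNN rather than an SNN: full RTRL would carry a Jacobian of the hidden state with respect to every weight, whereas keeping only each unit's influence on its own incoming weights shrinks the traces to the shape of the weight matrices. All sizes, names, and the squared-error loss are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)

# Illustrative sizes (assumptions, not from the paper).
n_in, n_h, T = 3, 5, 15
lr = 1e-2

Wx = rng.normal(scale=0.5, size=(n_h, n_in))  # input weights
Wh = rng.normal(scale=0.5, size=(n_h, n_h))   # recurrent weights

h = np.zeros(n_h)
# Block-diagonal traces: only d h_i / d W[i, :] is tracked per unit, so
# each trace has the shape of its weight matrix rather than the full
# RTRL Jacobian over all weights.
tr_x = np.zeros_like(Wx)
tr_h = np.zeros_like(Wh)

xs = rng.normal(size=(T, n_in))
targets = rng.normal(size=(T, n_h))

for t in range(T):
    h_prev = h
    a = Wx @ xs[t] + Wh @ h_prev
    h = np.tanh(a)
    d = 1.0 - h ** 2  # tanh'(a)

    # Forward trace update keeping only the block-diagonal of the
    # recurrent Jacobian: unit i's trace decays through its own
    # self-recurrence d_i * Wh[i, i] and adds the instantaneous input.
    decay = d * np.diag(Wh)
    tr_x = decay[:, None] * tr_x + d[:, None] * xs[t][None, :]
    tr_h = decay[:, None] * tr_h + d[:, None] * h_prev[None, :]

    # Instantaneous squared-error learning signal, applied online.
    L = h - targets[t]
    Wx -= lr * L[:, None] * tr_x
    Wh -= lr * L[:, None] * tr_h
```

Exact RTRL for this network would store a trace of size n_h x (n_h x (n_h + n_in)); the block-diagonal approximation stores just two weight-shaped traces, which is what makes per-step online updates affordable.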