Accelerating spiking neural network training
- URL: http://arxiv.org/abs/2205.15286v1
- Date: Mon, 30 May 2022 17:48:14 GMT
- Title: Accelerating spiking neural network training
- Authors: Luke Taylor, Andrew King, Nicol Harper
- Abstract summary: Spiking neural networks (SNN) are a type of artificial network inspired by the use of action potentials in the brain.
We propose a new technique for directly training single-spike-per-neuron SNNs which eliminates all sequential computation and relies exclusively on vectorised operations.
Our proposed solution manages to solve certain tasks with over a $95.68 \%$ reduction in spike counts relative to a conventionally trained SNN, which could significantly reduce energy requirements when deployed on neuromorphic computers.
- Score: 1.6114012813668934
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Spiking neural networks (SNN) are a type of artificial network inspired by
the use of action potentials in the brain. There is a growing interest in
emulating these networks on neuromorphic computers due to their improved energy
consumption and speed, which are the main scaling issues of their counterpart
the artificial neural network (ANN). Significant progress has been made in
directly training SNNs to perform on par with ANNs in terms of accuracy. These
methods are however slow due to their sequential nature, leading to long
training times. We propose a new technique for directly training
single-spike-per-neuron SNNs which eliminates all sequential computation and
relies exclusively on vectorised operations. We demonstrate over a $\times 10$
speedup in training with robust classification performance on real datasets of
low to medium spatio-temporal complexity (Fashion-MNIST and Neuromorphic-MNIST).
Our proposed solution manages to solve certain tasks with over a $95.68 \%$
reduction in spike counts relative to a conventionally trained SNN, which could
significantly reduce energy requirements when deployed on neuromorphic
computers.
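The core idea in the abstract, removing all sequential computation when each neuron spikes at most once, can be sketched as follows. This is a hypothetical illustration under simplified assumptions, not the paper's exact algorithm: a standard leaky integrate-and-fire (LIF) simulation is an inherently sequential loop, but before a neuron's first spike no reset occurs, so the membrane trace has a closed form (a discounted cumulative sum) and the first threshold crossing can be found with purely vectorised operations. All function names and parameter values here are illustrative.

```python
import numpy as np

def lif_sequential(inputs, beta=0.9, threshold=1.0):
    """Conventional LIF simulation: O(T) dependent time steps.

    Each step reads the membrane potential left by the previous
    step, so the loop cannot be parallelised across time.
    """
    v = 0.0
    spikes = []
    for i in inputs:
        v = beta * v + i              # leaky integration
        s = float(v >= threshold)     # fire when threshold is crossed
        spikes.append(s)
        v = v * (1.0 - s)             # reset after a spike
    return np.array(spikes)

def first_spike_vectorised(inputs, beta=0.9, threshold=1.0):
    """Vectorised first-spike time for a single-spike neuron.

    Without a reset, v[t] = sum_{k<=t} beta**(t-k) * inputs[k],
    a discounted cumulative sum computable with array ops alone.
    Since no reset happens before the first spike, the first
    threshold crossing matches the sequential simulation.
    """
    t = np.arange(len(inputs))
    v = beta**t * np.cumsum(inputs / beta**t)  # closed-form membrane trace
    crossed = v >= threshold
    return int(np.argmax(crossed)) if crossed.any() else None
```

For a constant input drive both routines agree on the first spike time, while the vectorised version replaces the time loop with a `cumsum` and an `argmax`, which is the kind of operation that maps well onto GPU hardware.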
Related papers
- Fully Spiking Actor Network with Intra-layer Connections for
Reinforcement Learning [51.386945803485084]
We focus on the task where the agent needs to learn multi-dimensional deterministic policies to control.
Most existing spike-based RL methods take the firing rate as the output of SNNs, and convert it to represent continuous action space (i.e., the deterministic policy) through a fully-connected layer.
To develop a fully spiking actor network without any floating-point matrix operations, we draw inspiration from the non-spiking interneurons found in insects.
arXiv Detail & Related papers (2024-01-09T07:31:34Z) - Accelerating SNN Training with Stochastic Parallelizable Spiking Neurons [1.7056768055368383]
Spiking neural networks (SNN) are able to learn features while using less energy, especially on neuromorphic hardware.
The most widely used spiking neuron in deep learning is the Leaky Integrate-and-Fire (LIF) neuron.
arXiv Detail & Related papers (2023-06-22T04:25:27Z) - SPIDE: A Purely Spike-based Method for Training Feedback Spiking Neural
Networks [56.35403810762512]
Spiking neural networks (SNNs) with event-based computation are promising brain-inspired models for energy-efficient applications on neuromorphic hardware.
We study spike-based implicit differentiation on the equilibrium state (SPIDE) that extends the recently proposed training method.
arXiv Detail & Related papers (2023-02-01T04:22:59Z) - Intelligence Processing Units Accelerate Neuromorphic Learning [52.952192990802345]
Spiking neural networks (SNNs) have achieved orders-of-magnitude improvements in energy consumption and latency.
We present an IPU-optimized release of our custom SNN Python package, snnTorch.
arXiv Detail & Related papers (2022-11-19T15:44:08Z) - Training High-Performance Low-Latency Spiking Neural Networks by
Differentiation on Spike Representation [70.75043144299168]
Spiking Neural Network (SNN) is a promising energy-efficient AI model when implemented on neuromorphic hardware.
It is a challenge to efficiently train SNNs due to their non-differentiability.
We propose the Differentiation on Spike Representation (DSR) method, which could achieve high performance.
arXiv Detail & Related papers (2022-05-01T12:44:49Z) - Self-Supervised Learning of Event-Based Optical Flow with Spiking Neural
Networks [3.7384509727711923]
A major challenge for neuromorphic computing is that learning algorithms for traditional artificial neural networks (ANNs) do not transfer directly to spiking neural networks (SNNs).
In this article, we focus on the self-supervised learning problem of optical flow estimation from event-based camera inputs.
We show that the performance of the proposed ANNs and SNNs are on par with that of the current state-of-the-art ANNs trained in a self-supervised manner.
arXiv Detail & Related papers (2021-06-03T14:03:41Z) - Sparse Spiking Gradient Descent [2.741266294612776]
We present the first sparse SNN backpropagation algorithm which achieves the same or better accuracy as current state-of-the-art methods.
We show the effectiveness of our method on real datasets of varying complexity.
arXiv Detail & Related papers (2021-05-18T20:00:55Z) - Combining Spiking Neural Network and Artificial Neural Network for
Enhanced Image Classification [1.8411688477000185]
Spiking neural networks (SNNs) that more closely resemble biological brain synapses have attracted attention owing to their low power consumption.
We build versatile hybrid neural networks (HNNs) that improve the concerned performance.
arXiv Detail & Related papers (2021-02-21T12:03:16Z) - Progressive Tandem Learning for Pattern Recognition with Deep Spiking
Neural Networks [80.15411508088522]
Spiking neural networks (SNNs) have shown advantages over traditional artificial neural networks (ANNs) for low latency and high computational efficiency.
We propose a novel ANN-to-SNN conversion and layer-wise learning framework for rapid and efficient pattern recognition.
arXiv Detail & Related papers (2020-07-02T15:38:44Z) - You Only Spike Once: Improving Energy-Efficient Neuromorphic Inference
to ANN-Level Accuracy [51.861168222799186]
Spiking Neural Networks (SNNs) are a type of neuromorphic, or brain-inspired network.
SNNs are sparse, accessing very few weights, and typically only use addition operations instead of the more power-intensive multiply-and-accumulate operations.
In this work, we aim to overcome the limitations of TTFS-encoded neuromorphic systems.
arXiv Detail & Related papers (2020-06-03T15:55:53Z) - Effective and Efficient Computation with Multiple-timescale Spiking
Recurrent Neural Networks [0.9790524827475205]
We show how a novel type of adaptive spiking recurrent neural network (SRNN) is able to achieve state-of-the-art performance.
We calculate a $>$100x energy improvement for our SRNNs over classical RNNs on the harder tasks.
arXiv Detail & Related papers (2020-05-24T01:04:53Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.