Training Feedback Spiking Neural Networks by Implicit Differentiation on the Equilibrium State
- URL: http://arxiv.org/abs/2109.14247v1
- Date: Wed, 29 Sep 2021 07:46:54 GMT
- Title: Training Feedback Spiking Neural Networks by Implicit Differentiation on the Equilibrium State
- Authors: Mingqing Xiao, Qingyan Meng, Zongpeng Zhang, Yisen Wang, Zhouchen Lin
- Abstract summary: Spiking neural networks (SNNs) are brain-inspired models that enable energy-efficient implementation on neuromorphic hardware.
Most existing methods imitate the backpropagation framework and feedforward architectures for artificial neural networks.
We propose a novel training method that does not rely on the exact reverse of the forward computation.
- Score: 66.2457134675891
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Spiking neural networks (SNNs) are brain-inspired models that enable
energy-efficient implementation on neuromorphic hardware. However, the
supervised training of SNNs remains a hard problem due to the discontinuity of
the spiking neuron model. Most existing methods imitate the backpropagation
framework and feedforward architectures for artificial neural networks, and use
surrogate derivatives or compute gradients with respect to the spiking time to
deal with the problem. These approaches either accumulate approximation errors
or propagate only limited information through existing spikes, and usually
require propagating information along time steps, which incurs large memory
costs and is biologically implausible. In this work, we consider feedback spiking neural
networks, which are more brain-like, and propose a novel training method that
does not rely on the exact reverse of the forward computation. First, we show
that the average firing rates of SNNs with feedback connections would gradually
evolve to an equilibrium state along time, which follows a fixed-point
equation. Then by viewing the forward computation of feedback SNNs as a
black-box solver for this equation, and leveraging the implicit differentiation
on the equation, we can compute the gradient for parameters without considering
the exact forward procedure. In this way, the forward and backward procedures
are decoupled and therefore the problem of non-differentiable spiking functions
is avoided. We also briefly discuss the biological plausibility of implicit
differentiation, which only requires computing another equilibrium. Extensive
experiments on MNIST, Fashion-MNIST, N-MNIST, CIFAR-10, and CIFAR-100
demonstrate the superior performance of our method for feedback models with
fewer neurons and parameters in a small number of time steps. Our code is
available at https://github.com/pkuxmq/IDE-FSNN.
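The core trick can be illustrated with a short, self-contained PyTorch sketch. Everything below is illustrative rather than the authors' implementation (see the linked repository for that): a toy sigmoid update map f stands in for the average-firing-rate dynamics, the forward pass runs it as a black-box fixed-point solver, and the backward pass recovers parameter gradients from the equilibrium condition a* = f(a*, x) alone, via a Neumann-series solve of the implicit-function-theorem linear system, never differentiating through the forward iterations.

```python
import torch

torch.manual_seed(0)
batch, dim = 4, 16
W = (0.1 * torch.randn(dim, dim)).requires_grad_(True)  # feedback weights, scaled toward contraction
U = torch.randn(dim, dim).requires_grad_(True)          # input weights
x = torch.randn(batch, dim)                             # static input

def f(a, x):
    """Toy stand-in for the average-firing-rate update map."""
    return torch.sigmoid(a @ W.T + x @ U.T)

# Forward: treat the network as a black-box solver for the fixed point a* = f(a*, x).
with torch.no_grad():
    a = torch.zeros(batch, dim)
    for _ in range(100):
        a = f(a, x)

# Backward: implicit differentiation. Re-attach one update step at the equilibrium,
# then solve v = g + v J (J = df/da at a*) by Neumann iteration, so v = g (I - J)^{-1}
# without backpropagating through the 100 forward iterations.
a0 = a.detach().requires_grad_(True)
a_star = f(a0, x)
loss = a_star.pow(2).mean()            # placeholder loss on the equilibrium state

g = torch.autograd.grad(loss, a_star, retain_graph=True)[0]   # dL/da*
v = g.clone()
for _ in range(50):
    v = torch.autograd.grad(a_star, a0, grad_outputs=v, retain_graph=True)[0] + g

# Push v through df/dtheta to obtain the parameter gradients.
grad_W, grad_U = torch.autograd.grad(a_star, (W, U), grad_outputs=v)
```

Because the backward pass only needs the equilibrium point and vector-Jacobian products of a single update step, the forward procedure can be any solver, which is what decouples the two directions and sidesteps the non-differentiable spiking function.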
Related papers
- Temporal Spiking Neural Networks with Synaptic Delay for Graph Reasoning [91.29876772547348]
Spiking neural networks (SNNs) are investigated as biologically inspired models of neural computation.
This paper reveals that SNNs, when amalgamated with synaptic delay and temporal coding, are proficient in executing (knowledge) graph reasoning.
arXiv Detail & Related papers (2024-05-27T05:53:30Z)
- Energy Efficient Training of SNN using Local Zeroth Order Method [18.81001891391638]
Spiking neural networks are becoming increasingly popular for their low energy requirement in real-world tasks.
SNN training algorithms face the loss of gradient information and non-differentiability due to the Heaviside function.
We propose a differentiable approximation of the Heaviside in the backward pass, while the forward pass uses the Heaviside as the spiking function (a minimal surrogate-gradient sketch follows this entry).
arXiv Detail & Related papers (2023-02-02T06:57:37Z)
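The Heaviside-forward / smooth-backward idea is commonly implemented as a custom autograd function. The PyTorch sketch below is generic: the sigmoid-derivative surrogate and the `slope` constant are illustrative choices, not necessarily the approximation used in that paper.

```python
import torch

class SurrogateSpike(torch.autograd.Function):
    """Exact Heaviside step forward; smooth sigmoid-derivative surrogate backward."""

    slope = 10.0  # surrogate sharpness (illustrative value)

    @staticmethod
    def forward(ctx, membrane):
        ctx.save_for_backward(membrane)
        return (membrane > 0).float()      # non-differentiable spike emission

    @staticmethod
    def backward(ctx, grad_output):
        (membrane,) = ctx.saved_tensors
        sig = torch.sigmoid(SurrogateSpike.slope * membrane)
        # derivative of sigmoid(slope * u) stands in for the Dirac delta
        return grad_output * SurrogateSpike.slope * sig * (1.0 - sig)

spike = SurrogateSpike.apply

u = torch.randn(8, requires_grad=True)     # membrane potential minus threshold
spike(u).sum().backward()
print(u.grad)                              # nonzero despite the step-function forward
```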
- SPIDE: A Purely Spike-based Method for Training Feedback Spiking Neural Networks [56.35403810762512]
Spiking neural networks (SNNs) with event-based computation are promising brain-inspired models for energy-efficient applications on neuromorphic hardware.
We study spike-based implicit differentiation on the equilibrium state (SPIDE), which extends the recently proposed training method.
arXiv Detail & Related papers (2023-02-01T04:22:59Z)
- Intelligence Processing Units Accelerate Neuromorphic Learning [52.952192990802345]
Spiking neural networks (SNNs) have achieved orders of magnitude improvement in terms of energy consumption and latency.
We present an IPU-optimized release of our custom SNN Python package, snnTorch (a minimal snnTorch usage sketch follows this entry).
arXiv Detail & Related papers (2022-11-19T15:44:08Z)
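For orientation, a minimal snnTorch loop looks roughly like the following; the layer sizes, `beta` decay, and dummy input are arbitrary illustration, and the IPU-specific release may differ in setup details, so consult the snnTorch documentation.

```python
import torch
import torch.nn as nn
import snntorch as snn

fc = nn.Linear(784, 10)
lif = snn.Leaky(beta=0.9)              # leaky integrate-and-fire layer, decay beta

x = torch.rand(25, 32, 784)            # (time steps, batch, features) dummy input
mem = lif.init_leaky()                 # initialize the membrane potential state
spk_rec = []
for step in range(x.size(0)):          # unroll over time like an RNN
    cur = fc(x[step])                  # synaptic current from the linear layer
    spk, mem = lif(cur, mem)           # returns output spikes and updated membrane
    spk_rec.append(spk)
out = torch.stack(spk_rec)             # (time steps, batch, classes)
```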
- Desire Backpropagation: A Lightweight Training Algorithm for Multi-Layer Spiking Neural Networks based on Spike-Timing-Dependent Plasticity [13.384228628766236]
Spiking neural networks (SNNs) are a viable alternative to conventional artificial neural networks.
We present desire backpropagation, a method to derive the desired spike activity of all neurons, including the hidden ones.
We trained three-layer networks to classify MNIST and Fashion-MNIST images and reached an accuracy of 98.41% and 87.56%, respectively.
arXiv Detail & Related papers (2022-11-10T08:32:13Z)
- Training High-Performance Low-Latency Spiking Neural Networks by Differentiation on Spike Representation [70.75043144299168]
Spiking Neural Network (SNN) is a promising energy-efficient AI model when implemented on neuromorphic hardware.
It is a challenge to efficiently train SNNs due to their non-differentiability.
We propose the Differentiation on Spike Representation (DSR) method, which can achieve high performance.
arXiv Detail & Related papers (2022-05-01T12:44:49Z)
- Self-Supervised Learning of Event-Based Optical Flow with Spiking Neural Networks [3.7384509727711923]
A major challenge for neuromorphic computing is that learning algorithms for traditional artificial neural networks (ANNs) do not transfer directly to spiking neural networks (SNNs).
In this article, we focus on the self-supervised learning problem of optical flow estimation from event-based camera inputs.
We show that the performance of the proposed ANNs and SNNs are on par with that of the current state-of-the-art ANNs trained in a self-supervised manner.
arXiv Detail & Related papers (2021-06-03T14:03:41Z)
- Spiking Neural Networks -- Part II: Detecting Spatio-Temporal Patterns [38.518936229794214]
Spiking Neural Networks (SNNs) have the unique ability to detect information encoded in temporal signals.
We review models and training algorithms for the dominant approach that considers SNNs as a Recurrent Neural Network (RNN); a minimal discrete-time sketch of this view follows this entry.
We describe an alternative approach that relies on probabilistic models for spiking neurons, allowing the derivation of local learning rules via gradient estimates.
arXiv Detail & Related papers (2020-10-27T11:47:42Z)
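The "SNN as RNN" view amounts to unrolling a leaky integrate-and-fire update over discrete time steps, with the membrane potential playing the role of the recurrent hidden state. The plain-PyTorch sketch below is illustrative; the decay, threshold, sizes, and random inputs are arbitrary.

```python
import torch

# A leaky integrate-and-fire neuron written as an RNN cell and unrolled in time.
lam, theta = 0.8, 1.0                  # membrane decay and firing threshold
W = 0.5 * torch.randn(10, 10)          # input weights

def lif_step(u, x):
    """One RNN-style step: leak, integrate input current, spike, hard-reset."""
    u = lam * u + x @ W.T              # leaky integration of the input current
    s = (u >= theta).float()           # Heaviside spike generation
    u = u * (1.0 - s)                  # reset the membrane where a spike fired
    return u, s

u = torch.zeros(4, 10)                 # membrane potential (batch, neurons)
for t in range(25):                    # u is carried across steps like an RNN state
    x = torch.rand(4, 10)              # dummy input at step t
    u, s = lif_step(u, x)
```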
- Progressive Tandem Learning for Pattern Recognition with Deep Spiking Neural Networks [80.15411508088522]
Spiking neural networks (SNNs) have shown advantages over traditional artificial neural networks (ANNs) for low latency and high computational efficiency.
We propose a novel ANN-to-SNN conversion and layer-wise learning framework for rapid and efficient pattern recognition (a schematic conversion sketch follows this entry).
arXiv Detail & Related papers (2020-07-02T15:38:44Z)
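Rate-based ANN-to-SNN conversion generally reuses the trained ANN weights, replaces each ReLU with integrate-and-fire dynamics, and balances the firing threshold against activations observed on calibration data. The sketch below shows that generic recipe only; it is not the tandem-learning procedure of this paper, and every name and constant in it is illustrative.

```python
import torch
import torch.nn as nn

# Schematic rate-based ANN-to-SNN conversion for a two-layer network.
ann = nn.Sequential(nn.Linear(784, 100), nn.ReLU(), nn.Linear(100, 10))

@torch.no_grad()
def balanced_threshold(calib_x):
    """Threshold balancing: peak pre-activation of the hidden layer."""
    return ann[0](calib_x).max().item()

theta = balanced_threshold(torch.rand(256, 784))

@torch.no_grad()
def snn_forward(x, steps=50):
    u = torch.zeros(x.size(0), 100)            # hidden membrane potential
    acc = torch.zeros(x.size(0), 10)           # accumulated output
    for _ in range(steps):
        u += ann[0](x)                         # integrate the constant-coded input
        s = (u >= theta).float()               # spike when the threshold is crossed
        u -= theta * s                         # soft reset: subtract the threshold
        acc += ann[2](s * theta)               # next layer sees threshold-scaled spikes
    return acc / steps                         # firing-rate estimate of the ANN output

print(snn_forward(torch.rand(8, 784)).shape)   # torch.Size([8, 10])
```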
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the generated summaries (including all information) and is not responsible for any consequences.