Incorporating Learnable Membrane Time Constant to Enhance Learning of
Spiking Neural Networks
- URL: http://arxiv.org/abs/2007.05785v5
- Date: Tue, 17 Aug 2021 03:04:45 GMT
- Title: Incorporating Learnable Membrane Time Constant to Enhance Learning of
Spiking Neural Networks
- Authors: Wei Fang, Zhaofei Yu, Yanqi Chen, Timothee Masquelier, Tiejun Huang,
Yonghong Tian
- Abstract summary: Spiking Neural Networks (SNNs) have attracted enormous research interest due to their temporal information processing capability, low power consumption, and high biological plausibility.
Most existing learning methods learn weights only, and require manual tuning of the membrane-related parameters that determine the dynamics of a single spiking neuron.
In this paper, we take inspiration from the observation that membrane-related parameters are different across brain regions, and propose a training algorithm that is capable of learning not only the synaptic weights but also the membrane time constants of SNNs.
- Score: 36.16846259899793
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Spiking Neural Networks (SNNs) have attracted enormous research
interest due to their temporal information processing capability, low power
consumption, and high biological plausibility. However, the formulation of
efficient and
high-performance learning algorithms for SNNs is still challenging. Most
existing learning methods learn weights only, and require manual tuning of the
membrane-related parameters that determine the dynamics of a single spiking
neuron. These parameters are typically chosen to be the same for all neurons,
which limits the diversity of neurons and thus the expressiveness of the
resulting SNNs. In this paper, we take inspiration from the observation that
membrane-related parameters are different across brain regions, and propose a
training algorithm that is capable of learning not only the synaptic weights
but also the membrane time constants of SNNs. We show that incorporating
learnable membrane time constants can make the network less sensitive to
initial values and can speed up learning. In addition, we reevaluate the
pooling methods in SNNs and find that max-pooling does not lead to significant
information loss and has the advantages of low computation cost and binary
compatibility. We evaluate the proposed method on image classification tasks
with both static datasets (MNIST, Fashion-MNIST, CIFAR-10) and neuromorphic
datasets (N-MNIST, CIFAR10-DVS, DVS128 Gesture). The experimental results show
that the proposed method surpasses state-of-the-art accuracy on nearly all
datasets while using fewer time-steps. Our code is available at
https://github.com/fangwei123456/Parametric-Leaky-Integrate-and-Fire-Spiking-Neuron.
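As a minimal, illustrative sketch of the core idea (not the authors' implementation; see the repository above for that), the membrane time constant tau can be made learnable by reparameterizing it as 1/tau = sigmoid(w), which keeps tau > 1 and lets w be optimized by gradient descent alongside the synaptic weights, with spikes backpropagated through a sigmoid surrogate gradient. The class names, the shared scalar w, and the surrogate slope alpha below are assumptions for illustration:

    import math

    import torch
    import torch.nn as nn

    class SurrogateSpike(torch.autograd.Function):
        # Heaviside step in the forward pass; sigmoid surrogate gradient in
        # the backward pass, so spiking stays trainable end-to-end.
        @staticmethod
        def forward(ctx, x, alpha=4.0):
            ctx.save_for_backward(x)
            ctx.alpha = alpha
            return (x >= 0).to(x)

        @staticmethod
        def backward(ctx, grad_out):
            (x,) = ctx.saved_tensors
            sg = torch.sigmoid(ctx.alpha * x)
            return grad_out * ctx.alpha * sg * (1.0 - sg), None

    class PLIFLayer(nn.Module):
        # LIF layer whose membrane time constant is learned jointly with the
        # weights; all neurons in the layer share one raw parameter w.
        def __init__(self, init_tau=2.0, v_threshold=1.0, v_reset=0.0):
            super().__init__()
            # invert 1/tau = sigmoid(w) to initialize w from the chosen tau
            self.w = nn.Parameter(torch.tensor(-math.log(init_tau - 1.0)))
            self.v_threshold = v_threshold
            self.v_reset = v_reset

        def forward(self, x_seq):
            # x_seq: [T, batch, ...] input currents over T time-steps
            v = torch.full_like(x_seq[0], self.v_reset)
            spikes = []
            for x in x_seq:
                # membrane update: V <- V + (1/tau) * (X - (V - V_reset))
                v = v + torch.sigmoid(self.w) * (x - (v - self.v_reset))
                s = SurrogateSpike.apply(v - self.v_threshold)
                v = v * (1.0 - s) + self.v_reset * s  # hard reset on spike
                spikes.append(s)
            return torch.stack(spikes)

    plif = PLIFLayer(init_tau=2.0)
    x = torch.rand(8, 4, 10)  # T=8 time-steps, batch of 4, 10 neurons
    out = plif(x)             # binary spike trains, same shape as x

Because the layer's outputs are binary spike maps, max-pooling reduces to a logical OR over each pooling window, which is the low-cost, binary-compatible behavior the abstract refers to:

    import torch.nn.functional as F

    spike_map = (torch.rand(1, 1, 4, 4) > 0.7).float()  # a {0, 1} spike map
    pooled = F.max_pool2d(spike_map, kernel_size=2)     # 1 iff any spike in a 2x2 window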
Related papers
- CLIF: Complementary Leaky Integrate-and-Fire Neuron for Spiking Neural Networks [5.587069105667678]
Spiking neural networks (SNNs) are promising brain-inspired energy-efficient models.
Training SNNs remains challenging due to their non-differentiable spiking mechanism.
We propose the Complementary Leaky Integrate-and-Fire (CLIF) neuron for building SNNs.
arXiv Detail & Related papers (2024-02-07T08:51:57Z)
- A Hybrid Neural Coding Approach for Pattern Recognition with Spiking Neural Networks [53.31941519245432]
Brain-inspired spiking neural networks (SNNs) have demonstrated promising capabilities in solving pattern recognition tasks.
These SNNs are grounded on homogeneous neurons that utilize a uniform neural coding for information representation.
In this study, we argue that SNN architectures should be holistically designed to incorporate heterogeneous coding schemes.
arXiv Detail & Related papers (2023-05-26T02:52:12Z)
- MSAT: Biologically Inspired Multi-Stage Adaptive Threshold for Conversion of Spiking Neural Networks [11.392893261073594]
Spiking Neural Networks (SNNs) can perform inference with low power consumption due to their spike sparsity.
ANN-SNN conversion is an efficient way to obtain deep SNNs by converting well-trained Artificial Neural Networks (ANNs).
Existing methods commonly use a constant threshold for conversion, which prevents neurons from rapidly delivering spikes to deeper layers.
arXiv Detail & Related papers (2023-03-23T07:18:08Z)
- Event-based Video Reconstruction via Potential-assisted Spiking Neural Network [48.88510552931186]
Bio-inspired neural networks can potentially lead to greater computational efficiency on event-driven hardware.
We propose a novel event-based video reconstruction framework based on a fully spiking neural network (EVSNN).
We find that the spiking neurons have the potential to store useful temporal information (memory) to complete such time-dependent tasks.
arXiv Detail & Related papers (2022-01-25T02:05:20Z)
- N-Omniglot: a Large-scale Neuromorphic Dataset for Spatio-Temporal Sparse Few-shot Learning [10.812738608234321]
We provide N-Omniglot, the first neuromorphic dataset for few-shot learning, acquired with a Dynamic Vision Sensor (DVS).
It contains 1623 categories of handwritten characters, with only 20 samples per class.
The dataset provides a powerful challenge and a suitable benchmark for developing SNN algorithms in the few-shot learning domain.
arXiv Detail & Related papers (2021-12-25T12:41:34Z)
- Training Feedback Spiking Neural Networks by Implicit Differentiation on the Equilibrium State [66.2457134675891]
Spiking neural networks (SNNs) are brain-inspired models that enable energy-efficient implementation on neuromorphic hardware.
Most existing methods imitate the backpropagation framework and feedforward architectures of artificial neural networks.
We propose a novel training method that does not rely on the exact reverse of the forward computation.
arXiv Detail & Related papers (2021-09-29T07:46:54Z)
- Dynamic Neural Diversification: Path to Computationally Sustainable Neural Networks [68.8204255655161]
Small neural networks with a constrained number of trainable parameters can be suitable resource-efficient candidates for many simple tasks.
We explore the diversity of the neurons within the hidden layer during the learning process.
We analyze how the diversity of the neurons affects predictions of the model.
arXiv Detail & Related papers (2021-09-20T15:12:16Z)
- Spiking Neural Networks with Improved Inherent Recurrence Dynamics for Sequential Learning [6.417011237981518]
Spiking neural networks (SNNs) with leaky integrate-and-fire (LIF) neurons can be operated in an event-driven manner.
We show that SNNs can be trained for sequential tasks and propose modifications to a network of LIF neurons.
We then develop a training scheme to train the proposed SNNs with improved inherent recurrence dynamics.
arXiv Detail & Related papers (2021-09-04T17:13:28Z)
- Progressive Tandem Learning for Pattern Recognition with Deep Spiking Neural Networks [80.15411508088522]
Spiking neural networks (SNNs) have shown advantages over traditional artificial neural networks (ANNs) for low latency and high computational efficiency.
We propose a novel ANN-to-SNN conversion and layer-wise learning framework for rapid and efficient pattern recognition.
arXiv Detail & Related papers (2020-07-02T15:38:44Z)
- Rectified Linear Postsynaptic Potential Function for Backpropagation in Deep Spiking Neural Networks [55.0627904986664]
Spiking Neural Networks (SNNs) use temporal spike patterns to represent and transmit information, which is not only biologically realistic but also suitable for ultra-low-power event-driven neuromorphic implementation.
This paper investigates the contribution of spike timing dynamics to information encoding, synaptic plasticity, and decision making, providing a new perspective on the design of future deep SNNs and neuromorphic hardware systems.
arXiv Detail & Related papers (2020-03-26T11:13:07Z)
- Exploiting Neuron and Synapse Filter Dynamics in Spatial Temporal Learning of Deep Spiking Neural Network [7.503685643036081]
A bio-plausible SNN model with spatio-temporal properties is a complex dynamical system.
We formulate the SNN as a network of infinite impulse response (IIR) filters with neuron nonlinearity.
We propose a training algorithm that is capable of learning spatio-temporal patterns by searching for the optimal synapse filter kernels and weights (a simplified sketch follows this list).
arXiv Detail & Related papers (2020-02-19T01:27:39Z)
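The IIR formulation above can be made concrete with a minimal sketch, assuming a first-order synapse filter y[t] = a*y[t-1] + b*x[t] feeding a hard-reset spiking neuron; the function name and the coefficients a and b are illustrative assumptions, not the paper's notation. Learning the filter coefficients alongside the weights corresponds to the search over "synapse filter kernels and weights" described in the summary:

    import torch

    def iir_lif_step(x_t, y_prev, v_prev, a, b, v_th=1.0):
        # synapse as a learnable first-order IIR filter: y[t] = a*y[t-1] + b*x[t]
        y_t = a * y_prev + b * x_t
        v_t = v_prev + y_t               # membrane integrates the filtered input
        s_t = (v_t >= v_th).to(v_t)      # spike when the threshold is crossed
        v_t = v_t * (1.0 - s_t)          # hard reset after a spike
        return y_t, v_t, s_t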