KLIF: An optimized spiking neuron unit for tuning surrogate gradient
slope and membrane potential
- URL: http://arxiv.org/abs/2302.09238v1
- Date: Sat, 18 Feb 2023 05:18:18 GMT
- Title: KLIF: An optimized spiking neuron unit for tuning surrogate gradient
slope and membrane potential
- Authors: Chunming Jiang, Yilei Zhang
- Abstract summary: Spiking neural networks (SNNs) have attracted much attention due to their ability to process temporal information.
It is still challenging to develop efficient and high-performing learning algorithms for SNNs.
We propose a novel k-based leaky Integrate-and-Fire neuron model to improve the learning ability of SNNs.
- Score: 0.0
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Spiking neural networks (SNNs) have attracted much attention due to their
ability to process temporal information, low power consumption, and higher
biological plausibility. However, it is still challenging to develop efficient
and high-performing learning algorithms for SNNs. Methods like artificial
neural network (ANN)-to-SNN conversion can transform ANNs into SNNs with only
slight performance loss, but they require long simulation times to approximate
the rate coding. Directly training SNNs by spike-based backpropagation (BP),
such as surrogate gradient approximation, is more flexible. Even so, the
performance of SNNs is not yet competitive with that of ANNs. In this paper, we propose a novel
k-based leaky Integrate-and-Fire (KLIF) neuron model to improve the learning
ability of SNNs. Compared with the popular leaky integrate-and-fire (LIF)
model, KLIF adds a learnable scaling factor to dynamically update the slope and
width of the surrogate gradient curve during training and incorporates a ReLU
activation function that selectively delivers membrane potential to spike
firing and resetting. The proposed spiking unit is evaluated on the static
MNIST, Fashion-MNIST, and CIFAR-10 datasets, as well as on the neuromorphic
N-MNIST, CIFAR10-DVS, and DVS128-Gesture datasets. Experiments indicate that KLIF
performs much better than LIF without introducing additional computational cost
and achieves state-of-the-art performance on these datasets with few time
steps. KLIF is also believed to be more biologically plausible than LIF. Its
strong performance suggests that KLIF can fully replace LIF in SNNs for various
tasks.
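
As background for the rate-coding remark in the abstract, the short Python sketch below (an illustration under our own assumptions, not code from the paper) shows why ANN-to-SNN conversion needs many time steps: a normalised ANN activation is approximated by the firing rate of a Bernoulli spike train, and the approximation only tightens as the number of simulated steps T grows.

```python
import torch

def rate_code(activation: torch.Tensor, t_steps: int) -> torch.Tensor:
    """Encode a [0, 1]-normalised activation as a Bernoulli spike train and
    return the empirical firing rate, which approximates the activation."""
    p = activation.clamp(0.0, 1.0)
    spikes = (torch.rand(t_steps, *p.shape) < p).float()  # spike train over T steps
    return spikes.mean(dim=0)                              # firing-rate estimate

torch.manual_seed(0)
a = torch.rand(1000)                      # hypothetical activations in [0, 1]
for t in (10, 100, 1000):
    err = (rate_code(a, t) - a).abs().mean()
    print(f"T={t:4d}  mean |rate - activation| = {err:.3f}")
```

The printed error shrinks roughly as 1/sqrt(T), which is the "long simulation" cost that direct surrogate-gradient training avoids.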
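
The abstract describes the KLIF mechanism only at a high level, so the following PyTorch sketch is a hypothetical reading of it rather than the authors' implementation: a spike function whose surrogate-gradient slope and width are controlled by a learnable factor k, and a neuron that passes a ReLU-gated, k-scaled membrane potential on to firing and resetting. The concrete update and reset equations and the names and defaults used here (ScaledSurrogateSpike, KLIFNeuron, tau, v_threshold, k_init) are assumptions made for the sketch; the exact formulation is in the paper.

```python
import torch
import torch.nn as nn


class ScaledSurrogateSpike(torch.autograd.Function):
    """Heaviside spike in the forward pass; sigmoid-shaped surrogate
    gradient in the backward pass, with slope/width set by k (assumed form)."""

    @staticmethod
    def forward(ctx, v, k):
        ctx.save_for_backward(v, k)
        return (v >= 0.0).to(v)

    @staticmethod
    def backward(ctx, grad_output):
        v, k = ctx.saved_tensors
        s = torch.sigmoid(k * v)
        grad_v = grad_output * k * s * (1.0 - s)          # d/dv sigmoid(k*v)
        grad_k = (grad_output * v * s * (1.0 - s)).sum()  # d/dk sigmoid(k*v)
        return grad_v, grad_k


class KLIFNeuron(nn.Module):
    """Hypothetical KLIF-style unit: leaky integration, a learnable scaling
    factor k, and a ReLU so that only non-negative scaled potential reaches
    the firing and reset steps."""

    def __init__(self, tau: float = 2.0, v_threshold: float = 1.0,
                 k_init: float = 1.0):
        super().__init__()
        self.tau = tau
        self.v_threshold = v_threshold
        self.k = nn.Parameter(torch.tensor(k_init))  # learnable scaling factor

    def forward(self, x_seq: torch.Tensor) -> torch.Tensor:
        # x_seq: (T, batch, features) input current over T time steps
        v = torch.zeros_like(x_seq[0])
        spikes = []
        for x_t in x_seq:
            v = v + (x_t - v) / self.tau               # leaky integration
            u = torch.relu(self.k * v)                 # assumed ReLU gating of k*v
            spike = ScaledSurrogateSpike.apply(u - self.v_threshold, self.k)
            v = u * (1.0 - spike)                      # hard reset of fired units
            spikes.append(spike)
        return torch.stack(spikes)


if __name__ == "__main__":
    neuron = KLIFNeuron()
    x = torch.rand(8, 4, 16)        # 8 time steps, batch of 4, 16 features
    out = neuron(x)
    out.sum().backward()            # gradients also reach the learnable k
    print(out.shape, neuron.k.grad)
```

Because k enters both the membrane scaling and the surrogate gradient, training updates it and thereby reshapes the gradient curve, which is the behaviour the abstract attributes to KLIF.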
Related papers
- Towards Low-latency Event-based Visual Recognition with Hybrid Step-wise Distillation Spiking Neural Networks [50.32980443749865]
Spiking neural networks (SNNs) have garnered significant attention for their low power consumption and high biological plausibility.
Current SNNs struggle to balance accuracy and latency in neuromorphic datasets.
We propose a Hybrid Step-wise Distillation (HSD) method tailored for neuromorphic datasets.
arXiv Detail & Related papers (2024-09-19T06:52:34Z) - CLIF: Complementary Leaky Integrate-and-Fire Neuron for Spiking Neural Networks [5.587069105667678]
Spiking neural networks (SNNs) are promising brain-inspired energy-efficient models.
It remains a challenge to train SNNs due to their undifferentiable spiking mechanism.
We propose the Complementary Leaky Integrate-and-Fire (CLIF) neuron for building and training SNNs.
arXiv Detail & Related papers (2024-02-07T08:51:57Z) - High-performance deep spiking neural networks with 0.3 spikes per neuron [9.01407445068455]
It is harder to train biologically inspired spiking neural networks (SNNs) than artificial neural networks (ANNs).
We show that deep SNN models can be trained to achieve the same performance as ANNs.
Our SNN accomplishes high-performance classification with less than 0.3 spikes per neuron, lending itself to an energy-efficient implementation.
arXiv Detail & Related papers (2023-06-14T21:01:35Z) - Skip Connections in Spiking Neural Networks: An Analysis of Their Effect
on Network Training [0.8602553195689513]
Spiking neural networks (SNNs) have gained attention as a promising alternative to traditional artificial neural networks (ANNs)
In this paper, we study the impact of skip connections on SNNs and propose a hyperparameter optimization technique that adapts models from ANNs to SNNs.
We demonstrate that optimizing the position, type, and number of skip connections can significantly improve the accuracy and efficiency of SNNs.
arXiv Detail & Related papers (2023-03-23T07:57:32Z) - SPIDE: A Purely Spike-based Method for Training Feedback Spiking Neural
Networks [56.35403810762512]
Spiking neural networks (SNNs) with event-based computation are promising brain-inspired models for energy-efficient applications on neuromorphic hardware.
We study spike-based implicit differentiation on the equilibrium state (SPIDE), which extends a recently proposed training method.
arXiv Detail & Related papers (2023-02-01T04:22:59Z) - Training High-Performance Low-Latency Spiking Neural Networks by
Differentiation on Spike Representation [70.75043144299168]
Spiking Neural Network (SNN) is a promising energy-efficient AI model when implemented on neuromorphic hardware.
It is a challenge to efficiently train SNNs due to their non-differentiability.
We propose the Differentiation on Spike Representation (DSR) method, which achieves high performance.
arXiv Detail & Related papers (2022-05-01T12:44:49Z) - Training Feedback Spiking Neural Networks by Implicit Differentiation on
the Equilibrium State [66.2457134675891]
Spiking neural networks (SNNs) are brain-inspired models that enable energy-efficient implementation on neuromorphic hardware.
Most existing methods imitate the backpropagation framework and feedforward architectures for artificial neural networks.
We propose a novel training method that does not rely on the exact reverse of the forward computation.
arXiv Detail & Related papers (2021-09-29T07:46:54Z) - Spiking Neural Networks with Improved Inherent Recurrence Dynamics for
Sequential Learning [6.417011237981518]
Spiking neural networks (SNNs) with leaky integrate-and-fire (LIF) neurons can be operated in an event-driven manner.
We show that SNNs can be trained for sequential tasks and propose modifications to a network of LIF neurons.
We then develop a training scheme to train the proposed SNNs with improved inherent recurrence dynamics.
arXiv Detail & Related papers (2021-09-04T17:13:28Z) - Learning to Solve the AC-OPF using Sensitivity-Informed Deep Neural
Networks [52.32646357164739]
We propose a sensitivity-informed deep neural network (SIDNN) to solve the AC optimal power flow (AC-OPF) problem.
The proposed SIDNN is compatible with a broad range of OPF schemes.
It can be seamlessly integrated in other learning-to-OPF schemes.
arXiv Detail & Related papers (2021-03-27T00:45:23Z) - Progressive Tandem Learning for Pattern Recognition with Deep Spiking
Neural Networks [80.15411508088522]
Spiking neural networks (SNNs) have shown advantages over traditional artificial neural networks (ANNs) for low latency and high computational efficiency.
We propose a novel ANN-to-SNN conversion and layer-wise learning framework for rapid and efficient pattern recognition.
arXiv Detail & Related papers (2020-07-02T15:38:44Z)
This list is automatically generated from the titles and abstracts of the papers on this site.