Sparse-firing regularization methods for spiking neural networks with
time-to-first spike coding
- URL: http://arxiv.org/abs/2307.13007v1
- Date: Mon, 24 Jul 2023 11:55:49 GMT
- Title: Sparse-firing regularization methods for spiking neural networks with
time-to-first spike coding
- Authors: Yusuke Sakemi, Kakei Yamamoto, Takeo Hosomi, Kazuyuki Aihara
- Abstract summary: We propose two spike timing-based sparse-firing (SSR) regularization methods to further reduce the firing frequency of TTFS-coded neural networks.
The effects of these regularization methods were investigated on the MNIST, Fashion-MNIST, and CIFAR-10 datasets.
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: The training of multilayer spiking neural networks (SNNs) using the error
backpropagation algorithm has made significant progress in recent years. Among
the various training schemes, the error backpropagation method that directly
uses the firing time of neurons has attracted considerable attention because it
can realize ideal temporal coding. This method uses time-to-first spike (TTFS)
coding, in which each neuron fires at most once, and this restriction on the
number of firings enables information to be processed at a very low firing
frequency. This low firing frequency increases the energy efficiency of
information processing in SNNs, which is important not only because of its
similarity with information processing in the brain, but also from an
engineering point of view. However, TTFS coding only imposes an upper limit on
the number of firings, and the information-processing capability of SNNs at
lower firing frequencies has not been fully investigated. In this paper, we propose
two spike timing-based sparse-firing (SSR) regularization methods to further
reduce the firing frequency of TTFS-coded SNNs. The first is the membrane
potential-aware SSR (M-SSR) method, which has been derived as an extreme form
of the loss function of the membrane potential value. The second is the firing
condition-aware SSR (F-SSR) method, which is a regularization function obtained
from the firing conditions. Both methods are characterized by the fact that
they only require information about the firing timing and associated weights.
The effects of these regularization methods were investigated on the MNIST,
Fashion-MNIST, and CIFAR-10 datasets using multilayer perceptron networks and
convolutional neural network structures.
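As background for the abstract above, the following is a minimal, illustrative sketch of TTFS coding and a timing-based sparsity penalty. It is not the paper's exact formulation: the linear-membrane model, the function names, and the reference time `t_ref` are all assumptions for illustration, whereas the paper's M-SSR and F-SSR losses are derived from the membrane potential value and the firing condition, respectively.

```python
import numpy as np

def first_spike_time(weights, in_times, threshold=1.0):
    """First firing time of a linear integrate-and-fire TTFS neuron.

    After an input spike at time t_i with weight w_i, the membrane
    potential grows as w_i * (t - t_i); the neuron fires at most once,
    when the summed potential first reaches `threshold`.
    """
    order = np.argsort(in_times)
    w_sum, wt_sum = 0.0, 0.0
    for k, idx in enumerate(order):
        w_sum += weights[idx]
        wt_sum += weights[idx] * in_times[idx]
        if w_sum <= 0.0:
            continue  # potential not rising; threshold unreachable so far
        # Solve w_sum * t - wt_sum = threshold for the candidate time.
        t_f = (threshold + wt_sum) / w_sum
        t_next = in_times[order[k + 1]] if k + 1 < len(order) else np.inf
        if in_times[idx] <= t_f <= t_next:
            return t_f  # fires before the next input spike arrives
    return np.inf  # neuron never fires: the sparse, energy-saving case

def sparse_firing_penalty(firing_times, t_ref):
    """Illustrative timing-based sparsity penalty (not the exact M-SSR/F-SSR).

    Penalizes neurons that fire earlier than a reference time t_ref, so
    minimizing it alongside the task loss drives hidden neurons to fire
    later or not at all, lowering the network's firing frequency.
    """
    t = np.asarray([tf for tf in firing_times if np.isfinite(tf)])
    return float(np.sum(np.maximum(t_ref - t, 0.0)))
```

For example, `first_spike_time([0.5, 0.5], [0.0, 1.0])` returns 1.5: the first input alone cannot reach the threshold, so the neuron fires only after the second input arrives. Note that only firing timings and the associated weights enter these computations, matching the property the abstract highlights for both SSR methods.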
Related papers
- ETTFS: An Efficient Training Framework for Time-to-First-Spike Neuron [38.194529226257735]
Time-to-First-Spike (TTFS) coding, where neurons fire only once during inference, offers the benefits of reduced spike counts, enhanced energy efficiency, and faster processing.
This paper presents an efficient training framework for TTFS that not only improves accuracy but also accelerates the training process.
arXiv Detail & Related papers (2024-10-31T04:14:47Z)
- Deep Multi-Threshold Spiking-UNet for Image Processing [51.88730892920031]
This paper introduces the novel concept of Spiking-UNet for image processing, which combines the power of Spiking Neural Networks (SNNs) with the U-Net architecture.
To achieve an efficient Spiking-UNet, we face two primary challenges: ensuring high-fidelity information propagation through the network via spikes and formulating an effective training strategy.
Experimental results show that, on image segmentation and denoising, our Spiking-UNet achieves comparable performance to its non-spiking counterpart.
arXiv Detail & Related papers (2023-07-20T16:00:19Z)
- SPIDE: A Purely Spike-based Method for Training Feedback Spiking Neural Networks [56.35403810762512]
Spiking neural networks (SNNs) with event-based computation are promising brain-inspired models for energy-efficient applications on neuromorphic hardware.
We study spike-based implicit differentiation on the equilibrium state (SPIDE) that extends the recently proposed training method.
arXiv Detail & Related papers (2023-02-01T04:22:59Z)
- Timing-Based Backpropagation in Spiking Neural Networks Without Single-Spike Restrictions [2.8360662552057323]
We propose a novel backpropagation algorithm for training spiking neural networks (SNNs).
It encodes information in the relative multiple spike timing of individual neurons without single-spike restrictions.
arXiv Detail & Related papers (2022-11-29T11:38:33Z)
- Training High-Performance Low-Latency Spiking Neural Networks by Differentiation on Spike Representation [70.75043144299168]
Spiking Neural Network (SNN) is a promising energy-efficient AI model when implemented on neuromorphic hardware.
It is a challenge to efficiently train SNNs due to their non-differentiability.
We propose the Differentiation on Spike Representation (DSR) method, which achieves high performance.
arXiv Detail & Related papers (2022-05-01T12:44:49Z)
- Spike-inspired Rank Coding for Fast and Accurate Recurrent Neural Networks [5.986408771459261]
Biological spiking neural networks (SNNs) can temporally encode information in their outputs, whereas artificial neural networks (ANNs) conventionally do not.
Here we show that temporal coding such as rank coding (RC) inspired by SNNs can also be applied to conventional ANNs such as LSTMs.
RC-training also significantly reduces time-to-insight during inference, with a minimal decrease in accuracy.
We demonstrate these in two toy problems of sequence classification, and in a temporally-encoded MNIST dataset where our RC model achieves 99.19% accuracy after the first input time-step.
arXiv Detail & Related papers (2021-10-06T15:51:38Z)
- Training Feedback Spiking Neural Networks by Implicit Differentiation on the Equilibrium State [66.2457134675891]
Spiking neural networks (SNNs) are brain-inspired models that enable energy-efficient implementation on neuromorphic hardware.
Most existing methods imitate the backpropagation framework and feedforward architectures of artificial neural networks.
We propose a novel training method that does not rely on the exact reverse of the forward computation.
arXiv Detail & Related papers (2021-09-29T07:46:54Z)
- Revisiting Batch Normalization for Training Low-latency Deep Spiking Neural Networks from Scratch [5.511606249429581]
Spiking Neural Networks (SNNs) have emerged as an alternative to deep learning.
Training high-accuracy, low-latency SNNs from scratch is hindered by the non-differentiable nature of the spiking neuron.
We propose a temporal Batch Normalization Through Time (BNTT) technique for training temporal SNNs.
arXiv Detail & Related papers (2020-10-05T00:49:30Z)
- Incorporating Learnable Membrane Time Constant to Enhance Learning of Spiking Neural Networks [36.16846259899793]
Spiking Neural Networks (SNNs) have attracted enormous research interest due to temporal information processing capability, low power consumption, and high biological plausibility.
Most existing learning methods learn weights only, and require manual tuning of the membrane-related parameters that determine the dynamics of a single spiking neuron.
In this paper, we take inspiration from the observation that membrane-related parameters are different across brain regions, and propose a training algorithm that is capable of learning not only the synaptic weights but also the membrane time constants of SNNs.
arXiv Detail & Related papers (2020-07-11T14:35:42Z)
- You Only Spike Once: Improving Energy-Efficient Neuromorphic Inference to ANN-Level Accuracy [51.861168222799186]
Spiking Neural Networks (SNNs) are a type of neuromorphic, or brain-inspired network.
SNNs are sparse, accessing very few weights, and typically only use addition operations instead of the more power-intensive multiply-and-accumulate operations.
In this work, we aim to overcome the limitations of TTFS-encoded neuromorphic systems.
arXiv Detail & Related papers (2020-06-03T15:55:53Z)
- Rectified Linear Postsynaptic Potential Function for Backpropagation in Deep Spiking Neural Networks [55.0627904986664]
Spiking Neural Networks (SNNs) use temporal spike patterns to represent and transmit information, which is not only biologically realistic but also suitable for ultra-low-power event-driven neuromorphic implementation.
This paper investigates the contribution of spike timing dynamics to information encoding, synaptic plasticity and decision making, providing a new perspective on the design of future deep SNNs and neuromorphic hardware systems.
arXiv Detail & Related papers (2020-03-26T11:13:07Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences of its use.