Training Energy-Efficient Deep Spiking Neural Networks with
Time-to-First-Spike Coding
- URL: http://arxiv.org/abs/2106.02568v1
- Date: Fri, 4 Jun 2021 16:02:27 GMT
- Title: Training Energy-Efficient Deep Spiking Neural Networks with
Time-to-First-Spike Coding
- Authors: Seongsik Park, Sungroh Yoon
- Abstract summary: The tremendous energy consumption of deep neural networks (DNNs) has become a serious problem in deep learning.
Spiking neural networks (SNNs), which mimic the operations of the human brain, are a prominent energy-efficient alternative.
This paper presents training methods for energy-efficient deep SNNs with time-to-first-spike (TTFS) coding.
- Score: 29.131030799324844
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: The tremendous energy consumption of deep neural networks (DNNs) has become a
serious problem in deep learning. Spiking neural networks (SNNs), which mimic
the operations of the human brain, have been studied as a prominent class of
energy-efficient neural networks. Due to their event-driven and
spatiotemporally sparse operations, SNNs offer the potential for
energy-efficient processing. To unlock this potential, deep SNNs have adopted
temporal coding such as time-to-first-spike (TTFS) coding, which represents
information between neurons by the time of the first spike. With TTFS coding,
each neuron generates at most one spike, leading to a significant improvement in
energy efficiency. Several studies have successfully introduced TTFS coding in
deep SNNs, but they achieved only limited efficiency gains because efficiency
was not considered during training. To address this
issue, this paper presents training methods for energy-efficient deep SNNs with
TTFS coding. We introduce a surrogate DNN model to train the deep SNN in a
feasible time and analyze the effect of the temporal kernel on training
performance and efficiency. Based on the investigation, we propose
stochastically relaxed activation and initial value-based regularization for
the temporal kernel parameters. In addition, to reduce the number of spikes
even further, we present temporal kernel-aware batch normalization. With the
proposed methods, we achieve comparable training results with significantly
fewer spikes, which can lead to energy-efficient deep SNNs.
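To make the coding scheme concrete: under TTFS coding, a neuron's (normalized) activation is represented by how early its single spike arrives, so stronger activations fire earlier and a zero activation stays silent. Below is a minimal sketch of such an encoder/decoder pair; the linear time mapping, the window length T_MAX, and the silent-neuron convention are illustrative assumptions, not the paper's exact scheme.

```python
import numpy as np

T_MAX = 100  # length of the coding window in time steps (illustrative choice)

def ttfs_encode(activations: np.ndarray) -> np.ndarray:
    """Map activations in [0, 1] to first-spike times: larger values fire earlier.
    Neurons with zero activation never fire and get the sentinel time T_MAX."""
    a = np.clip(activations, 0.0, 1.0)
    spike_times = np.where(a > 0.0, np.round((1.0 - a) * (T_MAX - 1)), T_MAX)
    return spike_times.astype(int)

def ttfs_decode(spike_times: np.ndarray) -> np.ndarray:
    """Invert the linear mapping: earlier spikes decode to larger activations."""
    t = spike_times.astype(float)
    return np.where(t < T_MAX, 1.0 - t / (T_MAX - 1), 0.0)

x = np.array([0.9, 0.2, 0.0, 0.5])
t = ttfs_encode(x)   # -> [10, 79, 100, 50]: one spike time per neuron
x_hat = ttfs_decode(t)
```

Because each neuron contributes at most one spike per inference, the number of synaptic events, and hence energy, scales with the number of neurons rather than with the number of time steps.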
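The abstract names three training ingredients: a surrogate DNN model built around a temporal kernel, a stochastically relaxed activation, and initial value-based regularization of the kernel parameters. The sketch below is one plausible reading of how these could fit together; the exponential kernel exp(-1/tau), the softplus relaxation, the mixing probability p_relax, and the quadratic penalty are all assumptions for illustration, not the authors' formulation.

```python
import torch

class SurrogateTTFSLayer(torch.nn.Module):
    """Differentiable stand-in for a TTFS spiking layer with a learnable
    temporal kernel parameter tau (one scalar per layer, for simplicity)."""

    def __init__(self, in_features, out_features, tau_init=2.0, p_relax=0.5):
        super().__init__()
        self.linear = torch.nn.Linear(in_features, out_features)
        self.tau = torch.nn.Parameter(torch.tensor(tau_init))  # temporal kernel parameter
        self.register_buffer("tau0", torch.tensor(tau_init))   # remembered initial value
        self.p_relax = p_relax

    def forward(self, x):
        z = self.linear(x)
        hard = torch.relu(z)                    # spike-like, poorly conditioned gradients
        soft = torch.nn.functional.softplus(z)  # smooth relaxation with usable gradients
        # Stochastically relaxed activation: randomly swap in the smooth
        # version during training, keep the hard version at test time.
        use_soft = self.training and torch.rand(()).item() < self.p_relax
        out = soft if use_soft else hard
        # Scale by the temporal kernel's decay, exp(-1/tau), so tau's effect
        # on downstream activity is visible to the optimizer.
        return out * torch.exp(-1.0 / self.tau)

    def kernel_reg(self):
        # Initial value-based regularization: penalize drift of the temporal
        # kernel parameter away from its initialization.
        return (self.tau - self.tau0) ** 2
```

In training, `kernel_reg()` from every layer would be added to the task loss with a small weight, keeping the learned time constants near their initial values while still letting them adapt.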
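For the temporal kernel-aware batch normalization, one hedged reading is a standard batch norm whose affine scale is coupled to the layer's temporal kernel, so that normalization does not silently undo the kernel's decay and inflate spike counts. The coupling below is an assumption for illustration only, not the paper's definition.

```python
import torch

class KernelAwareBatchNorm1d(torch.nn.Module):
    """Batch norm whose output scale is tied to a shared temporal kernel
    parameter tau; a hypothetical stand-in for the paper's method."""

    def __init__(self, num_features, tau: torch.nn.Parameter):
        super().__init__()
        self.bn = torch.nn.BatchNorm1d(num_features, affine=False)
        self.gamma = torch.nn.Parameter(torch.ones(num_features))
        self.beta = torch.nn.Parameter(torch.zeros(num_features))
        self.tau = tau  # shared with the corresponding spiking layer

    def forward(self, x):
        kernel_gain = torch.exp(-1.0 / self.tau)  # decay factor of the temporal kernel
        return kernel_gain * self.gamma * self.bn(x) + self.beta
```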
Related papers
- ETTFS: An Efficient Training Framework for Time-to-First-Spike Neuron [38.194529226257735]
Time-to-First-Spike (TTFS) coding, where neurons fire only once during inference, offers the benefits of reduced spike counts, enhanced energy efficiency, and faster processing.
This paper presents an efficient training framework for TTFS that not only improves accuracy but also accelerates the training process.
arXiv Detail & Related papers (2024-10-31T04:14:47Z)
- LC-TTFS: Towards Lossless Network Conversion for Spiking Neural Networks with TTFS Coding [55.64533786293656]
We show that our algorithm can achieve a near-perfect mapping between the activation values of an ANN and the spike times of an SNN on a number of challenging AI tasks.
The study paves the way for deploying ultra-low-power TTFS-based SNNs on power-constrained edge computing platforms.
arXiv Detail & Related papers (2023-10-23T14:26:16Z)
- High-performance deep spiking neural networks with 0.3 spikes per neuron [9.01407445068455]
It is harder to train biologically inspired spiking neural networks (SNNs) than artificial neural networks (ANNs).
We show that deep SNN models can be trained to achieve the same performance as ANNs.
Our SNN accomplishes high-performance classification with less than 0.3 spikes per neuron, lending itself to an energy-efficient implementation.
arXiv Detail & Related papers (2023-06-14T21:01:35Z)
- A Hybrid Neural Coding Approach for Pattern Recognition with Spiking Neural Networks [53.31941519245432]
Brain-inspired spiking neural networks (SNNs) have demonstrated promising capabilities in solving pattern recognition tasks.
These SNNs are grounded on homogeneous neurons that utilize a uniform neural coding for information representation.
In this study, we argue that SNN architectures should be holistically designed to incorporate heterogeneous coding schemes.
arXiv Detail & Related papers (2023-05-26T02:52:12Z)
- Adaptive-SpikeNet: Event-based Optical Flow Estimation using Spiking Neural Networks with Learnable Neuronal Dynamics [6.309365332210523]
Spiking Neural Networks (SNNs) with their neuro-inspired event-driven processing can efficiently handle asynchronous data.
We propose an adaptive fully-spiking framework with learnable neuronal dynamics to alleviate the spike vanishing problem.
Our experiments on datasets show an average reduction of 13% in average endpoint error (AEE) compared to state-of-the-art ANNs.
arXiv Detail & Related papers (2022-09-21T21:17:56Z)
- Deep Reinforcement Learning with Spiking Q-learning [51.386945803485084]
Spiking neural networks (SNNs) are expected to realize artificial intelligence (AI) with less energy consumption.
Combining SNNs with deep reinforcement learning (RL) provides a promising energy-efficient approach to realistic control tasks.
arXiv Detail & Related papers (2022-01-21T16:42:11Z)
- Revisiting Batch Normalization for Training Low-latency Deep Spiking Neural Networks from Scratch [5.511606249429581]
Spiking Neural Networks (SNNs) have emerged as an alternative to deep learning.
Training high-accuracy, low-latency SNNs from scratch is hindered by the non-differentiable nature of the spiking neuron.
We propose a temporal Batch Normalization Through Time (BNTT) technique for training temporal SNNs.
arXiv Detail & Related papers (2020-10-05T00:49:30Z)
- Progressive Tandem Learning for Pattern Recognition with Deep Spiking Neural Networks [80.15411508088522]
Spiking neural networks (SNNs) have shown advantages over traditional artificial neural networks (ANNs) for low latency and high computational efficiency.
We propose a novel ANN-to-SNN conversion and layer-wise learning framework for rapid and efficient pattern recognition.
arXiv Detail & Related papers (2020-07-02T15:38:44Z)
- You Only Spike Once: Improving Energy-Efficient Neuromorphic Inference to ANN-Level Accuracy [51.861168222799186]
Spiking Neural Networks (SNNs) are a type of neuromorphic, or brain-inspired network.
SNNs are sparse, accessing very few weights, and typically only use addition operations instead of the more power-intensive multiply-and-accumulate operations.
In this work, we aim to overcome the limitations of TTFS-encoded neuromorphic systems.
arXiv Detail & Related papers (2020-06-03T15:55:53Z)
- Rectified Linear Postsynaptic Potential Function for Backpropagation in Deep Spiking Neural Networks [55.0627904986664]
Spiking Neural Networks (SNNs) use temporal spike patterns to represent and transmit information, which is not only biologically realistic but also suitable for ultra-low-power, event-driven neuromorphic implementation.
This paper investigates the contribution of spike timing dynamics to information encoding, synaptic plasticity, and decision making, providing a new perspective on the design of future deep SNNs and neuromorphic hardware systems.
arXiv Detail & Related papers (2020-03-26T11:13:07Z)
- T2FSNN: Deep Spiking Neural Networks with Time-to-first-spike Coding [26.654533157221973]
This paper introduces time-to-first-spike coding into deep SNNs, using a kernel-based dynamic threshold and dendrite model to overcome its drawbacks.
According to our results, the proposed methods can reduce inference latency and the number of spikes to 22% and less than 1%, respectively, of those of burst coding.
arXiv Detail & Related papers (2020-03-26T04:39:12Z)
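The T2FSNN entry above mentions a kernel-based dynamic threshold; the intuition is that the firing threshold decays over the coding window, so strongly driven neurons cross it early while weakly driven neurons fire late or not at all, which is exactly the behavior TTFS coding needs. A minimal sketch, with illustrative constants THETA_0, TAU_THETA, and T_MAX rather than the paper's values:

```python
import numpy as np

THETA_0 = 1.0     # initial threshold (illustrative)
TAU_THETA = 20.0  # threshold decay time constant (illustrative)
T_MAX = 100       # coding window in time steps (illustrative)

def first_spike_time(input_current: float) -> int:
    """Integrate a constant input current and return the first time step at
    which the membrane potential crosses the decaying threshold."""
    v = 0.0
    for t in range(T_MAX):
        v += input_current                        # accumulate membrane potential
        theta = THETA_0 * np.exp(-t / TAU_THETA)  # exponentially decaying threshold
        if v >= theta:
            return t                              # at most one spike per neuron
    return T_MAX                                  # neuron stays silent

print(first_spike_time(0.5))   # strong input fires early (t = 1)
print(first_spike_time(0.02))  # weak input fires much later
```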
This list is automatically generated from the titles and abstracts of the papers in this site.