Temporal Knowledge Sharing enable Spiking Neural Network Learning from
Past and Future
- URL: http://arxiv.org/abs/2304.06540v2
- Date: Mon, 23 Oct 2023 04:31:46 GMT
- Title: Temporal Knowledge Sharing enable Spiking Neural Network Learning from
Past and Future
- Authors: Yiting Dong, Dongcheng Zhao, Yi Zeng
- Abstract summary: Spiking Neural Networks (SNNs) have attracted significant attention from researchers across various domains due to their brain-like information processing mechanism.
However, SNNs typically grapple with challenges such as extended time steps, low temporal information utilization, and the requirement for consistent time steps between training and testing.
This paper proposes a novel perspective, viewing the SNN as a temporal aggregation model.
We introduce the Temporal Knowledge Sharing (TKS) method, facilitating information interaction between different time points.
- Score: 7.300220260073691
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Spiking Neural Networks (SNNs) have attracted significant attention from
researchers across various domains due to their brain-like information
processing mechanism. However, SNNs typically grapple with challenges such as
extended time steps, low temporal information utilization, and the requirement
for consistent time steps between training and testing. These challenges leave
SNNs with high latency. Moreover, the constraint on time steps necessitates the
retraining of the model for new deployments, reducing adaptability. To address
these issues, this paper proposes a novel perspective, viewing the SNN as a
temporal aggregation model. We introduce the Temporal Knowledge Sharing (TKS)
method, facilitating information interaction between different time points. TKS
can be perceived as a form of temporal self-distillation. To validate the
efficacy of TKS in information processing, we tested it on static datasets like
CIFAR10, CIFAR100, ImageNet-1k, and neuromorphic datasets such as DVS-CIFAR10
and NCALTECH101. Experimental results demonstrate that our method achieves
state-of-the-art performance compared to other algorithms. Furthermore, TKS
addresses the temporal consistency challenge, endowing the model with superior
temporal generalization capabilities. This allows the network to train with
longer time steps and maintain high performance during testing with shorter
time steps. Such an approach considerably accelerates the deployment of SNNs on
edge devices. Finally, we conducted ablation experiments and tested TKS on
fine-grained tasks, with results showcasing TKS's enhanced capability to
process information efficiently.
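The abstract frames TKS as temporal self-distillation across the network's per-time-step outputs. The following is a minimal sketch of that idea, assuming a PyTorch-style setup; the mean-logit teacher, the cross-entropy/KL mix, and the hyperparameters `alpha` and `tau` are illustrative assumptions, not the paper's exact formulation.

```python
import torch
import torch.nn.functional as F

def tks_loss(logits_per_step, targets, alpha=0.5, tau=2.0):
    """Temporal self-distillation sketch: each time step's output is
    pulled toward an aggregate "teacher" built from all time steps.

    logits_per_step: [T, B, C] tensor (time steps, batch, classes).
    """
    T = logits_per_step.size(0)
    # Temporal aggregation: treat the mean over time steps as the teacher.
    teacher = logits_per_step.mean(dim=0).detach()
    soft_teacher = F.softmax(teacher / tau, dim=-1)

    ce = 0.0  # supervised loss at every time step
    kd = 0.0  # distillation from the temporal aggregate
    for t in range(T):
        ce = ce + F.cross_entropy(logits_per_step[t], targets)
        log_student = F.log_softmax(logits_per_step[t] / tau, dim=-1)
        kd = kd + F.kl_div(log_student, soft_teacher, reduction="batchmean") * tau * tau
    return ((1 - alpha) * ce + alpha * kd) / T
```

Because every time step is supervised toward a shared temporal aggregate, each step's output becomes informative on its own, which is consistent with the abstract's claim that a network trained with long time steps can be tested with shorter ones.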
Related papers
- Towards Low-latency Event-based Visual Recognition with Hybrid Step-wise Distillation Spiking Neural Networks [50.32980443749865]
Spiking neural networks (SNNs) have garnered significant attention for their low power consumption and high biological plausibility.
Current SNNs struggle to balance accuracy and latency on neuromorphic datasets.
We propose the Hybrid Step-wise Distillation (HSD) method, tailored to neuromorphic datasets.
arXiv Detail & Related papers (2024-09-19T06:52:34Z)
- Efficient and Effective Time-Series Forecasting with Spiking Neural Networks [47.371024581669516]
Spiking neural networks (SNNs) provide a unique pathway for capturing the intricacies of temporal data.
Applying SNNs to time-series forecasting is challenging due to difficulties in effective temporal alignment, complexities in encoding processes, and the absence of standardized guidelines for model selection.
We propose a framework for SNNs in time-series forecasting tasks, leveraging the efficiency of spiking neurons in processing temporal information.
arXiv Detail & Related papers (2024-02-02T16:23:50Z)
- TC-LIF: A Two-Compartment Spiking Neuron Model for Long-Term Sequential Modelling [54.97005925277638]
The identification of sensory cues associated with potential opportunities and dangers is frequently complicated by unrelated events that separate useful cues by long delays.
It remains a challenging task for state-of-the-art spiking neural networks (SNNs) to establish long-term temporal dependency between distant cues.
We propose a novel biologically inspired Two-Compartment Leaky Integrate-and-Fire spiking neuron model, dubbed TC-LIF.
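As a rough illustration of what a second compartment adds over a standard LIF neuron, here is a generic sketch of coupled dendritic and somatic dynamics; the decay constants, coupling term, and soft reset are assumptions, not TC-LIF's actual equations.

```python
import torch

def two_compartment_lif_step(v_d, v_s, x, beta_d=0.9, beta_s=0.9, g=0.5, v_th=1.0):
    """One update of a generic two-compartment LIF neuron (sketch only).

    v_d: dendritic potential, v_s: somatic potential, x: input current.
    """
    v_d = beta_d * v_d + x + g * (v_s - v_d)  # dendrite integrates input and couples to the soma
    v_s = beta_s * v_s + g * (v_d - v_s)      # soma integrates the dendritic current
    spike = (v_s >= v_th).float()             # threshold crossing emits a spike
    v_s = v_s - spike * v_th                  # soft reset after spiking
    return v_d, v_s, spike
```

The slower compartment retains state that survives somatic resets, which is one common route to the long-term temporal dependencies the summary mentions.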
arXiv Detail & Related papers (2023-08-25T08:54:41Z)
- Temporal Contrastive Learning for Spiking Neural Networks [23.963069990569714]
Biologically inspired spiking neural networks (SNNs) have garnered considerable attention due to their low energy consumption and strong temporal information processing capabilities.
We propose a novel method to obtain SNNs with low latency and high performance by incorporating contrastive supervision with temporal domain information.
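One plausible reading of "contrastive supervision with temporal domain information" is an InfoNCE-style loss that treats the same sample's representations at different time steps as positives; the pairing of consecutive steps below is an assumption, not the paper's exact scheme.

```python
import torch
import torch.nn.functional as F

def temporal_contrastive_loss(feats, tau=0.1):
    """Contrastive loss across time steps (sketch).

    feats: [T, B, D] features of the same batch at T time steps.
    """
    T, B, _ = feats.shape
    z = F.normalize(feats, dim=-1)
    loss = 0.0
    for t in range(T - 1):
        # Similarity between the same batch at consecutive time steps.
        logits = z[t] @ z[t + 1].t() / tau             # [B, B]
        labels = torch.arange(B, device=feats.device)  # positives on the diagonal
        loss = loss + F.cross_entropy(logits, labels)
    return loss / (T - 1)
```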
arXiv Detail & Related papers (2023-05-23T10:31:46Z)
- Online Training Through Time for Spiking Neural Networks [66.7744060103562]
Spiking neural networks (SNNs) are promising brain-inspired energy-efficient models.
Recent progress in training methods has enabled successful deep SNNs on large-scale tasks with low latency.
We propose online training through time (OTTT) for SNNs, which is derived from BPTT to enable forward-in-time learning.
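The essence of forward-in-time learning is updating weights at every step from local quantities, instead of unrolling a computation graph over all T steps as BPTT does. A minimal sketch for one linear spiking layer follows; the trace decay `lam`, plain SGD, and the shape conventions are assumptions for illustration, not the paper's exact rule.

```python
import torch

def ottt_style_step(w, trace, pre_spikes, post_err, lam=0.5, lr=1e-3):
    """One forward-in-time learning step (sketch).

    w:          [out, in] weights of one linear spiking layer
    trace:      [batch, in] running presynaptic spike trace
    pre_spikes: [batch, in] spikes arriving at the current step
    post_err:   [batch, out] instantaneous error signal at this step
    """
    trace = lam * trace + pre_spikes  # eligibility-like trace replaces stored history
    grad_w = post_err.t() @ trace     # local gradient from current error and past activity
    w = w - lr * grad_w               # update immediately; no backprop through time
    return w, trace
```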
arXiv Detail & Related papers (2022-10-09T07:47:56Z)
- Backpropagation with Biologically Plausible Spatio-Temporal Adjustment For Training Deep Spiking Neural Networks [5.484391472233163]
The success of deep learning is inseparable from backpropagation.
First, we propose a biologically plausible spatial adjustment, which rethinks the relationship between membrane potential and spikes.
Second, we propose a biologically plausible temporal adjustment that lets the error propagate across spikes in the temporal dimension.
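Training adjustments of this kind generally hinge on how the gradient is routed through the non-differentiable spike. As context, here is the common surrogate-gradient pattern, where the backward pass substitutes a smooth window around the threshold; the triangular window is a standard convention, not the specific spatial or temporal adjustment this paper proposes.

```python
import torch

class SurrogateSpike(torch.autograd.Function):
    """Threshold spike with a smooth backward pass (generic sketch)."""

    @staticmethod
    def forward(ctx, v, v_th=1.0):
        ctx.save_for_backward(v)
        ctx.v_th = v_th
        return (v >= v_th).float()  # hard, non-differentiable spike

    @staticmethod
    def backward(ctx, grad_out):
        (v,) = ctx.saved_tensors
        # Triangular surrogate: gradient flows only near the threshold.
        window = torch.clamp(1.0 - torch.abs(v - ctx.v_th), min=0.0)
        return grad_out * window, None

spike_fn = SurrogateSpike.apply  # usage: s = spike_fn(membrane_potential)
```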
arXiv Detail & Related papers (2021-10-17T15:55:51Z)
- A journey in ESN and LSTM visualisations on a language task [77.34726150561087]
We trained ESNs and LSTMs on a Cross-Situational Learning (CSL) task.
The results are of three kinds: performance comparisons, internal dynamics analyses, and visualizations of the latent space.
arXiv Detail & Related papers (2020-12-03T08:32:01Z)
- Revisiting Batch Normalization for Training Low-latency Deep Spiking Neural Networks from Scratch [5.511606249429581]
Spiking Neural Networks (SNNs) have emerged as an alternative to deep learning.
Training high-accuracy, low-latency SNNs from scratch is hindered by the non-differentiable nature of spiking neurons.
We propose a temporal Batch Normalization Through Time (BNTT) technique for training temporal SNNs.
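The core of normalizing "through time" is giving each time step its own statistics and learnable parameters rather than sharing one BatchNorm across all steps. A minimal sketch, assuming 2-D convolutional features; the exact parameterization in the paper may differ.

```python
import torch
import torch.nn as nn

class BNTT(nn.Module):
    """Batch Normalization Through Time (sketch): one BatchNorm per time step."""

    def __init__(self, num_features, time_steps):
        super().__init__()
        # Separate statistics and affine parameters for every time step.
        self.bns = nn.ModuleList(
            nn.BatchNorm2d(num_features) for _ in range(time_steps)
        )

    def forward(self, x, t):
        # x: activations at time step t, shape [B, C, H, W]
        return self.bns[t](x)
```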
arXiv Detail & Related papers (2020-10-05T00:49:30Z)
- Rectified Linear Postsynaptic Potential Function for Backpropagation in Deep Spiking Neural Networks [55.0627904986664]
Spiking Neural Networks (SNNs) use temporal spike patterns to represent and transmit information, which is not only biologically realistic but also suitable for ultra-low-power event-driven neuromorphic implementation.
This paper investigates the contribution of spike timing dynamics to information encoding, synaptic plasticity, and decision making, providing a new perspective on the design of future deep SNNs and neuromorphic hardware systems.
arXiv Detail & Related papers (2020-03-26T11:13:07Z)