Temporal Contrastive Learning for Spiking Neural Networks
- URL: http://arxiv.org/abs/2305.13909v1
- Date: Tue, 23 May 2023 10:31:46 GMT
- Title: Temporal Contrastive Learning for Spiking Neural Networks
- Authors: Haonan Qiu, Zeyin Song, Yanqi Chen, Munan Ning, Wei Fang, Tao Sun,
Zhengyu Ma, Li Yuan, and Yonghong Tian
- Abstract summary: Biologically inspired spiking neural networks (SNNs) have garnered considerable attention due to their low energy consumption and spatio-temporal information processing capabilities.
We propose a novel method to obtain SNNs with low latency and high performance by incorporating contrastive supervision with temporal domain information.
- Score: 23.963069990569714
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Biologically inspired spiking neural networks (SNNs) have garnered
considerable attention due to their low energy consumption and spatio-temporal
information processing capabilities. Most existing SNN training methods first
integrate output information across time steps and then adopt the cross-entropy
(CE) loss to supervise the prediction of the averaged representations. However,
in this work, we find that this approach is not ideal for SNN training, as it
omits the temporal dynamics of SNNs and degrades quickly in performance as the
number of inference time steps decreases. One tempting way to model temporal
correlations is to apply the same label supervision at each time step and treat
all steps identically. Although this achieves relatively consistent performance
across various time steps, it still struggles to produce SNNs with high
performance. Inspired by these observations, we propose the Temporal-domain
supervised Contrastive Learning (TCL) framework, a novel method to obtain SNNs
with low latency and high performance by incorporating contrastive supervision
with temporal domain information. Contrastive learning (CL) prompts the network
to discern both consistency and variability in the representation space,
enabling it to better learn discriminative and generalizable features. We
extend this concept to the temporal domain of SNNs, allowing us to flexibly and
fully leverage the correlation between representations at different time steps.
Furthermore, we propose a Siamese Temporal-domain supervised Contrastive
Learning (STCL) framework to enhance SNNs via augmentation, temporal, and
class constraints simultaneously. Extensive experimental results demonstrate
that SNNs trained by our TCL and STCL can achieve both high performance and low
latency, achieving state-of-the-art performance on a variety of datasets (e.g.,
CIFAR-10, CIFAR-100, and DVS-CIFAR10).
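To make the two baselines the abstract contrasts concrete, below is a minimal PyTorch sketch, assuming the SNN emits a [T, B, C] tensor of per-time-step logits (names and shapes are illustrative, not taken from the paper):

```python
import torch
import torch.nn.functional as F

def averaged_ce_loss(outputs: torch.Tensor, labels: torch.Tensor) -> torch.Tensor:
    """Standard SNN supervision: integrate logits over time, then apply
    cross-entropy to the averaged representation.
    outputs: [T, B, C] per-time-step logits; labels: [B] class indices."""
    return F.cross_entropy(outputs.mean(dim=0), labels)

def per_step_ce_loss(outputs: torch.Tensor, labels: torch.Tensor) -> torch.Tensor:
    """Alternative: apply the same label supervision at every time step
    and average the per-step losses."""
    losses = [F.cross_entropy(step_logits, labels) for step_logits in outputs]
    return torch.stack(losses).mean()
```

The first loss constrains only the time-averaged output and leaves individual time steps unsupervised, consistent with the abstract's observation that accuracy degrades at short inference horizons; the second supervises every step identically, which per the abstract yields consistent but not top performance.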
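The TCL idea extends supervised contrastive learning to the temporal domain. A minimal sketch in the spirit of the standard supervised contrastive (SupCon) loss, under the assumption that same-class embeddings at any time step count as positives (one plausible reading, not the paper's exact formulation):

```python
import torch
import torch.nn.functional as F

def temporal_supcon_loss(features: torch.Tensor, labels: torch.Tensor,
                         temperature: float = 0.1) -> torch.Tensor:
    """Supervised contrastive loss over the temporal domain (sketch).
    features: [T, B, D] per-time-step embeddings; labels: [B].
    Every (time step, sample) embedding is an anchor; embeddings sharing
    the anchor's class label -- at any time step -- are positives."""
    T, B, D = features.shape
    z = F.normalize(features.reshape(T * B, D), dim=1)
    y = labels.repeat(T)                                  # align labels with flattened steps

    sim = z @ z.t() / temperature                         # [T*B, T*B] similarities
    self_mask = torch.eye(T * B, dtype=torch.bool, device=z.device)
    sim = sim.masked_fill(self_mask, -1e9)                # exclude self-pairs

    pos_mask = (y.unsqueeze(0) == y.unsqueeze(1)) & ~self_mask
    log_prob = sim - torch.logsumexp(sim, dim=1, keepdim=True)
    # mean log-likelihood of positives per anchor, averaged over anchors
    loss = -(log_prob * pos_mask).sum(1) / pos_mask.sum(1).clamp(min=1)
    return loss.mean()
```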
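For the Siamese STCL variant, one plausible reading is that two augmented views of each input pass through the same SNN and their time steps are pooled into a single contrastive problem, so positives span augmentations, time steps, and classes simultaneously; `snn` and `augment` below are hypothetical stand-ins, reusing `temporal_supcon_loss` from the previous sketch:

```python
def stcl_step(snn, augment, x, labels):
    """One STCL-style training step (sketch): two augmented views of the
    batch share one (Siamese) SNN; their per-time-step embeddings are
    pooled before the temporal supervised contrastive loss."""
    f1 = snn(augment(x))                # [T, B, D]
    f2 = snn(augment(x))                # second view, same weights
    feats = torch.cat([f1, f2], dim=0)  # [2T, B, D]: views act as extra steps
    return temporal_supcon_loss(feats, labels)
```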
Related papers
- Enhancing SNN-based Spatio-Temporal Learning: A Benchmark Dataset and Cross-Modality Attention Model [30.66645039322337]
High-quality benchmark datasets are of great importance to the advancement of Spiking Neural Networks (SNNs).
Yet, SNN-based cross-modal fusion remains underexplored.
In this work, we present a neuromorphic dataset that can better exploit the inherent spatio-temporal properties of SNNs.
arXiv Detail & Related papers (2024-10-21T06:59:04Z) - Towards Low-latency Event-based Visual Recognition with Hybrid Step-wise Distillation Spiking Neural Networks [50.32980443749865]
Spiking neural networks (SNNs) have garnered significant attention for their low power consumption and high biological plausibility.
Current SNNs struggle to balance accuracy and latency on neuromorphic datasets.
We propose the Hybrid Step-wise Distillation (HSD) method, tailored for neuromorphic datasets.
arXiv Detail & Related papers (2024-09-19T06:52:34Z) - Efficient and Effective Time-Series Forecasting with Spiking Neural Networks [47.371024581669516]
Spiking neural networks (SNNs) provide a unique pathway for capturing the intricacies of temporal data.
Applying SNNs to time-series forecasting is challenging due to difficulties in effective temporal alignment, complexities in encoding processes, and the absence of standardized guidelines for model selection.
We propose a framework for SNNs in time-series forecasting tasks, leveraging the efficiency of spiking neurons in processing temporal information.
arXiv Detail & Related papers (2024-02-02T16:23:50Z) - Temporal Knowledge Sharing enable Spiking Neural Network Learning from
Past and Future [7.300220260073691]
Spiking Neural Networks (SNNs) have attracted significant attention from researchers across various domains due to their brain-like information processing mechanism.
However, SNNs typically grapple with challenges such as extended time steps, low temporal information utilization, and the requirement for consistent time steps between testing and training.
This paper proposes a novel perspective, viewing the SNN as a temporal aggregation model.
We introduce the Temporal Knowledge Sharing (TKS) method, facilitating information interaction between different time points.
arXiv Detail & Related papers (2023-04-13T13:51:26Z) - Training Spiking Neural Networks with Local Tandem Learning [96.32026780517097]
Spiking neural networks (SNNs) are shown to be more biologically plausible and energy efficient than their predecessors.
In this paper, we put forward a generalized learning rule, termed Local Tandem Learning (LTL).
We demonstrate rapid network convergence within five training epochs on the CIFAR-10 dataset while having low computational complexity.
arXiv Detail & Related papers (2022-10-10T10:05:00Z) - Online Training Through Time for Spiking Neural Networks [66.7744060103562]
Spiking neural networks (SNNs) are promising brain-inspired energy-efficient models.
Recent progress in training methods has enabled successful deep SNNs on large-scale tasks with low latency.
We propose online training through time (OTTT) for SNNs, which is derived from BPTT to enable forward-in-time learning.
arXiv Detail & Related papers (2022-10-09T07:47:56Z) - Training High-Performance Low-Latency Spiking Neural Networks by
Differentiation on Spike Representation [70.75043144299168]
Spiking Neural Network (SNN) is a promising energy-efficient AI model when implemented on neuromorphic hardware.
It is a challenge to efficiently train SNNs due to their non-differentiability.
We propose the Differentiation on Spike Representation (DSR) method, which achieves high performance.
arXiv Detail & Related papers (2022-05-01T12:44:49Z) - Progressive Tandem Learning for Pattern Recognition with Deep Spiking
Neural Networks [80.15411508088522]
Spiking neural networks (SNNs) have shown advantages over traditional artificial neural networks (ANNs) for low latency and high computational efficiency.
We propose a novel ANN-to-SNN conversion and layer-wise learning framework for rapid and efficient pattern recognition.
arXiv Detail & Related papers (2020-07-02T15:38:44Z)