Temporal Contrastive Learning for Spiking Neural Networks
- URL: http://arxiv.org/abs/2305.13909v1
- Date: Tue, 23 May 2023 10:31:46 GMT
- Title: Temporal Contrastive Learning for Spiking Neural Networks
- Authors: Haonan Qiu, Zeyin Song, Yanqi Chen, Munan Ning, Wei Fang, Tao Sun,
Zhengyu Ma, Li Yuan, and Yonghong Tian
- Abstract summary: Biologically inspired spiking neural networks (SNNs) have garnered considerable attention due to their low-energy consumption and spatio-temporal information processing capabilities.
We propose a novel method to obtain SNNs with low latency and high performance by incorporating contrastive supervision with temporal domain information.
- Score: 23.963069990569714
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Biologically inspired spiking neural networks (SNNs) have garnered
considerable attention due to their low-energy consumption and spatio-temporal
information processing capabilities. Most existing SNNs training methods first
integrate output information across time steps, then adopt the cross-entropy
(CE) loss to supervise the prediction of the average representations. However,
in this work, we find the method above is not ideal for the SNNs training as it
omits the temporal dynamics of SNNs and degrades the performance quickly with
the decrease of inference time steps. One tempting method to model temporal
correlations is to apply the same label supervision at each time step and treat
them identically. Although it can acquire relatively consistent performance
across various time steps, it still faces challenges in obtaining SNNs with
high performance. Inspired by these observations, we propose Temporal-domain
supervised Contrastive Learning (TCL) framework, a novel method to obtain SNNs
with low latency and high performance by incorporating contrastive supervision
with temporal domain information. Contrastive learning (CL) prompts the network
to discern both consistency and variability in the representation space,
enabling it to better learn discriminative and generalizable features. We
extend this concept to the temporal domain of SNNs, allowing us to flexibly and
fully leverage the correlation between representations at different time steps.
Furthermore, we propose a Siamese Temporal-domain supervised Contrastive
Learning (STCL) framework to enhance the SNNs via augmentation, temporal and
class constraints simultaneously. Extensive experimental results demonstrate
that SNNs trained by our TCL and STCL can achieve both high performance and low
latency, achieving state-of-the-art performance on a variety of datasets (e.g.,
CIFAR-10, CIFAR-100, and DVS-CIFAR10).
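The core idea of TCL — treating per-time-step representations of same-class samples as positive pairs under a supervised contrastive objective, rather than averaging over time before applying cross-entropy — can be sketched as follows. This is a minimal NumPy illustration assuming a generic supervised contrastive (InfoNCE-style) formulation over the flattened (sample, time-step) pairs; the function name `tcl_loss`, the temperature value, and the exact positive-pair definition are illustrative assumptions, not the paper's reference implementation.

```python
import numpy as np

def tcl_loss(feats, labels, tau=0.1):
    """Hedged sketch of a temporal-domain supervised contrastive loss.

    feats: (B, T, D) array of per-time-step SNN representations.
    labels: (B,) integer class labels.
    Each (sample, time-step) pair is an anchor; its positives are all
    other pairs sharing the same class label, across both samples and
    time steps -- so temporal correlations within a class are pulled
    together instead of being averaged away.
    """
    B, T, D = feats.shape
    z = feats.reshape(B * T, D)
    z = z / np.linalg.norm(z, axis=1, keepdims=True)   # L2-normalise
    y = np.repeat(labels, T)                           # label per (b, t) pair
    sim = z @ z.T / tau                                # scaled cosine similarity
    np.fill_diagonal(sim, -np.inf)                     # exclude self-pairs
    # log-softmax over each anchor's similarities to all other pairs
    logp = sim - np.log(np.exp(sim).sum(axis=1, keepdims=True))
    pos = (y[:, None] == y[None, :]) & ~np.eye(B * T, dtype=bool)
    # mean log-probability over positives, per anchor with >= 1 positive
    per_anchor = np.where(pos, logp, 0.0).sum(1) / np.maximum(pos.sum(1), 1)
    return -per_anchor[pos.any(1)].mean()

# Illustrative usage with random features (not real SNN outputs):
rng = np.random.default_rng(0)
feats = rng.normal(size=(4, 2, 8))     # batch 4, 2 time steps, 8-dim features
labels = np.array([0, 0, 1, 1])
loss = tcl_loss(feats, labels)
```

The contrast with the averaged-CE baseline described in the abstract is that here every time step contributes its own anchor, so the loss directly constrains the representation at each step rather than only their mean.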