ASRC-SNN: Adaptive Skip Recurrent Connection Spiking Neural Network
- URL: http://arxiv.org/abs/2505.11455v1
- Date: Fri, 16 May 2025 17:10:11 GMT
- Title: ASRC-SNN: Adaptive Skip Recurrent Connection Spiking Neural Network
- Authors: Shang Xu, Jiayu Zhang, Ziming Wang, Runhao Jiang, Rui Yan, Huajin Tang
- Abstract summary: Recurrent Spiking Neural Networks (RSNNs) have shown promising potential in long-term temporal modeling. We propose the Skip Recurrent Connection (SRC) as a replacement for the vanilla recurrent structure. ASRC-SNN outperforms SRC-SNN in terms of temporal modeling capabilities and robustness.
- Score: 29.337015135181083
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: In recent years, Recurrent Spiking Neural Networks (RSNNs) have shown promising potential in long-term temporal modeling. Many studies focus on improving neuron models while also integrating recurrent structures, leveraging their synergistic effects to enhance the long-term temporal modeling capabilities of Spiking Neural Networks (SNNs). However, these studies often place excessive emphasis on the role of neurons, overlooking the importance of analyzing neurons and recurrent structures as an integrated framework. In this work, we treat neurons and recurrent structures as an integrated system and conduct a systematic analysis of gradient propagation along the temporal dimension, revealing a challenging gradient vanishing problem. To address this issue, we propose the Skip Recurrent Connection (SRC) as a replacement for the vanilla recurrent structure, effectively mitigating the gradient vanishing problem and enhancing long-term temporal modeling performance. Additionally, we propose the Adaptive Skip Recurrent Connection (ASRC), which learns the skip span of the skip recurrent connection in each layer of the network. Experiments show that replacing the vanilla recurrent structure in an RSNN with SRC significantly improves the model's performance on temporal benchmark datasets. Moreover, ASRC-SNN outperforms SRC-SNN in terms of temporal modeling capabilities and robustness.
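The abstract names the SRC mechanism but not its update rule, so the following is a minimal PyTorch sketch of the idea as described: a leaky integrate-and-fire (LIF) recurrent layer whose feedback reads the spikes from step t - skip instead of t - 1, so a gradient crossing k timesteps passes through roughly k/skip recurrent factors instead of k. The class name, the surrogate gradient, and all hyperparameters are our assumptions, not the authors' code.

```python
# Hypothetical SRC layer sketch, reconstructed from the abstract alone.
import torch
import torch.nn as nn


def spike(u, slope=5.0):
    """Heaviside spike with a sigmoid surrogate gradient (straight-through)."""
    soft = torch.sigmoid(slope * u)
    return soft + ((u > 0).float() - soft).detach()


class SRCLIFLayer(nn.Module):
    def __init__(self, in_features, out_features, skip=4, tau=2.0, threshold=1.0):
        super().__init__()
        self.ff = nn.Linear(in_features, out_features)    # feedforward synapses
        self.rec = nn.Linear(out_features, out_features)  # skip recurrent synapses
        self.skip = skip                                  # SRC: feedback from t - skip
        self.decay = 1.0 - 1.0 / tau                      # membrane leak factor
        self.threshold = threshold

    def forward(self, x):                                 # x: (T, batch, in_features)
        T, batch, _ = x.shape
        u = x.new_zeros(batch, self.rec.out_features)     # membrane potential
        spikes = []
        for t in range(T):
            # A vanilla RSNN would feed back spikes[t - 1]; SRC skips back further.
            s_past = spikes[t - self.skip] if t >= self.skip else torch.zeros_like(u)
            u = self.decay * u + self.ff(x[t]) + self.rec(s_past)
            s = spike(u - self.threshold)
            u = u - s * self.threshold                    # soft reset after a spike
            spikes.append(s)
        return torch.stack(spikes)                        # (T, batch, out_features)
```

ASRC, per the abstract, additionally makes the skip span learnable per layer; a plausible relaxation is a soft weighting over several candidate spans, but the abstract does not specify the authors' parameterization.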
Related papers
- Spiking Neural Networks for Temporal Processing: Status Quo and Future Prospects [41.8742357294068]
Spiking Neural Networks (SNNs) excel in handling temporal data with high efficiency, owing to their rich neuronal dynamics and sparse activity patterns. Given the recent surge in the development of SNNs, there is an urgent need for a comprehensive evaluation of their temporal processing capabilities.
arXiv Detail & Related papers (2025-02-13T16:17:57Z)
- Deep-Unrolling Multidimensional Harmonic Retrieval Algorithms on Neuromorphic Hardware [78.17783007774295]
This paper explores the potential of conversion-based neuromorphic algorithms for highly accurate and energy-efficient single-snapshot multidimensional harmonic retrieval. A novel method for converting the complex-valued convolutional layers and activations into spiking neural networks (SNNs) is developed. The converted SNNs achieve almost five-fold power efficiency at moderate performance loss compared to the original CNNs.
arXiv Detail & Related papers (2024-12-05T09:41:33Z)
- Learning Delays Through Gradients and Structure: Emergence of Spatiotemporal Patterns in Spiking Neural Networks [0.06752396542927405]
We present a Spiking Neural Network (SNN) model that incorporates learnable synaptic delays through two approaches.
In the latter approach, the network selects and prunes connections, optimizing the delays in sparse connectivity settings.
Our results demonstrate the potential of combining delay learning with dynamic pruning to develop efficient SNN models for temporal data processing (a generic differentiable-delay sketch follows this entry).
arXiv Detail & Related papers (2024-07-07T11:55:48Z)
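The entry above names learnable synaptic delays without describing their parameterization, so here is a generic differentiable-delay sketch for orientation only, not the authors' method: each feature learns a softmax weighting over its last max_delay timesteps, which acts as a soft, trainable delay. SoftDelay, max_delay, and temperature are hypothetical names.

```python
# Generic differentiable delay -- NOT the method of the paper above.
import torch
import torch.nn as nn


class SoftDelay(nn.Module):
    """Per-feature soft delay: a softmax over the last `max_delay` timesteps."""

    def __init__(self, features, max_delay=8, temperature=1.0):
        super().__init__()
        # logits[f, k] scores "feature f is delayed by k steps"
        self.logits = nn.Parameter(torch.zeros(features, max_delay))
        self.temperature = temperature

    def forward(self, x):                     # x: (T, batch, features)
        K = self.logits.shape[1]
        w = torch.softmax(self.logits / self.temperature, dim=-1)  # (features, K)
        # Left-pad with zeros so every timestep sees K past frames.
        xp = torch.cat([x.new_zeros(K - 1, *x.shape[1:]), x], dim=0)
        frames = xp.unfold(0, K, 1)           # (T, batch, features, K), ascending time
        frames = frames.flip(-1)              # frames[..., k] is now x[t - k]
        return (frames * w).sum(-1)           # soft-delayed signal, same shape as x
```

Lowering the temperature during training sharpens each softmax toward a hard integer delay, and pruning could then drop features whose delay weights stay near zero, loosely mirroring the delay-plus-pruning combination the entry describes.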
- P-SpikeSSM: Harnessing Probabilistic Spiking State Space Models for Long-Range Dependency Tasks [1.9775291915550175]
Spiking neural networks (SNNs) are posited as a computationally efficient and biologically plausible alternative to conventional neural architectures. We develop a scalable probabilistic spiking learning framework for long-range dependency tasks. Our models attain state-of-the-art performance among SNN models across diverse long-range dependency tasks.
arXiv Detail & Related papers (2024-06-05T04:23:11Z)
- Autaptic Synaptic Circuit Enhances Spatio-temporal Predictive Learning of Spiking Neural Networks [23.613277062707844]
Spiking Neural Networks (SNNs) emulate the leaky integrate-and-fire mechanism found in biological neurons.
Existing SNNs predominantly rely on the Leaky Integrate-and-Fire (LIF) model (a common formulation follows this entry).
This paper proposes a novel SpatioTemporal Circuit (STC) model.
arXiv Detail & Related papers (2024-06-01T11:17:27Z)
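Several entries in this list modify the LIF neuron named above. For orientation, a common discrete-time formulation (a textbook variant, not any single paper's exact model) is:

```latex
% A common discrete-time LIF formulation; the papers above each alter this baseline.
\begin{aligned}
u_i[t] &= \lambda\, u_i[t-1]\,\bigl(1 - s_i[t-1]\bigr) + \sum\nolimits_j w_{ij}\, x_j[t], \\
s_i[t] &= H\bigl(u_i[t] - \vartheta\bigr),
\end{aligned}
```

where u_i is the membrane potential, \lambda \in (0, 1) the leak, H the Heaviside step, \vartheta the firing threshold, and the (1 - s_i[t-1]) factor resets the membrane after a spike. Training through H is the non-differentiability issue addressed by surrogate gradients (see the sketch near the end of this list).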
- Advancing Spiking Neural Networks towards Multiscale Spatiotemporal Interaction Learning [10.702093960098106]
Spiking Neural Networks (SNNs) serve as an energy-efficient alternative to Artificial Neural Networks (ANNs).
We have designed a Spiking Multiscale Attention (SMA) module that captures multiscale spatiotemporal interaction information.
Our approach has achieved state-of-the-art results on mainstream neuromorphic datasets.
arXiv Detail & Related papers (2024-05-22T14:16:05Z)
- Unleashing the Potential of Spiking Neural Networks for Sequential Modeling with Contextual Embedding [32.25788551849627]
Brain-inspired spiking neural networks (SNNs) have struggled to match their biological counterpart in modeling long-term temporal relationships.
This paper presents a novel Contextual Embedding Leaky Integrate-and-Fire (CE-LIF) spiking neuron model.
arXiv Detail & Related papers (2023-08-29T09:33:10Z)
- TC-LIF: A Two-Compartment Spiking Neuron Model for Long-Term Sequential Modelling [54.97005925277638]
The identification of sensory cues associated with potential opportunities and dangers is frequently complicated by unrelated events that separate useful cues by long delays.
It remains a challenging task for state-of-the-art spiking neural networks (SNNs) to establish long-term temporal dependency between distant cues.
We propose a novel biologically inspired Two-Compartment Leaky Integrate-and-Fire spiking neuron model, dubbed TC-LIF (a generic two-compartment formulation follows this entry).
arXiv Detail & Related papers (2023-08-25T08:54:41Z)
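The TC-LIF entry above and the LSTM-LIF entry below both build on two-compartment neurons. As orientation only, a generic two-compartment formulation (not the specific TC-LIF or LSTM-LIF dynamics, which these summaries do not give) adds a slower, dendrite-like compartment d that couples into the somatic potential u:

```latex
% Generic two-compartment dynamics; TC-LIF and LSTM-LIF define their own coupling.
\begin{aligned}
d_i[t] &= \alpha\, d_i[t-1] + \beta\, u_i[t-1] + \sum\nolimits_j w_{ij}\, x_j[t], \\
u_i[t] &= \lambda\, u_i[t-1]\,\bigl(1 - s_i[t-1]\bigr) + \gamma\, d_i[t], \\
s_i[t] &= H\bigl(u_i[t] - \vartheta\bigr),
\end{aligned}
```

The slow trace in d can carry information across the long cue delays described above, which is the intuition behind using a second compartment for long-term dependency tasks.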
- Long Short-term Memory with Two-Compartment Spiking Neuron [64.02161577259426]
We propose a novel biologically inspired Long Short-Term Memory Leaky Integrate-and-Fire spiking neuron model, dubbed LSTM-LIF.
Our experimental results, on a diverse range of temporal classification tasks, demonstrate superior temporal classification capability, rapid training convergence, strong network generalizability, and high energy efficiency of the proposed LSTM-LIF model.
This work, therefore, opens up a myriad of opportunities for resolving challenging temporal processing tasks on emerging neuromorphic computing machines.
arXiv Detail & Related papers (2023-07-14T08:51:03Z)
- The Expressive Leaky Memory Neuron: an Efficient and Expressive Phenomenological Neuron Model Can Solve Long-Horizon Tasks [64.08042492426992]
We introduce the Expressive Leaky Memory (ELM) neuron model, a biologically inspired model of a cortical neuron.
Our ELM neuron can accurately match the input-output relationship of a detailed cortical neuron model with under ten thousand trainable parameters.
We evaluate it on various tasks with demanding temporal structures, including the Long Range Arena (LRA) datasets.
arXiv Detail & Related papers (2023-06-14T13:34:13Z)
- Training High-Performance Low-Latency Spiking Neural Networks by Differentiation on Spike Representation [70.75043144299168]
Spiking Neural Networks (SNNs) are a promising energy-efficient AI model when implemented on neuromorphic hardware.
Efficiently training SNNs is challenging due to the non-differentiability of spike generation (a generic illustration follows this entry).
We propose the Differentiation on Spike Representation (DSR) method, which achieves high performance.
arXiv Detail & Related papers (2022-05-01T12:44:49Z)
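The non-differentiability mentioned above is usually worked around with surrogate gradients; the sketch below illustrates that standard trick for context. It is not the DSR method itself, whose construction the summary does not give.

```python
# Standard surrogate-gradient trick, shown for context -- NOT the DSR method.
import torch


class SurrogateSpike(torch.autograd.Function):
    """Heaviside spike forward; smooth surrogate derivative backward."""

    @staticmethod
    def forward(ctx, v):
        ctx.save_for_backward(v)
        return (v >= 0).float()               # hard threshold: spike if v >= 0

    @staticmethod
    def backward(ctx, grad_output):
        (v,) = ctx.saved_tensors
        # Derivative of a fast sigmoid stands in for the Dirac delta of Heaviside.
        surrogate = 1.0 / (1.0 + 10.0 * v.abs()) ** 2
        return grad_output * surrogate


v = torch.randn(5, requires_grad=True)
SurrogateSpike.apply(v).sum().backward()
print(v.grad)                                 # nonzero thanks to the surrogate
```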
- Modeling from Features: a Mean-field Framework for Over-parameterized Deep Neural Networks [54.27962244835622]
This paper proposes a new mean-field framework for over-parameterized deep neural networks (DNNs).
In this framework, a DNN is represented by probability measures and functions over its features in the continuous limit.
We illustrate the framework via the standard DNN and the Residual Network (Res-Net) architectures.
arXiv Detail & Related papers (2020-07-03T01:37:16Z)
- Exploring weight initialization, diversity of solutions, and degradation in recurrent neural networks trained for temporal and decision-making tasks [0.0]
Recurrent Neural Networks (RNNs) are frequently used to model aspects of brain function and structure.
In this work, we trained small fully-connected RNNs to perform temporal and flow control tasks with time-varying stimuli.
arXiv Detail & Related papers (2019-06-03T21:56:48Z)
This list is automatically generated from the titles and abstracts of the papers on this site.