Axonal Delay As a Short-Term Memory for Feed Forward Deep Spiking Neural
Networks
- URL: http://arxiv.org/abs/2205.02115v1
- Date: Wed, 20 Apr 2022 16:56:42 GMT
- Title: Axonal Delay As a Short-Term Memory for Feed Forward Deep Spiking Neural
Networks
- Authors: Pengfei Sun, Longwei Zhu and Dick Botteldooren
- Abstract summary: Recent studies have found that the time delay of neurons plays an important role in the learning process.
Configuring the precise timing of spikes is a promising direction for understanding and improving the transmission of temporal information in SNNs.
This paper verifies the effectiveness of integrating time delay into supervised learning and proposes a module that modulates the axonal delay through short-term memory.
- Score: 3.985532502580783
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Information in spiking neural networks (SNNs) is propagated between
adjacent neurons by spikes, which provides a computing paradigm with the
promise of simulating the human brain. Recent studies have found that the
time delay of neurons plays an important role in the learning process.
Configuring the precise timing of spikes is therefore a promising direction
for understanding and improving the transmission of temporal information in
SNNs. However, most existing learning methods for spiking neurons focus on
adjusting synaptic weights, while very little research has addressed axonal
delay. In this paper, we verify the effectiveness of integrating time delay
into supervised learning and propose a module that modulates the axonal delay
through short-term memory. To this end, a rectified axonal delay (RAD) module
is integrated with the spiking model to align spike timing and thus improve
the network's ability to learn temporal features. Experiments on three
neuromorphic benchmark datasets (NMNIST, DVS Gesture, and N-TIDIGITS18) show
that the proposed method achieves state-of-the-art performance while using
the fewest parameters.
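The abstract does not give implementation details for the RAD module. Purely as a hedged orientation, the sketch below shows one common way to realize a learnable, rectified (non-negative) per-neuron axonal delay in PyTorch by shifting spike trains along the time axis; the class name, the maximum-delay cap, and the integer rounding are illustrative assumptions, not details taken from the paper.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class LearnableAxonalDelay(nn.Module):
    """Hypothetical sketch: per-neuron axonal delay, rectified to be
    non-negative and rounded to whole simulation time steps."""

    def __init__(self, num_neurons: int, max_delay: int = 8):
        super().__init__()
        self.max_delay = max_delay
        # One learnable delay per neuron, initialised at zero.
        self.delay = nn.Parameter(torch.zeros(num_neurons))

    def forward(self, spikes: torch.Tensor) -> torch.Tensor:
        # spikes: (batch, time, neurons), entries in {0, 1}
        # Rectify (ReLU) and cap the delays, then round to whole steps.
        # Note: rounding blocks gradients; training would need a
        # straight-through or surrogate estimator for this step.
        d = torch.clamp(F.relu(self.delay), max=self.max_delay)
        steps = d.round().long()
        batch, T, n = spikes.shape
        out = torch.zeros_like(spikes)
        for i in range(n):
            s = int(steps[i])
            if s < T:
                # Shift neuron i's spike train s steps into the future.
                out[:, s:, i] = spikes[:, : T - s, i]
        return out

# Usage: delay a random spike train of 10 neurons over 20 steps.
layer = LearnableAxonalDelay(num_neurons=10)
x = (torch.rand(4, 20, 10) < 0.1).float()
y = layer(x)  # same shape, each neuron shifted by its own delay
```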
Related papers
- Zero-Shot Temporal Resolution Domain Adaptation for Spiking Neural Networks [3.2366933261812076]
Spiking Neural Networks (SNNs) are biologically-inspired deep neural networks that efficiently extract temporal information.
SNN model parameters are sensitive to temporal resolution, leading to significant performance drops when the temporal resolution of the target data at the edge differs from that used during training.
We propose three novel domain adaptation methods for adapting neuron parameters to account for the change in time resolution without re-training on the target time resolution (a hedged background sketch follows this entry).
arXiv Detail & Related papers (2024-11-07T14:58:51Z)
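The three adaptation methods themselves are not described in the summary above. As general background only: a discretized leaky integrate-and-fire (LIF) neuron uses a membrane decay factor that depends on the simulation step size, so when the time resolution changes at deployment, the decay can be recomputed from the resolution-independent time constant rather than re-trained. A minimal sketch of this textbook relation (not the paper's methods):

```python
import math

def decay_factor(tau_mem: float, dt: float) -> float:
    """Discrete-time leak for a LIF membrane with time constant
    tau_mem simulated at step dt: v[t+1] = alpha * v[t] + I[t]."""
    return math.exp(-dt / tau_mem)

# Trained at 1 ms resolution, deployed at 2 ms resolution:
tau = 20.0                               # membrane time constant (ms)
alpha_train = decay_factor(tau, dt=1.0)
alpha_deploy = decay_factor(tau, dt=2.0)

# Equivalently, rescale the trained decay without knowing tau:
# alpha_new = alpha_old ** (dt_new / dt_old)
assert abs(alpha_deploy - alpha_train ** 2.0) < 1e-12
```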
- Learning Delays Through Gradients and Structure: Emergence of Spatiotemporal Patterns in Spiking Neural Networks [0.06752396542927405]
We present a Spiking Neural Network (SNN) model that incorporates learnable synaptic delays through two approaches.
In the latter approach, the network selects and prunes connections, optimizing the delays in sparse connectivity settings.
Our results demonstrate the potential of combining delay learning with dynamic pruning to develop efficient SNN models for temporal data processing.
arXiv Detail & Related papers (2024-07-07T11:55:48Z)
- DelGrad: Exact gradients in spiking networks for learning transmission delays and weights [0.9411751957919126]
Spiking neural networks (SNNs) inherently rely on the timing of signals for representing and processing information.
Recent work has demonstrated the substantial advantages of learning these delays along with synaptic weights.
We propose an analytical approach for calculating exact loss gradients with respect to both synaptic weights and delays in an event-based fashion.
arXiv Detail & Related papers (2024-04-30T00:02:34Z)
- Long Short-term Memory with Two-Compartment Spiking Neuron [64.02161577259426]
We propose a novel biologically inspired Long Short-Term Memory Leaky Integrate-and-Fire spiking neuron model, dubbed LSTM-LIF.
Our experimental results, on a diverse range of temporal classification tasks, demonstrate superior temporal classification capability, rapid training convergence, strong network generalizability, and high energy efficiency of the proposed LSTM-LIF model.
This work, therefore, opens up a myriad of opportunities for resolving challenging temporal processing tasks on emerging neuromorphic computing machines.
arXiv Detail & Related papers (2023-07-14T08:51:03Z)
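The summary above gives no equations for LSTM-LIF. For orientation only, here is a generic two-compartment leaky integrate-and-fire update, in which a slowly decaying dendritic compartment feeds a faster somatic one; the coupling scheme and all constants are illustrative assumptions, not the LSTM-LIF model itself.

```python
import numpy as np

def two_compartment_lif(inputs, alpha_d=0.98, alpha_s=0.9,
                        couple=0.3, threshold=1.0):
    """Generic two-compartment LIF: a slow dendritic compartment
    integrates input and drives a faster somatic compartment that
    emits spikes. Illustrative sketch only."""
    v_d, v_s = 0.0, 0.0
    spikes = []
    for x in inputs:
        v_d = alpha_d * v_d + x             # dendrite: long memory
        v_s = alpha_s * v_s + couple * v_d  # soma: driven by dendrite
        s = 1.0 if v_s >= threshold else 0.0
        v_s -= s * threshold                # soft reset after a spike
        spikes.append(s)
    return np.array(spikes)

# A brief input pulse keeps eliciting activity while the slow
# dendritic compartment discharges -- a simple short-term memory.
out = two_compartment_lif([1.0, 1.0] + [0.0] * 8)
```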
- The Expressive Leaky Memory Neuron: an Efficient and Expressive Phenomenological Neuron Model Can Solve Long-Horizon Tasks [64.08042492426992]
We introduce the Expressive Leaky Memory (ELM) neuron model, a biologically inspired model of a cortical neuron.
Our ELM neuron can accurately match the input-output relationship of a detailed cortical neuron model with under ten thousand trainable parameters.
We evaluate it on various tasks with demanding temporal structures, including the Long Range Arena (LRA) datasets.
arXiv Detail & Related papers (2023-06-14T13:34:13Z)
- Reducing Catastrophic Forgetting in Self Organizing Maps with Internally-Induced Generative Replay [67.50637511633212]
A lifelong learning agent is able to continually learn from potentially infinite streams of pattern sensory data.
One major historic difficulty in building agents that adapt is that neural systems struggle to retain previously-acquired knowledge when learning from new samples.
This problem is known as catastrophic forgetting (interference) and remains an unsolved problem in the domain of machine learning to this day.
arXiv Detail & Related papers (2021-12-09T07:11:14Z)
- Backpropagation with Biologically Plausible Spatio-Temporal Adjustment For Training Deep Spiking Neural Networks [5.484391472233163]
The success of deep learning is inseparable from backpropagation.
We first propose a biologically plausible spatial adjustment, which rethinks the relationship between membrane potential and spikes.
Second, we propose a biologically plausible temporal adjustment that lets the error propagate across spikes in the temporal dimension (a generic surrogate-gradient sketch follows this entry).
arXiv Detail & Related papers (2021-10-17T15:55:51Z)
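The spatial and temporal adjustments themselves are not specified in the entry above. As standard background for this family of methods, training through the non-differentiable spike usually relies on a surrogate derivative; the PyTorch sketch below shows a generic surrogate-gradient spike function (a common baseline technique, not this paper's specific method).

```python
import torch

class SurrogateSpike(torch.autograd.Function):
    """Heaviside spike in the forward pass; a smooth fast-sigmoid
    surrogate derivative in the backward pass."""

    @staticmethod
    def forward(ctx, v_minus_thresh):
        ctx.save_for_backward(v_minus_thresh)
        return (v_minus_thresh >= 0).float()

    @staticmethod
    def backward(ctx, grad_output):
        (v,) = ctx.saved_tensors
        # d(spike)/dv approximated by 1 / (1 + |v|)^2
        return grad_output / (1.0 + v.abs()) ** 2

# Usage: gradients flow through the surrogate, not the step function.
v = torch.randn(5, requires_grad=True)
spikes = SurrogateSpike.apply(v - 1.0)  # threshold at 1.0
spikes.sum().backward()
```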
- Training Feedback Spiking Neural Networks by Implicit Differentiation on the Equilibrium State [66.2457134675891]
Spiking neural networks (SNNs) are brain-inspired models that enable energy-efficient implementation on neuromorphic hardware.
Most existing methods imitate the backpropagation framework and feedforward architectures for artificial neural networks.
We propose a novel training method that does not rely on the exact reverse of the forward computation.
arXiv Detail & Related papers (2021-09-29T07:46:54Z)
- Astrocytes mediate analogous memory in a multi-layer neuron-astrocytic network [52.77024349608834]
We show how a piece of information can be maintained as a robust activity pattern for several seconds and then completely disappear if no other stimuli arrive.
This kind of short-term memory can keep operative information for seconds, then completely forget it to avoid overlapping with forthcoming patterns.
We show how arbitrary patterns can be loaded, then stored for a certain interval of time, and retrieved if the appropriate clue pattern is applied to the input.
arXiv Detail & Related papers (2021-08-31T16:13:15Z)
- Bio-plausible Unsupervised Delay Learning for Extracting Temporal Features in Spiking Neural Networks [0.548253258922555]
The plasticity of the conduction delay between neurons plays a fundamental role in learning.
Understanding the precise adjustment of synaptic delays could help us in developing effective brain-inspired computational models.
arXiv Detail & Related papers (2020-11-18T16:25:32Z)
- Rectified Linear Postsynaptic Potential Function for Backpropagation in Deep Spiking Neural Networks [55.0627904986664]
Spiking Neural Networks (SNNs) use temporal spike patterns to represent and transmit information, which is not only biologically realistic but also suitable for ultra-low-power, event-driven neuromorphic implementation.
This paper investigates the contribution of spike timing dynamics to information encoding, synaptic plasticity, and decision making, providing a new perspective on the design of future deep SNNs and neuromorphic hardware systems (a hedged sketch of such a PSP kernel follows this list).
arXiv Detail & Related papers (2020-03-26T11:13:07Z)
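Reading the last title literally, a rectified linear postsynaptic potential lets the membrane potential grow linearly with time after each presynaptic spike, which keeps spike-time gradients simple. A minimal sketch under that reading; the kernel shape and function names are assumptions, not taken from the paper.

```python
import numpy as np

def rel_psp(t, t_spike):
    """Rectified linear PSP: zero before the presynaptic spike,
    growing linearly afterwards."""
    return np.maximum(0.0, t - t_spike)

def membrane_potential(t, spike_times, weights):
    """Membrane potential as a weighted sum of ReL-PSP kernels."""
    return sum(w * rel_psp(t, ts) for w, ts in zip(weights, spike_times))

# Three presynaptic spikes; potential sampled at t = 5.0.
v = membrane_potential(5.0, spike_times=[1.0, 2.5, 4.0],
                       weights=[0.5, -0.2, 0.8])
```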
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.