Learning in Spiking Neural Networks with a Calcium-based Hebbian Rule for Spike-timing-dependent Plasticity
- URL: http://arxiv.org/abs/2504.06796v1
- Date: Wed, 09 Apr 2025 11:39:59 GMT
- Title: Learning in Spiking Neural Networks with a Calcium-based Hebbian Rule for Spike-timing-dependent Plasticity
- Authors: Willian Soares Girão, Nicoletta Risi, Elisabetta Chicca
- Abstract summary: We present a Hebbian local learning rule that models synaptic modification as a function of calcium traces tracking neuronal activity. We show how our model is sensitive to correlated spiking activity and how this enables it to modulate the learning rate of the network without altering the mean firing rate of the neurons.
- Score: 0.46085106405479537
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Understanding how biological neural networks are shaped via local plasticity mechanisms can lead to energy-efficient and self-adaptive information processing systems, promising to mitigate some of the current roadblocks in edge computing. While biology uses spikes to seamlessly exploit both spike timing and mean firing rate to modulate synaptic strength, most models focus on only one of the two. In this work, we present a Hebbian local learning rule that models synaptic modification as a function of calcium traces tracking neuronal activity. We show how the rule reproduces results from spike-timing and spike-rate protocols from neuroscientific studies. Moreover, we use the model to train spiking neural networks on MNIST digit recognition to show and explain what sort of mechanisms are needed to learn real-world patterns. We show how our model is sensitive to correlated spiking activity and how this enables it to modulate the learning rate of the network without altering either the mean firing rate of the neurons or the hyperparameters of the learning rule. To the best of our knowledge, this is the first work that showcases how spike timing and rate can be complementary in their role of shaping the connectivity of spiking neural networks.
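To make the mechanism concrete, the following is a minimal Python sketch of a calcium-trace plasticity rule in the spirit of the abstract: each side of the synapse keeps an exponentially decaying calcium trace that jumps on spikes, and the Hebbian weight update is a function of those traces, making it sensitive to both spike timing and firing rate. The class name, time constants, and exact update function below are illustrative assumptions, not the authors' rule.

```python
import numpy as np

# Hypothetical illustration only: names, constants, and the update
# function are assumptions, not the paper's exact formulation.
class CalciumHebbianSynapse:
    """Synapse whose weight changes as a function of pre- and
    postsynaptic calcium traces tracking spiking activity."""

    def __init__(self, w=0.5, tau_ca=0.02, lr=1e-3, dt=1e-3):
        self.w = w                          # synaptic weight in [0, 1]
        self.ca_pre = 0.0                   # presynaptic calcium trace
        self.ca_post = 0.0                  # postsynaptic calcium trace
        self.decay = np.exp(-dt / tau_ca)   # per-step trace decay factor
        self.lr = lr                        # learning rate

    def step(self, pre_spike: bool, post_spike: bool) -> None:
        # Traces decay exponentially between spikes and jump on each
        # spike, so they integrate recent activity: isolated spikes leave
        # small transients, while high rates or correlated bursts
        # accumulate large calcium levels.
        self.ca_pre = self.ca_pre * self.decay + (1.0 if pre_spike else 0.0)
        self.ca_post = self.ca_post * self.decay + (1.0 if post_spike else 0.0)
        # Hebbian, timing-sensitive update: a postsynaptic spike that
        # finds residual presynaptic calcium potentiates (pre-before-post);
        # a presynaptic spike arriving after postsynaptic activity
        # depresses (post-before-pre).
        if post_spike:
            self.w += self.lr * self.ca_pre
        if pre_spike:
            self.w -= 0.5 * self.lr * self.ca_post
        self.w = float(np.clip(self.w, 0.0, 1.0))

# 100 pre-before-post pairings at a 5 ms lag (dt = 1 ms) should potentiate.
syn = CalciumHebbianSynapse()
for _ in range(100):
    syn.step(pre_spike=True, post_spike=False)
    for _ in range(4):
        syn.step(False, False)              # 4 ms gap
    syn.step(pre_spike=False, post_spike=True)
    for _ in range(45):
        syn.step(False, False)              # silence until next pairing
print(f"weight after pre->post pairings: {syn.w:.3f}")  # ends above 0.5
```

Because correlated pre/post activity deposits calcium on both traces at the same moments, the average update in a sketch like this grows with correlation even when mean firing rates and all constants are held fixed, which is the learning-rate modulation effect the abstract highlights.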
Related papers
- Allostatic Control of Persistent States in Spiking Neural Networks for perception and computation [79.16635054977068]
We introduce a novel model for updating perceptual beliefs about the environment by extending the concept of Allostasis to the control of internal representations.
In this paper, we focus on an application in numerical cognition, where a bump of activity in an attractor network is used as a spatial numerical representation.
arXiv Detail & Related papers (2025-03-20T12:28:08Z)
- Characterizing Learning in Spiking Neural Networks with Astrocyte-Like Units [0.0]
We introduce a spiking neural network model augmented with astrocyte-like units.
We show that the combination of neurons and astrocytes together, as opposed to neuron-only and astrocyte-only networks, is critical for driving learning.
arXiv Detail & Related papers (2025-03-09T22:36:58Z)
- Neural Models of Task Adaptation: A Tutorial on Spiking Networks for Executive Control [0.0]
This tutorial presents a step-by-step approach to constructing a spiking neural network (SNN) that simulates task-switching dynamics.
The model incorporates biologically realistic features, including lateral inhibition, adaptive synaptic weights, and precise parameterization within physiologically relevant ranges.
By following this tutorial, researchers can develop and extend biologically inspired SNN models for studying cognitive processes and neural adaptation.
arXiv Detail & Related papers (2025-03-05T00:44:34Z)
- Unsupervised representation learning with Hebbian synaptic and structural plasticity in brain-like feedforward neural networks [0.0]
We introduce and evaluate a brain-like neural network model capable of unsupervised representation learning.
The model was tested on a diverse set of popular machine learning benchmarks.
arXiv Detail & Related papers (2024-06-07T08:32:30Z)
- Temporal Conditioning Spiking Latent Variable Models of the Neural Response to Natural Visual Scenes [29.592870472342337]
This work presents the temporal conditioning spiking latent variable models (TeCoS-LVM) to simulate the neural response to natural visual stimuli.
We use spiking neurons to produce spike outputs that directly match the recorded spike trains.
We show that TeCoS-LVM models can produce more realistic spike activities and accurately fit spike statistics than powerful alternatives.
arXiv Detail & Related papers (2023-06-21T06:30:18Z)
- How neural networks learn to classify chaotic time series [77.34726150561087]
We study the inner workings of neural networks trained to classify regular-versus-chaotic time series.
We find that the relation between input periodicity and activation periodicity is key for the performance of LKCNN models.
arXiv Detail & Related papers (2023-06-04T08:53:27Z)
- Unsupervised Spiking Neural Network Model of Prefrontal Cortex to study Task Switching with Synaptic deficiency [0.0]
We build a computational model of the Prefrontal Cortex (PFC) using Spiking Neural Networks (SNNs).
In this study, we use SNNs with parameters close to biologically plausible values and train the model using the unsupervised Spike-Timing-Dependent Plasticity (STDP) learning rule.
arXiv Detail & Related papers (2023-05-23T05:59:54Z)
- Contrastive-Signal-Dependent Plasticity: Self-Supervised Learning in Spiking Neural Circuits [61.94533459151743]
This work addresses the challenge of designing neurobiologically-motivated schemes for adjusting the synapses of spiking networks.
Our experimental simulations demonstrate a consistent advantage over other biologically-plausible approaches when training recurrent spiking networks.
arXiv Detail & Related papers (2023-03-30T02:40:28Z)
- Spiking neural network for nonlinear regression [68.8204255655161]
Spiking neural networks carry the potential for a massive reduction in memory and energy consumption.
They introduce temporal and neuronal sparsity, which can be exploited by next-generation neuromorphic hardware.
A framework for regression using spiking neural networks is proposed.
arXiv Detail & Related papers (2022-10-06T13:04:45Z)
- Data-driven emergence of convolutional structure in neural networks [83.4920717252233]
We show how fully-connected neural networks solving a discrimination task can learn a convolutional structure directly from their inputs.
By carefully designing data models, we show that the emergence of this pattern is triggered by the non-Gaussian, higher-order local structure of the inputs.
arXiv Detail & Related papers (2022-02-01T17:11:13Z)
- Training Deep Spiking Auto-encoders without Bursting or Dying Neurons through Regularization [9.34612743192798]
Spiking neural networks are a promising approach towards next-generation models of the brain in computational neuroscience.
We apply end-to-end learning with membrane potential-based backpropagation to a spiking convolutional auto-encoder.
We show that applying regularization on membrane potential and spiking output successfully avoids both dead and bursting neurons.
arXiv Detail & Related papers (2021-09-22T21:27:40Z)
- Towards Efficient Processing and Learning with Spikes: New Approaches for Multi-Spike Learning [59.249322621035056]
We propose two new multi-spike learning rules that outperform other baselines on various tasks.
In the feature detection task, we re-examine the ability of unsupervised STDP and present its limitations.
Our proposed learning rules can reliably solve the task over a wide range of conditions without specific constraints being applied.
arXiv Detail & Related papers (2020-05-02T06:41:20Z)