SpikePropamine: Differentiable Plasticity in Spiking Neural Networks
- URL: http://arxiv.org/abs/2106.02681v1
- Date: Fri, 4 Jun 2021 19:29:07 GMT
- Title: SpikePropamine: Differentiable Plasticity in Spiking Neural Networks
- Authors: Samuel Schmidgall, Julia Ashkanazy, Wallace Lawson, Joe Hays
- Abstract summary: We introduce a framework for learning the dynamics of synaptic plasticity and neuromodulated synaptic plasticity in Spiking Neural Networks (SNNs).
We show that SNNs augmented with differentiable plasticity are sufficient for solving a set of challenging temporal learning tasks.
These networks are also shown to be capable of producing locomotion on a high-dimensional robotic learning task.
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: The adaptive changes in synaptic efficacy that occur between spiking neurons
have been demonstrated to play a critical role in learning for biological
neural networks. Despite this source of inspiration, many learning-focused
applications using Spiking Neural Networks (SNNs) retain static synaptic
connections, preventing additional learning after the initial training period.
Here, we introduce a framework for simultaneously learning the underlying
fixed-weights and the rules governing the dynamics of synaptic plasticity and
neuromodulated synaptic plasticity in SNNs through gradient descent. We further
demonstrate the capabilities of this framework on a series of challenging
benchmarks, learning the parameters of several plasticity rules including BCM,
Oja's, and their respective set of neuromodulatory variants. The experimental
results show that SNNs augmented with differentiable plasticity are
sufficient for solving a set of challenging temporal learning tasks that a
traditional SNN fails to solve, even in the presence of significant noise.
These networks are also shown to be capable of producing locomotion on a
high-dimensional robotic learning task, where near-minimal degradation in
performance is observed in the presence of novel conditions not seen during the
initial training period.
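The abstract names BCM and Oja's rule among the plasticity rules whose parameters are learned. As a point of reference, the classical rate-based forms of these rules can be sketched as below; the rate-based formulation, the toy 2-D input distribution, and all parameter values are illustrative assumptions, not the paper's actual spiking implementation:

```python
import numpy as np

def oja_update(w, x, eta):
    """One step of Oja's rule: dw = eta * y * (x - y * w)."""
    y = w @ x
    return w + eta * y * (x - y * w)

def bcm_update(w, x, theta, eta):
    """One step of the BCM rule: dw = eta * y * (y - theta) * x,
    where theta is the sliding modification threshold."""
    y = w @ x
    return w + eta * y * (y - theta) * x

rng = np.random.default_rng(0)
# Correlated 2-D inputs whose leading principal direction is (1, 1)/sqrt(2).
C = np.array([[1.0, 0.9], [0.9, 1.0]])
L = np.linalg.cholesky(C)

w = rng.normal(size=2) * 0.1
for _ in range(5000):
    x = L @ rng.normal(size=2)
    w = oja_update(w, x, eta=0.01)

# Oja's rule self-normalizes: ||w|| settles near 1 while w aligns
# with the leading eigenvector of the input covariance.
print(f"||w|| ~= {np.linalg.norm(w):.2f}")
```

In the paper, rules of this family act on spiking activity, and their coefficients (and neuromodulatory gains) are themselves optimized by gradient descent alongside the fixed weights; the sketch only illustrates the underlying rule dynamics, here Oja's self-normalizing convergence to the principal component.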
Related papers
- Contrastive Learning in Memristor-based Neuromorphic Systems [55.11642177631929]
Spiking neural networks have become an important family of neuron-based models that sidestep many of the key limitations facing modern-day backpropagation-trained deep networks.
In this work, we design and investigate a proof-of-concept instantiation of contrastive-signal-dependent plasticity (CSDP), a neuromorphic form of forward-forward-based, backpropagation-free learning.
arXiv Detail & Related papers (2024-09-17T04:48:45Z)
- Spatio-temporal Structure of Excitation and Inhibition Emerges in Spiking Neural Networks with and without Biologically Plausible Constraints [0.06752396542927405]
We present a Spiking Neural Network (SNN) model that incorporates learnable synaptic delays.
We implement a dynamic pruning strategy that combines DEEP R for connection removal and RigL for connection reintroduction.
We observed that spatio-temporal patterns of excitation and inhibition appeared in the more biologically plausible model as well.
arXiv Detail & Related papers (2024-07-07T11:55:48Z)
- Enhancing learning in spiking neural networks through neuronal heterogeneity and neuromodulatory signaling [52.06722364186432]
We propose a biologically-informed framework for enhancing artificial neural networks (ANNs).
Our proposed dual-framework approach highlights the potential of spiking neural networks (SNNs) for emulating diverse spiking behaviors.
We outline how the proposed approach integrates brain-inspired compartmental models and task-driven SNNs, balancing bioinspiration and complexity.
arXiv Detail & Related papers (2024-07-05T14:11:28Z)
- Context Gating in Spiking Neural Networks: Achieving Lifelong Learning through Integration of Local and Global Plasticity [20.589970453110208]
Humans learn multiple tasks in succession with minimal mutual interference, through the context gating mechanism in the prefrontal cortex (PFC).
We propose an SNN with context gating trained by a local plasticity rule (CG-SNN) for lifelong learning.
Experiments show that the proposed model is effective in maintaining the past learning experience and has better task-selectivity than other methods during lifelong learning.
arXiv Detail & Related papers (2024-06-04T01:35:35Z)
- Contrastive-Signal-Dependent Plasticity: Self-Supervised Learning in Spiking Neural Circuits [61.94533459151743]
This work addresses the challenge of designing neurobiologically-motivated schemes for adjusting the synapses of spiking networks.
Our experimental simulations demonstrate a consistent advantage over other biologically-plausible approaches when training recurrent spiking networks.
arXiv Detail & Related papers (2023-03-30T02:40:28Z)
- Modeling Associative Plasticity between Synapses to Enhance Learning of Spiking Neural Networks [4.736525128377909]
Spiking Neural Networks (SNNs) are the third generation of artificial neural networks that enable energy-efficient implementation on neuromorphic hardware.
We propose a robust and effective learning mechanism by modeling the associative plasticity between synapses.
Our approaches achieve superior performance on static and state-of-the-art neuromorphic datasets.
arXiv Detail & Related papers (2022-07-24T06:12:23Z)
- A Synapse-Threshold Synergistic Learning Approach for Spiking Neural Networks [1.8556712517882232]
Spiking neural networks (SNNs) have demonstrated excellent capabilities in various intelligent scenarios.
In this study, we develop a novel synergistic learning approach that involves simultaneously training synaptic weights and spike thresholds in SNNs.
arXiv Detail & Related papers (2022-06-10T06:41:36Z)
- A Spiking Neuron Synaptic Plasticity Model Optimized for Unsupervised Learning [0.0]
Spiking neural networks (SNNs) are considered a promising basis for performing all kinds of learning tasks: unsupervised, supervised, and reinforcement learning.
Learning in SNNs is implemented through synaptic plasticity: the rules that determine the dynamics of synaptic weights, usually depending on the activity of the pre- and post-synaptic neurons.
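A textbook example of such an activity-dependent rule is pair-based spike-timing-dependent plasticity (STDP), where the weight change depends on the relative timing of pre- and post-synaptic spikes; the parameter values below are conventional illustrative choices, not taken from this paper:

```python
import numpy as np

def stdp_dw(delta_t, a_plus=0.01, a_minus=0.012, tau_plus=20.0, tau_minus=20.0):
    """Pair-based STDP weight change for one pre/post spike pair.

    delta_t = t_post - t_pre in milliseconds: a pre spike shortly before a
    post spike potentiates the synapse; the reverse ordering depresses it,
    with exponentially decaying windows tau_plus / tau_minus.
    """
    if delta_t > 0:
        return a_plus * np.exp(-delta_t / tau_plus)
    return -a_minus * np.exp(delta_t / tau_minus)

# Causal pairing (pre before post) strengthens; anti-causal weakens,
# and the effect decays with the time lag between the spikes.
print(stdp_dw(10.0), stdp_dw(-10.0))
```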
arXiv Detail & Related papers (2021-11-12T15:26:52Z)
- Dynamic Neural Diversification: Path to Computationally Sustainable Neural Networks [68.8204255655161]
Small neural networks with a constrained number of trainable parameters, can be suitable resource-efficient candidates for many simple tasks.
We explore the diversity of the neurons within the hidden layer during the learning process.
We analyze how the diversity of the neurons affects predictions of the model.
arXiv Detail & Related papers (2021-09-20T15:12:16Z)
- Artificial Neural Variability for Deep Learning: On Overfitting, Noise Memorization, and Catastrophic Forgetting [135.0863818867184]
Artificial neural variability (ANV) helps artificial neural networks learn some advantages from "natural" neural networks.
ANV plays as an implicit regularizer of the mutual information between the training data and the learned model.
It can effectively relieve overfitting, label noise memorization, and catastrophic forgetting at negligible costs.
arXiv Detail & Related papers (2020-11-12T06:06:33Z)
- Progressive Tandem Learning for Pattern Recognition with Deep Spiking Neural Networks [80.15411508088522]
Spiking neural networks (SNNs) have shown advantages over traditional artificial neural networks (ANNs) for low latency and high computational efficiency.
We propose a novel ANN-to-SNN conversion and layer-wise learning framework for rapid and efficient pattern recognition.
arXiv Detail & Related papers (2020-07-02T15:38:44Z)
This list is automatically generated from the titles and abstracts of the papers in this site.