Voltage-Dependent Synaptic Plasticity (VDSP): Unsupervised probabilistic
Hebbian plasticity rule based on neurons membrane potential
- URL: http://arxiv.org/abs/2203.11022v1
- Date: Mon, 21 Mar 2022 14:39:02 GMT
- Title: Voltage-Dependent Synaptic Plasticity (VDSP): Unsupervised probabilistic
Hebbian plasticity rule based on neurons membrane potential
- Authors: Nikhil Garg, Ismael Balafrej, Terrence C. Stewart, Jean-Michel Portal,
Marc Bocquet, Damien Querlioz, Dominique Drouin, Jean Rouat, Yann Beilliard,
Fabien Alibart
- Abstract summary: We propose a brain-inspired unsupervised local learning rule for the online implementation of Hebb's plasticity mechanism on neuromorphic hardware.
The proposed VDSP learning rule updates the synaptic conductance on the spike of the postsynaptic neuron only.
We report 85.01 $\pm$ 0.76% (Mean $\pm$ S.D.) accuracy for a network of 100 output neurons on the MNIST dataset.
- Score: 5.316910132506153
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: This study proposes voltage-dependent synaptic plasticity (VDSP), a
novel brain-inspired unsupervised local learning rule for the online
implementation of Hebb's plasticity mechanism on neuromorphic hardware. The
proposed VDSP learning rule updates the synaptic conductance only on the spike
of the postsynaptic neuron, which halves the number of updates relative to
standard spike-timing-dependent plasticity (STDP). The update depends on the
membrane potential of the presynaptic neuron, which is readily available as
part of the neuron implementation and hence requires no additional memory.
Moreover, the update is regularized by the synaptic weight itself, which
prevents weights from exploding or vanishing under repeated stimulation. A
rigorous mathematical analysis draws an equivalence between VDSP and STDP. To
validate the system-level performance of VDSP, we train a single-layer spiking
neural network (SNN) to recognize handwritten digits. We report 85.01 $\pm$
0.76% (Mean $\pm$ S.D.) accuracy for a network of 100 output neurons on the
MNIST dataset. The performance improves when scaling the network size (89.93
$\pm$ 0.41% for 400 output neurons, 90.56 $\pm$ 0.27% for 500 neurons), which
validates the applicability of the proposed learning rule to large-scale
computer vision tasks. Interestingly, the learning rule adapts better than
STDP to the frequency of the input signal and does not require hand-tuning of
hyperparameters.
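For intuition on the VDSP-STDP equivalence claimed above, consider a leaky
integrate-and-fire presynaptic neuron. Under simplifying assumptions (pure
exponential relaxation toward rest with time constant $\tau_m$, no input
current between spikes; a sketch for illustration, not the paper's actual
derivation), the presynaptic membrane potential sampled at the postsynaptic
spike time $t_{post}$ encodes the time elapsed since the last presynaptic
spike at $t_{pre}$:

$$ v_{pre}(t_{post}) = v_{rest} + (v_{reset} - v_{rest})\,
e^{-(t_{post}-t_{pre})/\tau_m} \;\Longrightarrow\; \Delta t =
t_{post}-t_{pre} = \tau_m \ln \frac{v_{reset}-v_{rest}}
{v_{pre}(t_{post})-v_{rest}} $$

Reading $v_{pre}$ at each postsynaptic spike therefore recovers the same
$\Delta t$ that STDP's exponential window depends on, without storing
presynaptic spike times. A minimal Python sketch of an update in this spirit
follows; the sign convention (membrane below rest signals a recent presynaptic
spike, hence potentiation; above rest, an imminent one, hence depression), the
soft weight bounds, and all names (eta, w_min, w_max, v_rest) are illustrative
assumptions consistent with the abstract, not the paper's exact equations.

    def vdsp_update(w, v_pre, v_rest=0.0, w_min=0.0, w_max=1.0, eta=0.01):
        """Illustrative VDSP-style update, applied only when the
        postsynaptic neuron spikes (hence roughly half the updates of
        STDP, which also acts on presynaptic spikes)."""
        if v_pre < v_rest:
            # Membrane below rest: the presynaptic neuron fired recently
            # (it was reset), i.e. pre-before-post -> potentiate.
            # The (w_max - w) factor soft-bounds growth so repeated
            # stimulation cannot make the weight explode.
            dw = eta * (v_rest - v_pre) * (w_max - w)
        else:
            # Membrane above rest: the presynaptic neuron is charging
            # and has not fired yet, i.e. post-before-pre -> depress.
            # The (w - w_min) factor keeps the weight from vanishing.
            dw = -eta * (v_pre - v_rest) * (w - w_min)
        return max(w_min, min(w_max, w + dw))

On each postsynaptic spike one would call vdsp_update for every incoming
synapse, reading v_pre directly from the presynaptic neuron's state; no
per-synapse spike-time trace is needed, which is the memory saving the
abstract highlights.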
Related papers
- Contrastive Learning in Memristor-based Neuromorphic Systems [55.11642177631929]
Spiking neural networks have become an important family of neuron-based models that sidestep many of the key limitations facing modern-day backpropagation-trained deep networks.
In this work, we design and investigate a proof-of-concept instantiation of contrastive-signal-dependent plasticity (CSDP), a neuromorphic form of forward-forward-based, backpropagation-free learning.
arXiv Detail & Related papers (2024-09-17T04:48:45Z)
- Synchronized Stepwise Control of Firing and Learning Thresholds in a Spiking Randomly Connected Neural Network toward Hardware Implementation [0.0]
We propose hardware-oriented models of intrinsic plasticity (IP) and synaptic plasticity (SP) for a spiking randomly connected neural network (RNN).
We demonstrate the effectiveness of our model through simulations of temporal data learning and anomaly detection with a spiking RNN using publicly available electrocardiograms.
arXiv Detail & Related papers (2024-04-26T08:26:10Z)
- Single Neuromorphic Memristor closely Emulates Multiple Synaptic Mechanisms for Energy Efficient Neural Networks [71.79257685917058]
We demonstrate memristive nano-devices based on SrTiO3 that inherently emulate multiple synaptic functions.
These memristors operate in a non-filamentary, low-conductance regime, which enables stable and energy-efficient operation.
arXiv Detail & Related papers (2024-02-26T15:01:54Z) - WaLiN-GUI: a graphical and auditory tool for neuron-based encoding [73.88751967207419]
Neuromorphic computing relies on spike-based, energy-efficient communication.
We develop a tool to identify suitable configurations for neuron-based encoding of sample-based data into spike trains.
The WaLiN-GUI is provided open source, with documentation.
arXiv Detail & Related papers (2023-10-25T20:34:08Z) - The Expressive Leaky Memory Neuron: an Efficient and Expressive Phenomenological Neuron Model Can Solve Long-Horizon Tasks [64.08042492426992]
We introduce the Expressive Leaky Memory (ELM) neuron model, a biologically inspired model of a cortical neuron.
Our ELM neuron can accurately match the cortical neuron's input-output relationship with under ten thousand trainable parameters.
We evaluate it on various tasks with demanding temporal structures, including the Long Range Arena (LRA) datasets.
arXiv Detail & Related papers (2023-06-14T13:34:13Z) - Synaptic Stripping: How Pruning Can Bring Dead Neurons Back To Life [0.0]
We introduce Synaptic Stripping as a means to combat the dead neuron problem.
By automatically removing problematic connections during training, we can regenerate dead neurons.
We conduct several ablation studies to investigate these dynamics as a function of network width and depth.
arXiv Detail & Related papers (2023-02-11T23:55:50Z) - Gradient-based Neuromorphic Learning on Dynamical RRAM Arrays [3.5969667977870796]
We present MEMprop, the adoption of gradient-based learning to train fully memristive spiking neural networks (MSNNs).
Our approach harnesses intrinsic device dynamics to trigger naturally arising voltage spikes.
We obtain highly competitive accuracy amongst previously reported lightweight, dense MSNNs on several benchmarks.
arXiv Detail & Related papers (2022-06-26T23:13:34Z) - A Fully Memristive Spiking Neural Network with Unsupervised Learning [2.8971214387667494]
The system is fully memristive in that both neuronal and synaptic dynamics can be realized by using memristors.
The proposed MSNN implements STDP learning through cumulative weight changes in the memristive synapses, driven by the voltage waveforms across them.
arXiv Detail & Related papers (2022-03-02T21:16:46Z)
- Improving Spiking Neural Network Accuracy Using Time-based Neurons [0.24366811507669117]
Research on neuromorphic computing systems based on low-power spiking neural networks using analog neurons is in the spotlight.
As technology scales down, analog neurons are difficult to scale, and they suffer from reduced voltage headroom/dynamic range and circuit nonlinearities.
This paper first models the nonlinear behavior of existing current-mirror-based voltage-domain neurons designed in a 28nm process, and shows that SNN inference accuracy can be severely degraded by the neuron's nonlinearity.
We propose a novel neuron, which processes incoming spikes in the time domain and greatly improves the linearity, thereby improving the inference accuracy compared to existing voltage-domain neurons.
arXiv Detail & Related papers (2022-01-05T00:24:45Z)
- Flexible Transmitter Network [84.90891046882213]
Current neural networks are mostly built upon the MP (McCulloch-Pitts) model, which usually formulates the neuron as executing an activation function on the real-valued weighted aggregation of signals received from other neurons.
We propose the Flexible Transmitter (FT) model, a novel bio-plausible neuron model with flexible synaptic plasticity.
We present the Flexible Transmitter Network (FTNet), which is built on the most common fully-connected feed-forward architecture.
arXiv Detail & Related papers (2020-04-08T06:55:12Z)
- Non-linear Neurons with Human-like Apical Dendrite Activations [81.18416067005538]
We show that a standard neuron followed by our novel apical dendrite activation (ADA) can learn the XOR logical function with 100% accuracy.
We conduct experiments on six benchmark data sets from computer vision, signal processing and natural language processing.
arXiv Detail & Related papers (2020-02-02T21:09:39Z)