SPICEprop: Backpropagating Errors Through Memristive Spiking Neural
Networks
- URL: http://arxiv.org/abs/2203.01426v1
- Date: Wed, 2 Mar 2022 21:34:43 GMT
- Title: SPICEprop: Backpropagating Errors Through Memristive Spiking Neural
Networks
- Authors: Peng Zhou, Jason K. Eshraghian, Dong-Uk Choi, Sung-Mo Kang
- Abstract summary: We present a fully memristive spiking neural network (MSNN) consisting of novel memristive neurons trained using the backpropagation through time (BPTT) learning rule.
Gradient descent is applied directly to the memristive integrate-and-fire (MIF) neuron designed using analog SPICE circuit models.
We achieve 97.58% accuracy on the MNIST testing dataset and 75.26% on the Fashion-MNIST testing dataset, the highest accuracies among all fully MSNNs.
- Score: 2.8971214387667494
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: We present a fully memristive spiking neural network (MSNN) consisting of
novel memristive neurons trained using the backpropagation through time (BPTT)
learning rule. Gradient descent is applied directly to the memristive
integrate-and-fire (MIF) neuron designed using analog SPICE circuit models,
which generates distinct depolarization, hyperpolarization, and repolarization
voltage waveforms. Synaptic weights are trained by BPTT using the membrane
potential of the MIF neuron model and can be processed on memristive crossbars.
The natural spiking dynamics of the MIF neuron model are fully differentiable,
eliminating the need for gradient approximations that are prevalent in the
spiking neural network literature. Despite the added complexity of training
directly on SPICE circuit models, we achieve 97.58% accuracy on the MNIST
testing dataset and 75.26% on the Fashion-MNIST testing dataset, the highest
accuracies among all fully MSNNs.
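Since the paper's central claim is that the MIF dynamics are differentiable end-to-end, a toy sketch may help make BPTT through a circuit model concrete. Below, an Euler-discretized two-branch membrane ODE stands in for the SPICE-level MIF neuron, and autograd's unrolling of the time loop yields BPTT on the membrane potential with no surrogate gradient. All sizes, conductances, and reversal voltages are illustrative assumptions, not the paper's circuit values.

```python
# Minimal BPTT sketch, NOT the authors' SPICE netlist: a discretized
# membrane ODE with two conductance branches standing in for the MIF
# neuron's memristors. All names and values are illustrative assumptions.
import torch

T, B, N_IN, N_OUT = 50, 16, 784, 10        # time steps, batch, layer sizes
dt, C = 1e-3, 1e-3                         # step size, membrane capacitance
g1, g2 = 0.5, 0.5                          # stand-in memristor conductances
E1, E2 = 0.0, -0.2                         # rest / reset reversal voltages

W = (0.01 * torch.randn(N_IN, N_OUT)).requires_grad_()  # crossbar weights
x = torch.rand(T, B, N_IN)                 # toy input trains
y = torch.randint(0, N_OUT, (B,))          # toy labels

v = torch.zeros(B, N_OUT)                  # membrane potentials
v_sum = torch.zeros(B, N_OUT)
for t in range(T):
    i_syn = x[t] @ W                       # synaptic current (crossbar MAC)
    # Euler step of C dv/dt = g1*(E1 - v) + g2*(E2 - v) + i_syn
    v = v + (dt / C) * (g1 * (E1 - v) + g2 * (E2 - v) + i_syn)
    v_sum = v_sum + v

# Loss on the membrane potential itself, as in the abstract, so the whole
# computation stays differentiable and backward() performs BPTT.
loss = torch.nn.functional.cross_entropy(v_sum / T, y)
loss.backward()
print(W.grad.abs().mean())                 # gradients reached the weights
```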
Related papers
- MC-QDSNN: Quantized Deep evolutionary SNN with Multi-Dendritic Compartment Neurons for Stress Detection using Physiological Signals [1.474723404975345]
This work proposes Multi-Compartment Leaky (MCLeaky) neuron as a viable alternative for efficient processing of time series data.
The proposed MCLeaky neuron based Spiking Neural Network model and its quantized variant were benchmarked against state-of-the-art (SOTA) Spiking LSTMs.
Results show that networks with the MCLeaky activation neuron achieved a superior accuracy of 98.8% in detecting stress.
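The multi-compartment idea can be illustrated with a toy stand-in: several leaky integrators with distinct decay constants share one input and are summed at the soma. A hedged sketch assuming simple per-compartment exponential leaks; the actual MCLeaky neuron and its quantized variant may differ.

```python
# Hedged sketch of a multi-compartment leaky unit; betas and the summing
# readout are assumptions, not the paper's exact parameterization.
import torch

class MultiCompartmentLeaky(torch.nn.Module):
    def __init__(self, betas=(0.9, 0.95, 0.99)):
        super().__init__()
        self.register_buffer("betas", torch.tensor(betas))  # per-compartment leak

    def forward(self, x):                      # x: (T, batch, n_features)
        T, B, N = x.shape
        mem = torch.zeros(B, len(self.betas), N)
        out = []
        for t in range(T):
            # each dendritic compartment integrates the input with its own leak
            mem = self.betas[None, :, None] * mem + x[t][:, None, :]
            out.append(mem.sum(dim=1))         # compartments summed at the soma
        return torch.stack(out)                # (T, batch, n_features)

y = MultiCompartmentLeaky()(torch.rand(20, 4, 3))   # toy usage
```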
arXiv Detail & Related papers (2024-10-07T12:48:03Z)
- Evolutionary algorithms as an alternative to backpropagation for supervised training of Biophysical Neural Networks and Neural ODEs [12.357635939839696]
We investigate the use of "gradient-estimating" evolutionary algorithms for training biophysically based neural networks.
We find that EAs have several advantages that make them preferable to direct BP.
Our findings suggest that biophysical neurons could provide useful benchmarks for testing the limits of BP methods.
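As a point of reference for what a "gradient-estimating" evolutionary algorithm looks like, here is a generic antithetic evolution-strategies estimator; the authors' exact EA variants may differ, and the toy objective is only a placeholder for a biophysical simulator.

```python
# Generic evolution-strategies gradient estimator (antithetic sampling);
# an illustration of the technique, not the paper's specific method.
import numpy as np

def es_gradient(f, theta, sigma=0.1, n_samples=64, rng=None):
    """Estimate the gradient of E[f(theta + sigma*eps)] w.r.t. theta."""
    rng = rng or np.random.default_rng(0)
    grad = np.zeros_like(theta)
    for _ in range(n_samples):
        eps = rng.standard_normal(theta.shape)
        # antithetic pair halves the variance of the estimator
        grad += (f(theta + sigma * eps) - f(theta - sigma * eps)) * eps
    return grad / (2 * sigma * n_samples)

# Usage on a toy quadratic; a biophysical network simulator would replace f.
f = lambda th: -np.sum(th ** 2)
theta = np.ones(5)
for _ in range(200):
    theta += 0.1 * es_gradient(f, theta)   # ascend the estimated gradient
print(theta)                               # converges toward the maximizer 0
```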
arXiv Detail & Related papers (2023-11-17T20:59:57Z)
- WaLiN-GUI: a graphical and auditory tool for neuron-based encoding [73.88751967207419]
Neuromorphic computing relies on spike-based, energy-efficient communication.
We develop a tool to identify suitable configurations for neuron-based encoding of sample-based data into spike trains.
The WaLiN-GUI is provided open source and with documentation.
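For intuition about neuron-based encoding of sample-based data, here is one common configuration such a tool can explore: each feature value is injected as a constant current into a leaky integrate-and-fire neuron, and the resulting spike train is the code. The time constant, threshold, and gain below are arbitrary assumptions.

```python
# Hedged illustration of current-injection (rate) encoding into spike trains.
import numpy as np

def lif_encode(values, T=100, beta=0.95, threshold=1.0, gain=0.1):
    """Encode scalar features into (T, n_features) spike trains."""
    v = np.zeros_like(values, dtype=float)   # membrane potentials
    spikes = np.zeros((T, values.size))
    for t in range(T):
        v = beta * v + gain * values         # leak plus constant input current
        fired = v >= threshold
        spikes[t] = fired
        v[fired] = 0.0                       # reset fired neurons
    return spikes

# larger inputs yield higher spike counts (a rate code)
print(lif_encode(np.array([0.2, 0.5, 1.0])).sum(axis=0))
```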
arXiv Detail & Related papers (2023-10-25T20:34:08Z)
- Speed Limits for Deep Learning [67.69149326107103]
Recent advances in thermodynamics allow bounding the speed at which one can go from the initial weight distribution to the final distribution of the fully trained network.
We provide analytical expressions for these speed limits for linear and linearizable neural networks.
Remarkably, given some plausible scaling assumptions on the NTK spectra and the spectral decomposition of the labels, learning is optimal in a scaling sense.
arXiv Detail & Related papers (2023-07-27T06:59:46Z)
- The Expressive Leaky Memory Neuron: an Efficient and Expressive Phenomenological Neuron Model Can Solve Long-Horizon Tasks [64.08042492426992]
We introduce the Expressive Leaky Memory (ELM) neuron model, a biologically inspired model of a cortical neuron.
Our ELM neuron can accurately match the input-output relationship of a detailed cortical neuron model with under ten thousand trainable parameters.
We evaluate it on various tasks with demanding temporal structures, including the Long Range Arena (LRA) datasets.
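A rough sketch of the general "leaky memory plus nonlinear mixing" idea behind such phenomenological models, assuming learnable per-unit decays and an MLP state update; the ELM neuron's actual parameterization differs in detail.

```python
# Hedged sketch of a leaky-memory unit; sizes, decays, and the MLP update
# are illustrative assumptions, not the paper's model.
import torch

class LeakyMemoryUnit(torch.nn.Module):
    def __init__(self, n_in, n_mem=16, n_out=8):
        super().__init__()
        self.log_tau = torch.nn.Parameter(torch.linspace(0.0, 4.0, n_mem))
        self.mlp = torch.nn.Sequential(
            torch.nn.Linear(n_in + n_mem, 32), torch.nn.Tanh(),
            torch.nn.Linear(32, n_mem))
        self.readout = torch.nn.Linear(n_mem, n_out)

    def forward(self, x):                    # x: (T, batch, n_in)
        T, B, _ = x.shape
        m = torch.zeros(B, self.log_tau.numel())
        outs = []
        for t in range(T):
            decay = torch.exp(-1.0 / torch.exp(self.log_tau))  # per-unit leak
            m = decay * m + self.mlp(torch.cat([x[t], m], dim=-1))
            outs.append(self.readout(m))
        return torch.stack(outs)

out = LeakyMemoryUnit(n_in=4)(torch.rand(50, 2, 4))   # toy usage
```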
arXiv Detail & Related papers (2023-06-14T13:34:13Z)
- Continuous time recurrent neural networks: overview and application to forecasting blood glucose in the intensive care unit [56.801856519460465]
Continuous time autoregressive recurrent neural networks (CTRNNs) are deep learning models that account for irregular observations.
We demonstrate the application of these models to probabilistic forecasting of blood glucose in a critical care setting.
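One standard way continuous-time recurrent models accommodate irregular observations is to evolve the hidden state between samples, e.g. by a learned exponential decay over the elapsed gap. The cell below is a hedged sketch of that mechanism, not the specific CTRNN used in the paper.

```python
# Hedged sketch: hidden state decays over the irregular gap dt, then a
# standard GRU cell absorbs the new observation. Names are illustrative.
import torch

class DecayRNNCell(torch.nn.Module):
    def __init__(self, n_in, n_hidden):
        super().__init__()
        self.log_tau = torch.nn.Parameter(torch.zeros(n_hidden))  # time constants
        self.cell = torch.nn.GRUCell(n_in, n_hidden)

    def forward(self, x, h, dt):
        # exponential decay of the state over the elapsed (irregular) gap dt
        h = h * torch.exp(-dt[:, None] / torch.exp(self.log_tau))
        return self.cell(x, h)

cell = DecayRNNCell(n_in=4, n_hidden=8)
h = torch.zeros(2, 8)
for gap in (0.5, 3.0, 0.1):                 # irregular inter-observation gaps
    h = cell(torch.randn(2, 4), h, torch.full((2,), gap))
```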
arXiv Detail & Related papers (2023-04-14T09:39:06Z)
- Gradient-based Neuromorphic Learning on Dynamical RRAM Arrays [3.5969667977870796]
We present MEMprop, the adoption of gradient-based learning to train fully memristive spiking neural networks (MSNNs).
Our approach harnesses intrinsic device dynamics to trigger naturally arising voltage spikes.
We obtain highly competitive accuracy amongst previously reported lightweight dense fully MSNNs on several benchmarks.
arXiv Detail & Related papers (2022-06-26T23:13:34Z)
- Voltage-Dependent Synaptic Plasticity (VDSP): Unsupervised probabilistic Hebbian plasticity rule based on neurons membrane potential [5.316910132506153]
We propose a brain-inspired unsupervised local learning rule for the online implementation of Hebb's plasticity mechanism on neuromorphic hardware.
The proposed VDSP learning rule updates the synaptic conductance on the spike of the postsynaptic neuron only.
We report 85.01 $\pm$ 0.76% (Mean $\pm$ S.D.) accuracy for a network of 100 output neurons on the MNIST dataset.
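A hedged sketch of a VDSP-style update: weights change only at postsynaptic spikes, with the sign and magnitude read from the presynaptic membrane potential, which implicitly carries spike-timing information. The linear form and constants below are illustrative assumptions.

```python
# Hedged sketch of a VDSP-like local rule; the real rule's functional form
# and device constraints may differ.
import numpy as np

def vdsp_update(w, v_pre, post_spiked, v_rest=0.0, lr=0.01):
    """w: (n_pre, n_post); v_pre: (n_pre,); post_spiked: bool (n_post,)."""
    # presynaptic depolarization -> potentiate; hyperpolarization -> depress
    dw = lr * np.outer(v_pre - v_rest, post_spiked.astype(float))
    return np.clip(w + dw, 0.0, 1.0)   # conductances stay in a device range

w = np.full((3, 2), 0.5)
w = vdsp_update(w, v_pre=np.array([0.8, -0.4, 0.0]),
                post_spiked=np.array([True, False]))
print(w)
```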
arXiv Detail & Related papers (2022-03-21T14:39:02Z)
- Enhanced physics-constrained deep neural networks for modeling vanadium redox flow battery [62.997667081978825]
We propose an enhanced version of the physics-constrained deep neural network (PCDNN) approach to provide high-accuracy voltage predictions.
The ePCDNN can accurately capture the voltage response throughout the charge--discharge cycle, including the tail region of the voltage discharge curve.
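The physics-constrained ingredient can be sketched generically: the training loss adds a physics-residual term, computed via automatic differentiation, to the usual data term. The ODE below is a placeholder, not the vanadium redox flow battery model.

```python
# Generic physics-constrained loss sketch (PINN-style); the network size,
# placeholder physics, and weighting lam are illustrative assumptions.
import torch

net = torch.nn.Sequential(torch.nn.Linear(1, 32), torch.nn.Tanh(),
                          torch.nn.Linear(32, 1))

def pcdnn_loss(t_data, v_data, t_phys, lam=1.0):
    data_loss = torch.mean((net(t_data) - v_data) ** 2)   # fit measured voltages
    t = t_phys.clone().requires_grad_()
    v = net(t)
    dv_dt = torch.autograd.grad(v.sum(), t, create_graph=True)[0]
    residual = dv_dt + v               # placeholder physics: dv/dt + v = 0
    return data_loss + lam * torch.mean(residual ** 2)

t_d = torch.rand(64, 1)
loss = pcdnn_loss(t_d, torch.exp(-t_d), torch.rand(256, 1))  # toy usage
loss.backward()
```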
arXiv Detail & Related papers (2022-03-03T19:56:24Z)
- A Fully Memristive Spiking Neural Network with Unsupervised Learning [2.8971214387667494]
The system is fully memristive in that both neuronal and synaptic dynamics can be realized by using memristors.
The proposed MSNN implements STDP learning by using cumulative weight changes in memristive synapses from the voltage waveform changes across the synapses.
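For reference, the pair-based STDP rule that such memristive synapses realize through their voltage waveforms can be written in a few lines; the amplitudes and time constant here are illustrative, not the device values.

```python
# Standard pair-based STDP, as an illustration of the rule being implemented.
import numpy as np

def stdp_dw(t_pre, t_post, a_plus=0.05, a_minus=0.055, tau=20.0):
    """Weight change for one pre/post spike pair (times in ms)."""
    dt = t_post - t_pre
    if dt >= 0:                              # pre fires before post: potentiate
        return a_plus * np.exp(-dt / tau)
    return -a_minus * np.exp(dt / tau)       # post fires before pre: depress

print(stdp_dw(10.0, 15.0), stdp_dw(15.0, 10.0))   # +LTP, -LTD
```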
arXiv Detail & Related papers (2022-03-02T21:16:46Z)
- Flexible Transmitter Network [84.90891046882213]
Current neural networks are mostly built upon the MP model, which usually formulates the neuron as executing an activation function on the real-valued weighted aggregation of signals received from other neurons.
We propose the Flexible Transmitter (FT) model, a novel bio-plausible neuron model with flexible synaptic plasticity.
We present the Flexible Transmitter Network (FTNet), which is built on the most common fully-connected feed-forward architecture.
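To make the contrast concrete, the sketch below implements the MP neuron exactly as described above, alongside a hypothetical two-variable synapse that only gestures at the FT idea of richer, stateful transmission; the FT model's actual formulation is not reproduced here.

```python
# MP neuron vs. a hypothetical stateful-synapse variant; the FT-like
# function is an illustrative assumption, not the paper's model.
import numpy as np

def mp_neuron(x, w, b):
    """Classical MP neuron: activation on a real-valued weighted sum."""
    return np.tanh(w @ x + b)

def ft_like_neuron(x, w, s, b, decay=0.9):
    """Hypothetical: each synapse also carries a slowly updated state s."""
    s = decay * s + (1.0 - decay) * x      # transmitter state integrates input
    return np.tanh(w @ (x * s) + b), s

x, w, s, b = np.ones(4), np.full(4, 0.25), np.zeros(4), 0.0
print(mp_neuron(x, w, b), ft_like_neuron(x, w, s, b)[0])
```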
arXiv Detail & Related papers (2020-04-08T06:55:12Z)