A Fully Memristive Spiking Neural Network with Unsupervised Learning
- URL: http://arxiv.org/abs/2203.01416v1
- Date: Wed, 2 Mar 2022 21:16:46 GMT
- Title: A Fully Memristive Spiking Neural Network with Unsupervised Learning
- Authors: Peng Zhou, Dong-Uk Choi, Jason K. Eshraghian, Sung-Mo Kang
- Abstract summary: The system is fully memristive in that both neuronal and synaptic dynamics can be realized by using memristors.
The proposed MSNN implements STDP learning by using cumulative weight changes in memristive synapses from the voltage waveform changes across the synapses.
- Score: 2.8971214387667494
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: We present a fully memristive spiking neural network (MSNN) consisting of
physically-realizable memristive neurons and memristive synapses to implement
an unsupervised spike-timing-dependent plasticity (STDP) learning rule. The
system is fully memristive in that both neuronal and synaptic dynamics can be
realized by using memristors. The neuron is implemented using the SPICE-level
memristive integrate-and-fire (MIF) model, which consists of a minimal number
of circuit elements necessary to achieve distinct depolarization,
hyperpolarization, and repolarization voltage waveforms. The proposed MSNN
uniquely implements STDP learning through cumulative weight changes in the
memristive synapses, driven by changes in the voltage waveforms across the
synapses that arise from the presynaptic and postsynaptic spiking voltage
signals during training. Two types of MSNN architectures are investigated:
1) a biologically plausible memory retrieval system, and 2) a multi-class
classification system. Our circuit simulation results verify the MSNN's
unsupervised learning efficacy by replicating biological memory retrieval
mechanisms, and achieving 97.5% accuracy in a 4-pattern recognition problem in
a large-scale discriminative MSNN.
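The mechanism described in the abstract can be illustrated behaviorally. The block below is a minimal software sketch, assuming a leaky integrate-and-fire neuron and a standard pair-based STDP rule with illustrative parameters; the paper's actual MIF neuron and synapses are SPICE-level memristor device models, and none of the names or constants below come from the paper.

```python
import math

# All parameters are illustrative -- the paper's MIF neuron and memristive
# synapses are defined by SPICE-level device models, not by these equations.
V_REST, V_THRESH, V_RESET = -65.0, -50.0, -70.0   # mV
TAU_M = 20.0                                       # ms, membrane time constant
A_PLUS, A_MINUS = 0.01, 0.012                      # STDP magnitudes
TAU_PLUS = TAU_MINUS = 20.0                        # ms, STDP time constants

def lif_step(v, i_syn, dt=1.0):
    """One Euler step of a leaky integrate-and-fire neuron.

    Returns (new_voltage, spiked). On threshold crossing the membrane is
    reset below rest -- a crude stand-in for the MIF model's distinct
    depolarization / hyperpolarization / repolarization phases.
    """
    v += dt * ((V_REST - v) + i_syn) / TAU_M
    if v >= V_THRESH:
        return V_RESET, True
    return v, False

def stdp_dw(t_pre, t_post):
    """Pair-based STDP weight change for one pre/post spike-time pair (ms)."""
    dt = t_post - t_pre
    if dt >= 0:   # post fires after pre: potentiate
        return A_PLUS * math.exp(-dt / TAU_PLUS)
    return -A_MINUS * math.exp(dt / TAU_MINUS)   # post before pre: depress

# Weight changes accumulate over training, mirroring the cumulative
# conductance updates in the memristive synapses.
w = 0.5
for t_pre, t_post in [(10.0, 12.0), (30.0, 28.0), (50.0, 55.0)]:
    w += stdp_dw(t_pre, t_post)
```

Causal pairs (post after pre) strengthen the synapse and anti-causal pairs weaken it; in the MSNN, this sign and magnitude are encoded in the voltage waveform across each memristive synapse rather than computed explicitly.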
Related papers
- Contrastive Learning in Memristor-based Neuromorphic Systems [55.11642177631929]
Spiking neural networks have become an important family of neuron-based models that sidestep many of the key limitations facing modern-day backpropagation-trained deep networks.
In this work, we design and investigate a proof-of-concept instantiation of contrastive-signal-dependent plasticity (CSDP), a neuromorphic form of forward-forward-based, backpropagation-free learning.
arXiv Detail & Related papers (2024-09-17T04:48:45Z) - Single Neuromorphic Memristor closely Emulates Multiple Synaptic Mechanisms for Energy Efficient Neural Networks [71.79257685917058]
We demonstrate memristive nano-devices based on SrTiO3 that inherently emulate multiple key synaptic functions.
These memristors operate in a non-filamentary, low conductance regime, which enables stable and energy efficient operation.
arXiv Detail & Related papers (2024-02-26T15:01:54Z) - Learning with Chemical versus Electrical Synapses -- Does it Make a
Difference? [61.85704286298537]
Bio-inspired neural networks have the potential to advance our understanding of neural computation and improve the state-of-the-art of AI systems.
We conduct experiments with autonomous lane-keeping through a photorealistic autonomous driving simulator to evaluate their performance under diverse conditions.
arXiv Detail & Related papers (2023-11-21T13:07:20Z) - Neuromorphic Hebbian learning with magnetic tunnel junction synapses [41.92764939721262]
We propose and experimentally demonstrate neuromorphic networks that provide high-accuracy inference thanks to the binary resistance states of magnetic tunnel junctions (MTJs).
We performed the first demonstration of a neuromorphic network directly implemented with MTJ synapses, for both inference and spike-timing-dependent plasticity learning.
We also demonstrated through simulation that the proposed system for unsupervised Hebbian learning with STT-MTJ synapses can achieve competitive accuracies for MNIST handwritten digit recognition.
arXiv Detail & Related papers (2023-08-21T19:58:44Z) - The Expressive Leaky Memory Neuron: an Efficient and Expressive Phenomenological Neuron Model Can Solve Long-Horizon Tasks [64.08042492426992]
We introduce the Expressive Leaky Memory (ELM) neuron model, a biologically inspired model of a cortical neuron.
Our ELM neuron can accurately match the aforementioned input-output relationship with under ten thousand trainable parameters.
We evaluate it on various tasks with demanding temporal structures, including the Long Range Arena (LRA) datasets.
arXiv Detail & Related papers (2023-06-14T13:34:13Z) - Gradient-based Neuromorphic Learning on Dynamical RRAM Arrays [3.5969667977870796]
We present MEMprop, the adoption of gradient-based learning to train fully memristive spiking neural networks (MSNNs).
Our approach harnesses intrinsic device dynamics to trigger naturally arising voltage spikes.
We obtain highly competitive accuracy amongst previously reported lightweight dense fully MSNNs on several benchmarks.
arXiv Detail & Related papers (2022-06-26T23:13:34Z) - Voltage-Dependent Synaptic Plasticity (VDSP): Unsupervised probabilistic Hebbian plasticity rule based on neurons membrane potential [5.316910132506153]
We propose a brain-inspired unsupervised local learning rule for the online implementation of Hebb's plasticity mechanism on neuromorphic hardware.
The proposed VDSP learning rule updates the synaptic conductance only upon a spike of the postsynaptic neuron.
We report 85.01 ± 0.76% (mean ± s.d.) accuracy for a network of 100 output neurons on the MNIST dataset.
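The update rule summarized above can be illustrated with a short behavioral sketch. The function, thresholds, and rates below are hypothetical stand-ins, not the paper's exact VDSP formulation; the sketch only captures the stated structure: the conductance changes solely on a postsynaptic spike, with the sign of the update read from the presynaptic membrane potential rather than from stored spike times.

```python
# Illustrative VDSP-style update (hypothetical parameters, not the paper's).
def vdsp_update(g, v_pre, post_spiked, lr=0.01, v_rest=-65.0, g_max=1.0):
    """Update synaptic conductance g only when the postsynaptic neuron fires.

    If the presynaptic membrane potential v_pre is above rest (the neuron
    was recently depolarized), potentiate; otherwise depress. The update is
    soft-bounded so g stays in [0, g_max].
    """
    if not post_spiked:
        return g                       # no update without a postsynaptic spike
    if v_pre > v_rest:                 # pre recently active -> potentiate
        g += lr * (g_max - g)
    else:                              # pre at or below rest -> depress
        g -= lr * g
    return min(max(g, 0.0), g_max)
```

Reading the presynaptic state from a membrane potential rather than a spike-time trace is what makes such a rule attractive on neuromorphic hardware: no per-synapse timing memory is needed.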
arXiv Detail & Related papers (2022-03-21T14:39:02Z) - SPICEprop: Backpropagating Errors Through Memristive Spiking Neural Networks [2.8971214387667494]
We present a fully memristive spiking neural network (MSNN) consisting of novel memristive neurons trained using the backpropagation through time (BPTT) learning rule.
Gradient descent is applied directly to the memristive integrate-and-fire (MIF) neuron designed using analog SPICE circuit models.
We achieve 97.58% accuracy on the MNIST testing dataset and 75.26% on the Fashion-MNIST testing dataset, the highest accuracies among all fully MSNNs.
arXiv Detail & Related papers (2022-03-02T21:34:43Z) - POPPINS : A Population-Based Digital Spiking Neuromorphic Processor with
Integer Quadratic Integrate-and-Fire Neurons [50.591267188664666]
We propose a population-based digital spiking neuromorphic processor in 180 nm process technology with two hierarchical populations.
The proposed approach enables the development of biomimetic neuromorphic systems and various low-power, low-latency inference processing applications.
arXiv Detail & Related papers (2022-01-19T09:26:34Z) - Flexible Transmitter Network [84.90891046882213]
Current neural networks are mostly built upon the MP (McCulloch-Pitts) model, which usually formulates the neuron as executing an activation function on the real-valued weighted aggregation of signals received from other neurons.
We propose the Flexible Transmitter (FT) model, a novel bio-plausible neuron model with flexible synaptic plasticity.
We present the Flexible Transmitter Network (FTNet), which is built on the most common fully-connected feed-forward architecture.
arXiv Detail & Related papers (2020-04-08T06:55:12Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences of its use.