Neuromorphic Hebbian learning with magnetic tunnel junction synapses
- URL: http://arxiv.org/abs/2308.11011v1
- Date: Mon, 21 Aug 2023 19:58:44 GMT
- Title: Neuromorphic Hebbian learning with magnetic tunnel junction synapses
- Authors: Peng Zhou, Alexander J. Edwards, Frederick B. Mancoff, Sanjeev
Aggarwal, Stephen K. Heinrich-Barna, Joseph S. Friedman
- Abstract summary: We propose and experimentally demonstrate neuromorphic networks that provide high-accuracy inference thanks to the binary resistance states of magnetic tunnel junctions (MTJs).
We performed the first demonstration of a neuromorphic network directly implemented with MTJ synapses for both inference and spike-timing-dependent plasticity learning.
We also demonstrated through simulation that the proposed system for unsupervised Hebbian learning with STT-MTJ synapses can achieve competitive accuracies for MNIST handwritten digit recognition.
- Score: 41.92764939721262
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Neuromorphic computing aims to mimic both the function and structure of
biological neural networks to provide artificial intelligence with extreme
efficiency. Conventional approaches store synaptic weights in non-volatile
memory devices with analog resistance states, permitting in-memory computation
of neural network operations while avoiding the costs associated with
transferring synaptic weights from a memory array. However, the use of analog
resistance states for storing weights in neuromorphic systems is impeded by
stochastic writing, weight drift over time, and limited endurance, all of
which reduce the precision of synapse weights. Here we
propose and experimentally demonstrate neuromorphic networks that provide
high-accuracy inference thanks to the binary resistance states of magnetic
tunnel junctions (MTJs), while leveraging the analog nature of their stochastic
spin-transfer torque (STT) switching for unsupervised Hebbian learning. We
performed the first experimental demonstration of a neuromorphic network
directly implemented with MTJ synapses, for both inference and
spike-timing-dependent plasticity learning. We also demonstrated through
simulation that the proposed system for unsupervised Hebbian learning with
stochastic STT-MTJ synapses can achieve competitive accuracies for MNIST
handwritten digit recognition. By appropriately applying neuromorphic
principles through hardware-aware design, the proposed STT-MTJ neuromorphic
learning networks provide a pathway toward artificial intelligence hardware
that learns autonomously with extreme efficiency.
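To make the learning mechanism concrete, the sketch below illustrates one way the abstract's idea can be read: each synapse stores only a binary MTJ state (parallel or antiparallel), inference is a binary-conductance vector-matrix multiplication, and the analog character comes from the probability of stochastic STT switching, which is shaped by pre/post spike timing to realize an STDP-like Hebbian update. This is a minimal Python sketch under stated assumptions; the parameter values and helper names (stdp_switch_prob, hebbian_update, infer) are illustrative and not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical device parameters (illustrative; not fitted to the paper's MTJs).
G_P, G_AP = 1.0, 0.4   # normalized conductances of parallel / antiparallel states
TAU = 20.0             # assumed STDP time constant (ms)
P0 = 0.3               # assumed peak per-pulse switching probability


def stdp_switch_prob(dt):
    """Probability that a write pulse flips the MTJ, shaped by spike timing.

    STT switching is inherently probabilistic; letting the flip probability
    decay with |t_post - t_pre| yields an STDP-like learning window.
    """
    return P0 * np.exp(-abs(dt) / TAU)


def hebbian_update(state, t_pre, t_post):
    """Stochastically update one binary MTJ synapse (True = parallel state)."""
    dt = t_post - t_pre
    if dt >= 0:  # pre fires before post -> potentiation pulse
        if rng.random() < stdp_switch_prob(dt):
            state = True
    else:        # post fires before pre -> depression pulse
        if rng.random() < stdp_switch_prob(dt):
            state = False
    return state


def infer(states, x):
    """Crossbar inference: binary conductance matrix times input voltages."""
    G = np.where(states, G_P, G_AP)  # map binary states to conductances
    return G @ x                     # output currents, one per output neuron
```

Averaged over many spike pairings, the expected conductance change of such a synapse traces out an STDP-like curve even though every individual weight is single-bit, which is the sense in which stochastic switching supplies an analog learning signal while inference retains the precision and stability of binary states.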
Related papers
- Contrastive Learning in Memristor-based Neuromorphic Systems [55.11642177631929]
Spiking neural networks have become an important family of neuron-based models that sidestep many of the key limitations facing modern-day backpropagation-trained deep networks.
In this work, we design and investigate a proof-of-concept instantiation of contrastive-signal-dependent plasticity (CSDP), a neuromorphic form of forward-forward-based, backpropagation-free learning.
arXiv Detail & Related papers (2024-09-17T04:48:45Z)
- Single Neuromorphic Memristor closely Emulates Multiple Synaptic Mechanisms for Energy Efficient Neural Networks [71.79257685917058]
We demonstrate memristive nano-devices based on SrTiO3 that inherently emulate all these synaptic functions.
These memristors operate in a non-filamentary, low conductance regime, which enables stable and energy efficient operation.
arXiv Detail & Related papers (2024-02-26T15:01:54Z)
- The Expressive Leaky Memory Neuron: an Efficient and Expressive Phenomenological Neuron Model Can Solve Long-Horizon Tasks [64.08042492426992]
We introduce the Expressive Leaky Memory (ELM) neuron model, a biologically inspired model of a cortical neuron.
Our ELM neuron can accurately match a detailed cortical neuron's input-output relationship with fewer than ten thousand trainable parameters.
We evaluate it on various tasks with demanding temporal structures, including the Long Range Arena (LRA) datasets.
arXiv Detail & Related papers (2023-06-14T13:34:13Z)
- Gradient-based Neuromorphic Learning on Dynamical RRAM Arrays [3.5969667977870796]
We present MEMprop, which adopts gradient-based learning to train fully memristive spiking neural networks (MSNNs).
Our approach harnesses intrinsic device dynamics to trigger naturally arising voltage spikes.
We obtain highly competitive accuracy among previously reported lightweight, dense, fully memristive SNNs on several benchmarks.
arXiv Detail & Related papers (2022-06-26T23:13:34Z)
- Synchronous Unsupervised STDP Learning with Stochastic STT-MRAM Switching [3.2894781846488494]
The use of analog resistance states for storing weights in neuromorphic systems is impeded by fabrication imprecision and device stochasticity.
This paper proposes a synchronous spiking neural network with clocked circuits that performs unsupervised learning.
The proposed system enables a single-layer network to achieve 90% accuracy on the MNIST dataset.
arXiv Detail & Related papers (2021-12-10T17:59:46Z)
- Experimental Demonstration of Neuromorphic Network with STT MTJ Synapses [58.40902139823252]
We present the first experimental demonstration of a neuromorphic network with magnetic tunnel junction (MTJ) synapses, which performs image recognition via vector-matrix multiplication.
We also simulate a large MTJ network performing MNIST handwritten digit recognition, demonstrating that MTJ crossbars can match memristor accuracy while providing increased precision, stability, and endurance.
arXiv Detail & Related papers (2021-12-09T08:11:47Z)
- Shape-Dependent Multi-Weight Magnetic Artificial Synapses for Neuromorphic Computing [4.567086462167893]
In neuromorphic computing, artificial synapses provide a multi-weight conductance state that is set based on inputs from neurons, analogous to the brain.
Here, we measure artificial synapses based on magnetic materials that use a magnetic tunnel junction and a magnetic domain wall.
arXiv Detail & Related papers (2021-11-22T20:27:14Z)
- Spatiotemporal Spike-Pattern Selectivity in Single Mixed-Signal Neurons with Balanced Synapses [0.27998963147546135]
Mixed-signal neuromorphic processors could be used for inference and learning.
We show how inhomogeneous synaptic circuits could be utilized for resource-efficient implementation of network layers.
arXiv Detail & Related papers (2021-06-10T12:04:03Z)
- Flexible Transmitter Network [84.90891046882213]
Current neural networks are mostly built upon the McCulloch-Pitts (MP) model, which usually formulates the neuron as executing an activation function on the real-valued weighted aggregation of signals received from other neurons.
We propose the Flexible Transmitter (FT) model, a novel bio-plausible neuron model with flexible synaptic plasticity.
We present the Flexible Transmitter Network (FTNet), which is built on the most common fully-connected feed-forward architecture.
arXiv Detail & Related papers (2020-04-08T06:55:12Z)
- Structural plasticity on an accelerated analog neuromorphic hardware system [0.46180371154032884]
We present a strategy to achieve structural plasticity by constantly rewiring the pre- and postsynaptic partners.
We implemented this algorithm on the analog neuromorphic system BrainScaleS-2.
We evaluated our implementation in a simple supervised learning scenario, showing its ability to optimize the network topology.
arXiv Detail & Related papers (2019-12-27T10:15:58Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
The site does not guarantee the quality of this information and is not responsible for any consequences of its use.