Shape-Dependent Multi-Weight Magnetic Artificial Synapses for
Neuromorphic Computing
- URL: http://arxiv.org/abs/2111.11516v2
- Date: Thu, 17 Feb 2022 20:38:07 GMT
- Title: Shape-Dependent Multi-Weight Magnetic Artificial Synapses for
Neuromorphic Computing
- Authors: Thomas Leonard, Samuel Liu, Mahshid Alamdar, Can Cui, Otitoaleke G.
Akinola, Lin Xue, T. Patrick Xiao, Joseph S. Friedman, Matthew J. Marinella,
Christopher H. Bennett and Jean Anne C. Incorvia
- Abstract summary: In neuromorphic computing, artificial synapses provide a multi-weight conductance state that is set based on inputs from neurons, analogous to the brain.
Here, we measure artificial synapses based on magnetic materials that use a magnetic tunnel junction and a magnetic domain wall.
- Score: 4.567086462167893
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: In neuromorphic computing, artificial synapses provide a multi-weight
conductance state that is set based on inputs from neurons, analogous to the
brain. Depending on the application, properties of the synapse beyond multiple
weights can be needed, which requires generating different synapse behaviors
from the same materials. Here, we measure
artificial synapses based on magnetic materials that use a magnetic tunnel
junction and a magnetic domain wall. By fabricating lithographic notches in a
domain wall track underneath a single magnetic tunnel junction, we achieve 4-5
stable resistance states that can be repeatably controlled electrically using
spin-orbit torque. We analyze the effect of geometry on the synapse behavior,
showing that a trapezoidal device has asymmetric weight updates with high
controllability, while a straight device has higher stochasticity, but with
stable resistance levels. The device data is input into neuromorphic computing
simulators to show the usefulness of application-specific synaptic functions.
Implementing an artificial neural network applied to streamed Fashion-MNIST
data, we show that the trapezoidal magnetic synapse can be used as a
metaplastic function for efficient online learning. Implementing a
convolutional neural network for CIFAR-100 image recognition, we show that the
straight magnetic synapse achieves near-ideal inference accuracy, due to the
stability of its resistance levels. This work shows multi-weight magnetic
synapses are a feasible technology for neuromorphic computing and provides
design guidelines for emerging artificial synapse technologies.
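To make the measured behavior concrete, here is a minimal sketch, assuming a simple abstraction of the device: the synapse is modeled as a small set of discrete conductance states (the abstract reports 4-5), with a trapezoidal track giving asymmetric but well-controlled updates and a straight track giving more stochastic updates with stable levels. All probabilities and the state-to-conductance mapping are illustrative assumptions, not values from the paper.

    import random

    class DomainWallSynapse:
        """Toy multi-weight synapse: a few discrete conductance states."""

        def __init__(self, n_states=5, geometry="straight"):
            self.n_states = n_states
            self.state = n_states // 2
            if geometry == "trapezoidal":
                # Asymmetric, well-controlled updates (illustrative numbers):
                # potentiation succeeds more often than depression.
                self.p_up, self.p_down = 0.95, 0.55
            else:
                # Straight track: symmetric but more stochastic updates.
                self.p_up, self.p_down = 0.75, 0.75

        def pulse(self, direction):
            """One spin-orbit-torque programming pulse, direction = +1 or -1."""
            p = self.p_up if direction > 0 else self.p_down
            if random.random() < p:
                self.state = min(max(self.state + direction, 0), self.n_states - 1)
            return self.state

        def conductance(self):
            """Normalized conductance of the current resistance state."""
            return self.state / (self.n_states - 1)

    syn = DomainWallSynapse(geometry="trapezoidal")
    for _ in range(3):
        syn.pulse(+1)
    print(syn.state, syn.conductance())

Per the abstract, the asymmetric trapezoidal updates are what the simulations use as a metaplastic function for online learning, while the stable straight-track levels favor inference accuracy.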
Related papers
- Single Neuromorphic Memristor closely Emulates Multiple Synaptic
Mechanisms for Energy Efficient Neural Networks [71.79257685917058]
We demonstrate memristive nano-devices based on SrTiO3 that inherently emulate multiple synaptic mechanisms.
These memristors operate in a non-filamentary, low conductance regime, which enables stable and energy efficient operation.
arXiv Detail & Related papers (2024-02-26T15:01:54Z)
- Learning with Chemical versus Electrical Synapses -- Does it Make a Difference? [61.85704286298537]
Bio-inspired neural networks have the potential to advance our understanding of neural computation and improve the state-of-the-art of AI systems.
We conduct experiments with autonomous lane-keeping through a photorealistic autonomous driving simulator to evaluate the performance of networks with chemical versus electrical synapses under diverse conditions.
arXiv Detail & Related papers (2023-11-21T13:07:20Z)
- Neuromorphic Hebbian learning with magnetic tunnel junction synapses [41.92764939721262]
We propose and experimentally demonstrate neuromorphic networks that provide high-accuracy inference thanks to the binary resistance states of magnetic tunnel junctions (MTJs).
We performed the first demonstration of a neuromorphic network directly implemented with MTJ synapses, for both inference and spike-timing-dependent plasticity learning.
We also demonstrated through simulation that the proposed system for unsupervised Hebbian learning with STT-MTJ synapses can achieve competitive accuracies for MNIST handwritten digit recognition.
arXiv Detail & Related papers (2023-08-21T19:58:44Z)
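A minimal sketch of the kind of rule the entry above describes, assuming a simplified pair-based STDP on binary MTJ synapses; the time window and switching probability are hypothetical, not taken from the paper.

    import random

    TAU = 20.0       # plasticity time window, arbitrary units (assumed)
    P_SWITCH = 0.8   # probability a pulse actually switches the MTJ (assumed)

    def stdp_update(weight, t_pre, t_post):
        """Return the new binary weight (0 or 1) after one spike pair."""
        dt = t_post - t_pre
        if abs(dt) > TAU:
            return weight            # outside the window: no change
        if dt > 0 and random.random() < P_SWITCH:
            return 1                 # pre before post: potentiate (low-R state)
        if dt < 0 and random.random() < P_SWITCH:
            return 0                 # post before pre: depress (high-R state)
        return weight

    w = stdp_update(0, t_pre=10.0, t_post=15.0)
    print("weight after a causal spike pair:", w)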
- The Expressive Leaky Memory Neuron: an Efficient and Expressive Phenomenological Neuron Model Can Solve Long-Horizon Tasks [64.08042492426992]
We introduce the Expressive Leaky Memory (ELM) neuron model, a biologically inspired model of a cortical neuron.
Our ELM neuron can accurately match the input-output relationship of a detailed cortical neuron model with under ten thousand trainable parameters.
We evaluate it on various tasks with demanding temporal structures, including the Long Range Arena (LRA) datasets.
arXiv Detail & Related papers (2023-06-14T13:34:13Z)
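As a rough illustration of the ELM idea above (my reading of the general flavor, not the authors' code): a neuron that maintains leaky memory states with per-unit decay and feeds them through a small nonlinear readout. Sizes, decay ranges, and the readout are assumptions.

    import numpy as np

    rng = np.random.default_rng(0)

    class LeakyMemoryNeuron:
        def __init__(self, n_in=8, n_mem=16):
            self.decay = rng.uniform(0.8, 0.99, n_mem)   # per-unit leak (assumed)
            self.w_in = rng.normal(0, 0.1, (n_mem, n_in))
            self.w_out = rng.normal(0, 0.1, n_mem)
            self.m = np.zeros(n_mem)

        def step(self, x):
            # Leaky integration of the input, then a nonlinear readout.
            self.m = self.decay * self.m + (1 - self.decay) * (self.w_in @ x)
            return np.tanh(self.w_out @ np.tanh(self.m))

    neuron = LeakyMemoryNeuron()
    y = 0.0
    for _ in range(5):
        y = neuron.step(rng.normal(size=8))
    print("output after 5 steps:", y)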
- Machine learning using magnetic stochastic synapses [0.9236074230806579]
We present a methodology for exploiting the traditionally detrimental stochastic effects in magnetic domain-wall motion in nanowires.
We demonstrate functional binary synapses alongside a gradient learning rule that allows their training with applicability to a range of systems.
For single measurements, the rule results in binary synapses with minimal stochasticity, sacrificing potential performance for robustness.
This observation allows us to choose design principles depending on the desired performance and the device's operational speed and energy cost.
arXiv Detail & Related papers (2023-03-03T12:33:29Z)
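A minimal sketch of one way to train stochastic binary synapses with gradients, in the spirit of the entry above (the paper's actual rule may differ): each synapse keeps a latent parameter whose sigmoid is its switching probability, and the gradient is taken through the expected output rather than the discrete sample.

    import numpy as np

    rng = np.random.default_rng(1)

    def sigmoid(z):
        return 1.0 / (1.0 + np.exp(-z))

    # Toy task: recover hidden binary weights of a single linear neuron.
    n = 16
    theta = np.zeros(n)                        # latent parameter per synapse
    w_target = rng.integers(0, 2, n).astype(float)
    lr = 2.0

    for _ in range(500):
        x = rng.normal(size=n)
        p = sigmoid(theta)                     # switching probabilities
        err = p @ x - w_target @ x             # error of the *expected* output
        theta -= lr * err * x * p * (1 - p)    # gradient of 0.5*err**2 w.r.t. theta

    w_sampled = (rng.random(n) < sigmoid(theta)).astype(float)
    print("recovered", int((w_sampled == w_target).sum()), "of", n, "weights")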
- Mapping and Validating a Point Neuron Model on Intel's Neuromorphic Hardware Loihi [77.34726150561087]
We investigate the potential of Intel's fifth generation neuromorphic chip, Loihi.
Loihi is based on the novel idea of Spiking Neural Networks (SNNs), which emulate the neurons in the brain.
We find that Loihi replicates classical simulations very efficiently and scales notably well in terms of both time and energy performance as the networks get larger.
arXiv Detail & Related papers (2021-09-22T16:52:51Z)
- Neural Sampling Machine with Stochastic Synapse allows Brain-like Learning and Inference [6.138129592577736]
We introduce a new class of neural network called the Neural Sampling Machine (NSM) that exploits stochasticity in synaptic connections for approximate Bayesian inference.
We experimentally show that the inherent switching of the selector element between the insulator and metallic states introduces a multiplicative noise within the synapses of the NSM.
We report results on a standard image classification task as well as estimation of data uncertainty on rotated samples.
arXiv Detail & Related papers (2021-02-20T23:45:24Z)
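A minimal sketch of the multiplicative synaptic noise idea above, using a generic Bernoulli mask as a stand-in for the selector's stochastic switching (an analogy, not the paper's device physics); repeated noisy passes give a predictive mean and a spread usable as an uncertainty estimate.

    import numpy as np

    rng = np.random.default_rng(2)

    def noisy_forward(x, W, p_on=0.7):
        """One stochastic pass: each synapse conducts with probability p_on."""
        mask = rng.random(W.shape) < p_on      # selector on/off per synapse
        logits = (W * mask) @ x / p_on         # rescale to keep the expectation
        e = np.exp(logits - logits.max())
        return e / e.sum()                     # softmax class probabilities

    W = rng.normal(0.0, 1.0, (3, 8))           # toy 8-input, 3-class readout
    x = rng.normal(size=8)
    probs = np.stack([noisy_forward(x, W) for _ in range(100)])
    print("predictive mean:", probs.mean(axis=0))
    print("uncertainty (std):", probs.std(axis=0))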
- Controllable reset behavior in domain wall-magnetic tunnel junction artificial neurons for task-adaptable computation [1.4505273244528207]
Domain wall-magnetic tunnel junction (DW-MTJ) devices have been shown to be able to intrinsically capture biological neuron behavior.
We show that edgy-relaxed behavior can be implemented in DW-MTJ artificial neurons via three alternative mechanisms.
arXiv Detail & Related papers (2021-01-08T16:50:29Z)
- Optimal Learning with Excitatory and Inhibitory synapses [91.3755431537592]
I study the problem of storing associations between analog signals in the presence of correlations.
I characterize the typical learning performance in terms of the power spectrum of random input and output processes.
arXiv Detail & Related papers (2020-05-25T18:25:54Z)
- Flexible Transmitter Network [84.90891046882213]
Current neural networks are mostly built upon the MP (McCulloch-Pitts) neuron model, which usually formulates the neuron as executing an activation function on the real-valued weighted aggregation of signals received from other neurons.
We propose the Flexible Transmitter (FT) model, a novel bio-plausible neuron model with flexible synaptic plasticity.
We present the Flexible Transmitter Network (FTNet), which is built on the most common fully-connected feed-forward architecture.
arXiv Detail & Related papers (2020-04-08T06:55:12Z)
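For reference, the MP (McCulloch-Pitts) model mentioned above reduces to an activation applied to a weighted sum; a one-line sketch follows (the tanh activation is an arbitrary choice, and the FT model's flexible transmitter is not captured here).

    import numpy as np

    def mp_neuron(x, w, b):
        """Classic MP neuron: y = f(w . x + b), here with f = tanh."""
        return np.tanh(np.dot(w, x) + b)

    print(mp_neuron(np.array([0.5, -1.0]), np.array([0.8, 0.3]), 0.1))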