Intrinsic Spike Timing Dependent Plasticity in Stochastic Magnetic
Tunnel Junctions Mediated by Heat Dynamics
- URL: http://arxiv.org/abs/2108.12684v1
- Date: Sat, 28 Aug 2021 18:02:01 GMT
- Title: Intrinsic Spike Timing Dependent Plasticity in Stochastic Magnetic
Tunnel Junctions Mediated by Heat Dynamics
- Authors: Humberto Inzunza Velarde, Jheel Nagaria, Zihan Yin, Ajey Jacob,
Akhilesh Jaiswal
- Abstract summary: Neuromorphic computing aims to mimic the behavior of biological neurons and synapses using solid-state devices and circuits.
We propose a method to implement the Spike Timing Dependent Plasticity (STDP) behavior of biological synapses in Magnetic Tunnel Junction (MTJ) devices.
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: The quest for highly efficient cognitive computing has led to extensive
research interest for the field of neuromorphic computing. Neuromorphic
computing aims to mimic the behavior of biological neurons and synapses using
solid-state devices and circuits. Among various approaches, emerging
non-volatile memory technologies are of special interest for mimicking
neuro-synaptic behavior. These devices allow the mapping of the rich dynamics
of biological neurons and synapses onto their intrinsic device physics. In this
letter, we focus on Spike Timing Dependent Plasticity (STDP) behavior of
biological synapses and propose a method to implement the STDP behavior in
Magnetic Tunnel Junction (MTJ) devices. Specifically, we exploit the
time-dependent heat dynamics and the response of an MTJ to the instantaneous
temperature to imitate the STDP behavior. Our simulations, based on a
macro-spin model for magnetization dynamics, show that STDP can be imitated in
stochastic magnetic tunnel junctions by applying simple voltage waveforms as
the spiking response of pre- and post-neurons across an MTJ device.
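To make the proposed mechanism concrete, below is a minimal sketch of how residual Joule heating from a pre-neuron pulse could raise the switching probability of a low-barrier (stochastic) MTJ during a later post-neuron pulse. It assumes an exponential heat decay and an Arrhenius (Neel-Brown) thermal-activation law with purely illustrative parameters; it is not the authors' macro-spin simulation, and every constant and function name below is a hypothetical stand-in.

```python
import numpy as np

# Minimal sketch (not the paper's macro-spin model): the MTJ's effective
# temperature rises by a fixed amount with each voltage pulse (Joule heating)
# and relaxes exponentially back to ambient. Switching during the post-neuron
# pulse follows an Arrhenius-type activation law, so a post pulse arriving
# while residual heat from the pre pulse is still present sees a lower
# effective barrier and a higher switching probability.
# All parameter values are illustrative assumptions.

T_AMBIENT = 300.0      # K, baseline device temperature
DT_PULSE = 40.0        # K, temperature rise caused by the pre-neuron pulse (assumed)
TAU_HEAT = 2e-9        # s, thermal relaxation time constant (assumed)
DELTA_0 = 15.0         # barrier E_b / (k_B * T_AMBIENT) of a low-barrier stochastic MTJ (assumed)
ATTEMPT_RATE = 1e9     # Hz, attempt frequency of thermal activation
PULSE_WIDTH = 5e-9     # s, duration of the post-neuron pulse

def temperature_at_post(dt):
    """Effective device temperature when the post pulse arrives, dt seconds after the pre pulse."""
    return T_AMBIENT + DT_PULSE * np.exp(-dt / TAU_HEAT)

def switching_probability(dt):
    """Probability that the MTJ switches during the post pulse (Arrhenius / Neel-Brown form)."""
    temp = temperature_at_post(dt)
    barrier = DELTA_0 * (T_AMBIENT / temp)   # barrier expressed in units of k_B * temp
    rate = ATTEMPT_RATE * np.exp(-barrier)   # thermally activated switching rate
    return 1.0 - np.exp(-rate * PULSE_WIDTH)

if __name__ == "__main__":
    p_cold = switching_probability(np.inf)   # post pulse with no preceding pre pulse
    for dt_ns in (0.5, 1.0, 2.0, 5.0, 10.0):
        p = switching_probability(dt_ns * 1e-9)
        print(f"pre->post delay {dt_ns:4.1f} ns: P_switch = {p:.2e} ({p / p_cold:.1f}x baseline)")
```

Sweeping the pre-to-post delay in this sketch traces a timing-dependent switching probability that plays the role of one lobe of an STDP window; reproducing the antisymmetric (depression) branch would additionally require polarity-dependent voltage waveforms, which this simplified model omits.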
Related papers
- Contrastive Learning in Memristor-based Neuromorphic Systems [55.11642177631929]
Spiking neural networks have become an important family of neuron-based models that sidestep many of the key limitations facing modern-day backpropagation-trained deep networks.
In this work, we design and investigate a proof-of-concept instantiation of contrastive-signal-dependent plasticity (CSDP), a neuromorphic form of forward-forward-based, backpropagation-free learning.
arXiv Detail & Related papers (2024-09-17T04:48:45Z)
- Single Neuromorphic Memristor closely Emulates Multiple Synaptic Mechanisms for Energy Efficient Neural Networks [71.79257685917058]
We demonstrate memristive nano-devices based on SrTiO3 that inherently emulate all these synaptic functions.
These memristors operate in a non-filamentary, low conductance regime, which enables stable and energy efficient operation.
arXiv Detail & Related papers (2024-02-26T15:01:54Z) - Neuromorphic Hebbian learning with magnetic tunnel junction synapses [41.92764939721262]
We propose and experimentally demonstrate neuromorphic networks that provide high-accuracy inference thanks to the binary resistance states of magnetic tunnel junctions (MTJs)
We performed the first demonstration of a neuromorphic network directly implemented with MTJ synapses, for both inference and spike-timing-dependent plasticity learning.
We also demonstrated through simulation that the proposed system for unsupervised Hebbian learning with STT-MTJ synapses can achieve competitive accuracies for MNIST handwritten digit recognition.
arXiv Detail & Related papers (2023-08-21T19:58:44Z)
- The Expressive Leaky Memory Neuron: an Efficient and Expressive Phenomenological Neuron Model Can Solve Long-Horizon Tasks [64.08042492426992]
We introduce the Expressive Leaky Memory (ELM) neuron model, a biologically inspired model of a cortical neuron.
Our ELM neuron can accurately match the aforementioned input-output relationship with under ten thousand trainable parameters.
We evaluate it on various tasks with demanding temporal structures, including the Long Range Arena (LRA) datasets.
arXiv Detail & Related papers (2023-06-14T13:34:13Z)
- DriPP: Driven Point Processes to Model Stimuli Induced Patterns in M/EEG Signals [62.997667081978825]
We develop a novel statistical point process model called driven temporal point processes (DriPP).
We derive a fast and principled expectation-maximization (EM) algorithm to estimate the parameters of this model.
Results on standard MEG datasets demonstrate that our methodology reveals event-related neural responses.
arXiv Detail & Related papers (2021-12-08T13:07:21Z)
- Mapping and Validating a Point Neuron Model on Intel's Neuromorphic Hardware Loihi [77.34726150561087]
We investigate the potential of Intel's fifth-generation neuromorphic chip, Loihi.
Loihi is based on the novel idea of Spiking Neural Networks (SNNs) emulating the neurons in the brain.
We find that Loihi replicates classical simulations very efficiently and scales notably well in terms of both time and energy performance as the networks get larger.
arXiv Detail & Related papers (2021-09-22T16:52:51Z)
- Spatiotemporal Spike-Pattern Selectivity in Single Mixed-Signal Neurons with Balanced Synapses [0.27998963147546135]
Mixed-signal neuromorphic processors could be used for inference and learning.
We show how inhomogeneous synaptic circuits could be utilized for resource-efficient implementation of network layers.
arXiv Detail & Related papers (2021-06-10T12:04:03Z)
- Continuous Learning and Adaptation with Membrane Potential and Activation Threshold Homeostasis [91.3755431537592]
This paper presents the Membrane Potential and Activation Threshold Homeostasis (MPATH) neuron model.
The model allows neurons to maintain a form of dynamic equilibrium by automatically regulating their activity when presented with input.
Experiments demonstrate the model's ability to adapt to and continually learn from its input.
arXiv Detail & Related papers (2021-04-22T04:01:32Z)
- Neuromorphic Algorithm-hardware Codesign for Temporal Pattern Learning [11.781094547718595]
We derive an efficient training algorithm for Leaky Integrate and Fire neurons, which is capable of training an SNN to learn complex spatiotemporal patterns.
We have developed a CMOS circuit implementation for a memristor-based network of neuron and synapses which retains critical neural dynamics with reduced complexity.
arXiv Detail & Related papers (2021-04-21T18:23:31Z)
- Controllable reset behavior in domain wall-magnetic tunnel junction artificial neurons for task-adaptable computation [1.4505273244528207]
Domain wall-magnetic tunnel junction (DW-MTJ) devices have been shown to be able to intrinsically capture biological neuron behavior.
We show that edgy-relaxed behavior can be implemented in DW-MTJ artificial neurons via three alternative mechanisms.
arXiv Detail & Related papers (2021-01-08T16:50:29Z)