Stochastic Domain Wall-Magnetic Tunnel Junction Artificial Neurons for
Noise-Resilient Spiking Neural Networks
- URL: http://arxiv.org/abs/2304.04794v1
- Date: Mon, 10 Apr 2023 18:00:26 GMT
- Title: Stochastic Domain Wall-Magnetic Tunnel Junction Artificial Neurons for
Noise-Resilient Spiking Neural Networks
- Authors: Thomas Leonard, Samuel Liu, Harrison Jin, and Jean Anne C. Incorvia
- Abstract summary: We present a scaled DW-MTJ neuron with voltage-dependent firing probability.
Validation accuracy during training was also shown to be comparable to that of an ideal leaky integrate-and-fire device.
This work shows that DW-MTJ devices can be used to construct noise-resilient networks suitable for neuromorphic computing on the edge.
- Score: 0.0
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: The spatiotemporal nature of neuronal behavior in spiking neural networks (SNNs) makes SNNs promising for edge applications that require high energy
efficiency. To realize SNNs in hardware, spintronic neuron implementations can
bring advantages of scalability and energy efficiency. Domain wall (DW) based
magnetic tunnel junction (MTJ) devices are well suited for probabilistic neural
networks given their intrinsic integrate-and-fire behavior with tunable
stochasticity. Here, we present a scaled DW-MTJ neuron with voltage-dependent
firing probability. The measured behavior was used to simulate an SNN that attains accuracy during learning comparable to that of an equivalent, but more complicated, multi-weight (MW) DW-MTJ device. The validation accuracy during training was also shown to be comparable to that of an ideal leaky integrate-and-fire (LIF) device. However, during inference, the binary DW-MTJ neuron outperformed the other devices after Gaussian noise was introduced to the Fashion-MNIST
classification task. This work shows that DW-MTJ devices can be used to
construct noise-resilient networks suitable for neuromorphic computing on the
edge.
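As a rough illustration of the comparison described in this abstract, the sketch below contrasts a binary stochastic neuron whose firing probability depends on its integrated voltage with an ideal leaky integrate-and-fire (LIF) neuron, driving both with the same input before and after additive Gaussian noise. The sigmoid probability curve, its parameters, and the rate-coded toy input are illustrative assumptions, not the measured DW-MTJ device behavior or the paper's Fashion-MNIST setup.

```python
# Minimal sketch (Python/NumPy) of the two neuron models compared in the abstract.
# All constants below are illustrative placeholders, not measured DW-MTJ values.
import numpy as np

rng = np.random.default_rng(0)

V50, SLOPE = 1.0, 5.0  # assumed sigmoid fit of firing probability vs. integrated voltage
LEAK = 0.9             # assumed per-step leak factor for the LIF reference
V_TH = 1.0             # LIF firing threshold

def stochastic_neuron_step(v, i_in):
    """Integrate the input, then fire with a voltage-dependent probability (binary output)."""
    v = v + i_in
    p_fire = 1.0 / (1.0 + np.exp(-SLOPE * (v - V50)))  # tunable stochasticity
    spike = rng.random(v.shape) < p_fire
    v = np.where(spike, 0.0, v)                        # reset the integrated state on firing
    return v, spike.astype(float)

def lif_step(v, i_in):
    """Ideal leaky integrate-and-fire reference neuron."""
    v = LEAK * v + i_in
    spike = v >= V_TH
    v = np.where(spike, 0.0, v)
    return v, spike.astype(float)

# Toy noise test in the spirit of the inference experiment: drive both neuron
# types with the same rate-coded input, with and without additive Gaussian noise.
T, N = 100, 256
clean = rng.uniform(0.0, 0.5, size=(T, N))
noisy = clean + rng.normal(0.0, 0.2, size=(T, N))

for name, drive in (("clean", clean), ("noisy", noisy)):
    v_s, v_l = np.zeros(N), np.zeros(N)
    rate_s = rate_l = 0.0
    for t in range(T):
        v_s, s = stochastic_neuron_step(v_s, drive[t])
        v_l, l = lif_step(v_l, drive[t])
        rate_s += s.mean()
        rate_l += l.mean()
    print(f"{name}: stochastic firing rate = {rate_s / T:.3f}, LIF firing rate = {rate_l / T:.3f}")
```

In the full experiments these neuron models sit inside an SNN trained on Fashion-MNIST; the point here is only the qualitative contrast between probabilistic binary firing and deterministic threshold crossing.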
Related papers
- Neuromorphic Wireless Split Computing with Multi-Level Spikes [69.73249913506042]
In neuromorphic computing, spiking neural networks (SNNs) perform inference tasks, offering significant efficiency gains for workloads involving sequential data.
Recent advances in hardware and software have demonstrated that embedding a few bits of payload in each spike exchanged between the spiking neurons can further enhance inference accuracy.
This paper investigates a wireless neuromorphic split computing architecture employing multi-level SNNs (a toy sketch of multi-level spike encoding follows this list).
arXiv Detail & Related papers (2024-11-07T14:08:35Z) - Scalable Mechanistic Neural Networks [52.28945097811129]
We propose an enhanced neural network framework designed for scientific machine learning applications involving long temporal sequences.
By reformulating the original Mechanistic Neural Network (MNN), we reduce the computational time and space complexities from cubic and quadratic with respect to the sequence length, respectively, to linear.
Extensive experiments demonstrate that S-MNN matches the original MNN in precision while substantially reducing computational resources.
arXiv Detail & Related papers (2024-10-08T14:27:28Z) - Single Neuromorphic Memristor closely Emulates Multiple Synaptic
Mechanisms for Energy Efficient Neural Networks [71.79257685917058]
We demonstrate memristive nano-devices based on SrTiO3 that inherently emulate all these synaptic functions.
These memristors operate in a non-filamentary, low conductance regime, which enables stable and energy efficient operation.
arXiv Detail & Related papers (2024-02-26T15:01:54Z) - Neuromorphic Hebbian learning with magnetic tunnel junction synapses [41.92764939721262]
We propose and experimentally demonstrate neuromorphic networks that provide high-accuracy inference thanks to the binary resistance states of magnetic tunnel junctions (MTJs)
We performed the first demonstration of a neuromorphic network directly implemented with MTJ synapses, for both inference and spike-timing-dependent plasticity learning.
We also demonstrated through simulation that the proposed system for unsupervised Hebbian learning with STT-MTJ synapses can achieve competitive accuracies for MNIST handwritten digit recognition.
arXiv Detail & Related papers (2023-08-21T19:58:44Z) - A Hybrid Neural Coding Approach for Pattern Recognition with Spiking
Neural Networks [53.31941519245432]
Brain-inspired spiking neural networks (SNNs) have demonstrated promising capabilities in solving pattern recognition tasks.
These SNNs are grounded on homogeneous neurons that utilize a uniform neural coding for information representation.
In this study, we argue that SNN architectures should be holistically designed to incorporate heterogeneous coding schemes.
arXiv Detail & Related papers (2023-05-26T02:52:12Z) - Energy Efficient Learning with Low Resolution Stochastic Domain Wall
Synapse Based Deep Neural Networks [0.9176056742068814]
We demonstrate that extremely low-resolution quantized (nominally 5-state) synapses with large variations in Domain Wall (DW) position can both be energy efficient and achieve reasonably high testing accuracies.
We show that, by implementing suitable modifications to the learning algorithms, we can address this stochastic behavior as well as the effect of the synapses' low resolution and still achieve high testing accuracies.
arXiv Detail & Related papers (2021-11-14T09:12:29Z) - Controllable reset behavior in domain wall-magnetic tunnel junction
artificial neurons for task-adaptable computation [1.4505273244528207]
Domain wall-magnetic tunnel junction (DW-MTJ) devices have been shown to be able to intrinsically capture biological neuron behavior.
We show that edgy-relaxed behavior can be implemented in DW-MTJ artificial neurons via three alternative mechanisms.
arXiv Detail & Related papers (2021-01-08T16:50:29Z) - Compiling Spiking Neural Networks to Mitigate Neuromorphic Hardware
Constraints [0.30458514384586394]
Spiking Neural Networks (SNNs) are efficient computation models for pattern recognition on resource- and power-constrained platforms.
SNNs executed on neuromorphic hardware can further reduce energy consumption of these platforms.
arXiv Detail & Related papers (2020-11-27T19:10:23Z) - Multi-Tones' Phase Coding (MTPC) of Interaural Time Difference by
Spiking Neural Network [68.43026108936029]
We propose a pure spiking neural network (SNN) based computational model for precise sound localization in the noisy real-world environment.
We implement this algorithm in a real-time robotic system with a microphone array.
The experimental results show a mean azimuth error of 13 degrees, which surpasses the accuracy of the other biologically plausible neuromorphic approach for sound source localization.
arXiv Detail & Related papers (2020-07-07T08:22:56Z) - Progressive Tandem Learning for Pattern Recognition with Deep Spiking
Neural Networks [80.15411508088522]
Spiking neural networks (SNNs) have shown advantages over traditional artificial neural networks (ANNs) for low latency and high computational efficiency.
We propose a novel ANN-to-SNN conversion and layer-wise learning framework for rapid and efficient pattern recognition.
arXiv Detail & Related papers (2020-07-02T15:38:44Z) - Training of Quantized Deep Neural Networks using a Magnetic Tunnel
Junction-Based Synapse [23.08163992580639]
Quantized neural networks (QNNs) are being actively researched as a solution for the computational complexity and memory intensity of deep neural networks.
We show how magnetic tunnel junction (MTJ) devices can be used to support QNN training.
We introduce a novel synapse circuit that uses the MTJ behavior to support the quantized update.
arXiv Detail & Related papers (2019-12-29T11:36:32Z)
The related papers list above is automatically generated from the titles and abstracts of the papers on this site.