Machine learning using magnetic stochastic synapses
- URL: http://arxiv.org/abs/2303.01886v1
- Date: Fri, 3 Mar 2023 12:33:29 GMT
- Title: Machine learning using magnetic stochastic synapses
- Authors: Matthew O. A. Ellis, Alex Welbourne, Stephan J. Kyle, Paul W. Fry, Dan
A. Allwood, Thomas J. Hayward and Eleni Vasilaki
- Abstract summary: We present a methodology for exploiting the traditionally detrimental stochastic effects in magnetic domain-wall motion in nanowires.
We demonstrate functional binary synapses alongside a gradient learning rule that allows their training with applicability to a range of systems.
For single measurements, the rule results in binary synapses with minimal stochasticity, sacrificing potential performance for robustness.
This observation allows us to choose design principles depending on the desired performance and the device's operational speed and energy cost.
- Score: 0.9236074230806579
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: The impressive performance of artificial neural networks has come at the cost
of high energy usage and CO$_2$ emissions. Unconventional computing
architectures, with magnetic systems as a candidate, have potential as
alternative energy-efficient hardware but still face challenges, such as
stochastic behaviour, in implementation. Here, we present a methodology for
exploiting the traditionally detrimental stochastic effects in magnetic
domain-wall motion in nanowires. We demonstrate functional binary stochastic
synapses alongside a gradient learning rule that allows their training with
applicability to a range of stochastic systems. The rule, utilising the mean
and variance of the neuronal output distribution, finds a trade-off between
synaptic stochasticity and energy efficiency depending on the number of
measurements of each synapse. For single measurements, the rule results in
binary synapses with minimal stochasticity, sacrificing potential performance
for robustness. For multiple measurements, synaptic distributions are broad,
approximating better-performing continuous synapses. This observation allows us
to choose design principles depending on the desired performance and the
device's operational speed and energy cost. We verify performance on physical
hardware, showing it is comparable to a standard neural network.
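The abstract's gradient rule for binary stochastic synapses can be sketched in a few lines. The following is an illustrative, hypothetical reconstruction, not the paper's exact rule: it assumes Bernoulli synapses that switch on with probability p, uses the analytic mean of the output distribution (x @ p) with a squared-error loss as a stand-in for the mean-and-variance rule, and shows how averaging repeated measurements approximates a continuous synapse.

```python
import numpy as np

rng = np.random.default_rng(0)

def stochastic_forward(x, p, n_measurements=1):
    """One inference with binary stochastic synapses.

    Each synapse independently reads +1 with probability p (else 0)
    on every measurement; averaging n measurements approximates a
    continuous synapse with weight p.
    """
    samples = rng.binomial(1, p, size=(n_measurements,) + p.shape)
    return x @ samples.mean(axis=0)

def train_step(x, y, p, lr=0.05):
    """Gradient step on the switching probabilities p.

    Uses the mean of the output distribution, x @ p, under a
    squared-error loss; probabilities are clipped back to [0, 1]
    so each synapse stays a valid Bernoulli variable.
    """
    y_hat = x @ p
    grad = x.T @ (y_hat - y) / len(x)
    return np.clip(p - lr * grad, 0.0, 1.0)

# Fit toward a known, realisable set of target probabilities.
x = rng.normal(size=(8, 4))
target = np.array([[0.9, 0.1], [0.2, 0.8], [0.7, 0.3], [0.1, 0.6]])
y = x @ target
p = np.full((4, 2), 0.5)
for _ in range(2000):
    p = train_step(x, y, p)
```

With a single measurement the forward pass is strictly binary; with many measurements its average concentrates on x @ p, mirroring the abstract's trade-off between measurement count and how closely the synapse approximates a continuous weight.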
Related papers
- Single Neuromorphic Memristor closely Emulates Multiple Synaptic
Mechanisms for Energy Efficient Neural Networks [71.79257685917058]
We demonstrate memristive nano-devices based on SrTiO$_3$ that inherently emulate all these synaptic functions.
These memristors operate in a non-filamentary, low conductance regime, which enables stable and energy efficient operation.
arXiv Detail & Related papers (2024-02-26T15:01:54Z) - Learning with Chemical versus Electrical Synapses -- Does it Make a
Difference? [61.85704286298537]
Bio-inspired neural networks have the potential to advance our understanding of neural computation and improve the state-of-the-art of AI systems.
We conduct experiments with autonomous lane-keeping through a photorealistic autonomous driving simulator to evaluate their performance under diverse conditions.
arXiv Detail & Related papers (2023-11-21T13:07:20Z) - Neuromorphic Hebbian learning with magnetic tunnel junction synapses [41.92764939721262]
We propose and experimentally demonstrate neuromorphic networks that provide high-accuracy inference thanks to the binary resistance states of magnetic tunnel junctions (MTJs)
We performed the first demonstration of a neuromorphic network directly implemented with MTJ synapses, for both inference and spike-timing-dependent plasticity learning.
We also demonstrated through simulation that the proposed system for unsupervised Hebbian learning with STT-MTJ synapses can achieve competitive accuracies for MNIST handwritten digit recognition.
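The spike-timing-dependent plasticity mentioned in this summary can be illustrated with a standard pair-based update. This sketch is generic, not taken from the paper, and the amplitude and time-constant values are arbitrary placeholders:

```python
import numpy as np

def stdp_update(w, dt, a_plus=0.05, a_minus=0.055, tau=20.0):
    """Pair-based STDP on weights w in [0, 1].

    dt = t_post - t_pre in ms: pre-before-post (dt > 0) potentiates,
    post-before-pre (dt <= 0) depresses, with an exponential
    dependence on |dt|. Clipping mimics the bounded conductance
    range of a physical synapse.
    """
    dw = np.where(dt > 0,
                  a_plus * np.exp(-np.abs(dt) / tau),
                  -a_minus * np.exp(-np.abs(dt) / tau))
    return np.clip(w + dw, 0.0, 1.0)

# A causal pair (+5 ms) strengthens; an anti-causal pair (-5 ms) weakens.
w = stdp_update(np.array([0.5, 0.5]), dt=np.array([5.0, -5.0]))
```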
arXiv Detail & Related papers (2023-08-21T19:58:44Z) - Shape-Dependent Multi-Weight Magnetic Artificial Synapses for
Neuromorphic Computing [4.567086462167893]
In neuromorphic computing, artificial synapses provide a multi-weight conductance state that is set based on inputs from neurons, analogous to the brain.
Here, we measure artificial synapses based on magnetic materials that use a magnetic tunnel junction and a magnetic domain wall.
arXiv Detail & Related papers (2021-11-22T20:27:14Z) - Optimal input representation in neural systems at the edge of chaos [0.0]
We build an artificial neural network and train it to classify images.
We find that the best performance in such a task is obtained when the network operates near the critical point.
We conclude that operating near criticality can have -- besides the usually alleged virtues -- the advantage of allowing for flexible, robust and efficient input representations.
arXiv Detail & Related papers (2021-07-12T19:55:03Z) - Neural Sampling Machine with Stochastic Synapse allows Brain-like
Learning and Inference [6.138129592577736]
We introduce a new class of NN called Neural Sampling Machine (NSM) that exploits stochasticity in synaptic connections for approximate Bayesian inference.
We experimentally show that the inherent switching of the selector element between the insulator and metallic state introduces a multiplicative noise within the synapses of NSM.
We report a standard image classification task as well as estimation of data uncertainty in rotated samples.
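The multiplicative synaptic noise described above can be illustrated with a small Monte-Carlo sketch. The Gaussian noise model and its scale here are assumptions for illustration, not the device's measured statistics:

```python
import numpy as np

rng = np.random.default_rng(1)

def nsm_forward(x, w, sigma=0.3, n_samples=500):
    """Forward passes with multiplicative weight noise.

    Every pass scales each weight by an independent factor drawn
    from N(1, sigma), so repeating the pass yields a predictive
    distribution; its spread gives a simple uncertainty estimate,
    analogous to the data-uncertainty estimation in the summary.
    """
    outs = np.stack([x @ (w * rng.normal(1.0, sigma, size=w.shape))
                     for _ in range(n_samples)])
    return outs.mean(axis=0), outs.std(axis=0)

x = np.ones((1, 3))
w = np.ones((3, 2))
mean, spread = nsm_forward(x, w)
```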
arXiv Detail & Related papers (2021-02-20T23:45:24Z) - And/or trade-off in artificial neurons: impact on adversarial robustness [91.3755431537592]
Presence of sufficient number of OR-like neurons in a network can lead to classification brittleness and increased vulnerability to adversarial attacks.
We define AND-like neurons and propose measures to increase their proportion in the network.
Experimental results on the MNIST dataset suggest that our approach holds promise as a direction for further exploration.
arXiv Detail & Related papers (2021-02-15T08:19:05Z) - Variational Monte Carlo calculations of $\mathbf{A\leq 4}$ nuclei with
an artificial neural-network correlator ansatz [62.997667081978825]
We introduce a neural-network quantum state ansatz to model the ground-state wave function of light nuclei.
We compute the binding energies and point-nucleon densities of $A\leq 4$ nuclei as emerging from a leading-order pionless effective field theory Hamiltonian.
arXiv Detail & Related papers (2020-07-28T14:52:28Z) - Optimal Learning with Excitatory and Inhibitory synapses [91.3755431537592]
I study the problem of storing associations between analog signals in the presence of correlations.
I characterize the typical learning performance in terms of the power spectrum of random input and output processes.
arXiv Detail & Related papers (2020-05-25T18:25:54Z) - Towards Efficient Processing and Learning with Spikes: New Approaches
for Multi-Spike Learning [59.249322621035056]
We propose two new multi-spike learning rules which demonstrate better performance over other baselines on various tasks.
In the feature detection task, we re-examine the ability of unsupervised STDP with its limitations being presented.
Our proposed learning rules can reliably solve the task over a wide range of conditions without specific constraints being applied.
arXiv Detail & Related papers (2020-05-02T06:41:20Z)
This list is automatically generated from the titles and abstracts of the papers in this site.