Pattern recognition using spiking antiferromagnetic neurons
- URL: http://arxiv.org/abs/2308.09071v3
- Date: Tue, 30 Jul 2024 17:28:56 GMT
- Title: Pattern recognition using spiking antiferromagnetic neurons
- Authors: Hannah Bradley, Steven Louis, Andrei Slavin, Vasyl Tyberkevych
- Abstract summary: We train an artificial neural network of AFM neurons to perform pattern recognition.
A simple machine learning algorithm called spike pattern association neuron (SPAN), which relies on the temporal position of neuron spikes, is used during training.
In under a microsecond of physical time, the AFM neural network is trained to recognize symbols composed from a grid by producing a spike within a specified time window.
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Spintronic devices offer a promising avenue for the development of nanoscale, energy-efficient artificial neurons for neuromorphic computing. It has previously been shown that with antiferromagnetic (AFM) oscillators, ultra-fast spiking artificial neurons can be made that mimic many unique features of biological neurons. In this work, we train an artificial neural network of AFM neurons to perform pattern recognition. A simple machine learning algorithm called spike pattern association neuron (SPAN), which relies on the temporal position of neuron spikes, is used during training. In under a microsecond of physical time, the AFM neural network is trained to recognize symbols composed from a grid by producing a spike within a specified time window. We further achieve multi-symbol recognition with the addition of an output layer to suppress undesirable spikes. Through the utilization of AFM neurons and the SPAN algorithm, we create a neural network capable of high-accuracy recognition with overall power consumption on the order of picojoules.
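The SPAN rule described above adapts the Widrow-Hoff rule to spike trains by convolving each train with a kernel and comparing the desired output trace against the actual one. A minimal numerical sketch of a SPAN-style weight update, assuming an alpha-shaped kernel; the parameter values (`tau`, `lr`) and spike times are illustrative, not taken from the paper:

```python
import numpy as np

def spike_trace(spike_times, t, tau=5.0):
    """Convolve a spike train with an alpha kernel to obtain an analog trace."""
    trace = np.zeros_like(t)
    for ts in spike_times:
        dt = t - ts
        mask = dt > 0
        trace[mask] += (dt[mask] / tau) * np.exp(1.0 - dt[mask] / tau)
    return trace

def span_delta_w(input_spikes, desired_spikes, actual_spikes, t, lr=0.01):
    """Widrow-Hoff style update on traces:
    dw_i ~ lr * integral of x_i(t) * (y_desired(t) - y_actual(t)) dt."""
    step = t[1] - t[0]
    err = spike_trace(desired_spikes, t) - spike_trace(actual_spikes, t)
    return np.array([lr * np.sum(spike_trace(s, t) * err) * step
                     for s in input_spikes])

t = np.linspace(0.0, 100.0, 1001)      # time grid in ms
inputs = [[10.0], [30.0]]              # one presynaptic spike per input channel
dw = span_delta_w(inputs, desired_spikes=[35.0], actual_spikes=[60.0], t=t)
# the input spiking closest to the desired output time gets the larger increase
```

Because the update depends only on the temporal overlap of traces, it shifts the output spike toward the desired time window, which is exactly the behavior the abstract describes for symbol recognition.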
Related papers
- Contrastive Learning in Memristor-based Neuromorphic Systems [55.11642177631929]
Spiking neural networks have become an important family of neuron-based models that sidestep many of the key limitations facing modern-day backpropagation-trained deep networks.
In this work, we design and investigate a proof-of-concept instantiation of contrastive-signal-dependent plasticity (CSDP), a neuromorphic form of forward-forward-based, backpropagation-free learning.
arXiv Detail & Related papers (2024-09-17T04:48:45Z)
- Expressivity of Spiking Neural Networks [15.181458163440634]
We study the capabilities of spiking neural networks where information is encoded in the firing time of neurons.
In contrast to ReLU networks, we prove that spiking neural networks can realize both continuous and discontinuous functions.
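Firing-time (latency) coding of the kind studied here can be illustrated with a toy encoder in which stronger inputs spike earlier; the linear mapping below is purely illustrative and not the paper's construction:

```python
import numpy as np

def latency_encode(x, t_max=10.0):
    """Time-to-first-spike coding: larger values fire earlier.
    Inputs in (0, 1] map linearly to spike times; zero inputs never spike."""
    x = np.asarray(x, dtype=float)
    times = np.full(x.shape, np.inf)       # inf marks a silent channel
    pos = x > 0
    times[pos] = t_max * (1.0 - x[pos])    # illustrative linear latency code
    return times

spikes = latency_encode([1.0, 0.5, 0.25, 0.0])
# strongest input fires first; the zero input stays silent
```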
arXiv Detail & Related papers (2023-08-16T08:45:53Z)
- The Expressive Leaky Memory Neuron: an Efficient and Expressive Phenomenological Neuron Model Can Solve Long-Horizon Tasks [64.08042492426992]
We introduce the Expressive Leaky Memory (ELM) neuron model, a biologically inspired model of a cortical neuron.
Our ELM neuron can accurately match the aforementioned input-output relationship with under ten thousand trainable parameters.
We evaluate it on various tasks with demanding temporal structures, including the Long Range Arena (LRA) datasets.
arXiv Detail & Related papers (2023-06-14T13:34:13Z)
- Complex Dynamic Neurons Improved Spiking Transformer Network for Efficient Automatic Speech Recognition [8.998797644039064]
The spiking neural network (SNN) using leaky integrate-and-fire (LIF) neurons has been commonly used in automatic speech recognition (ASR) tasks.
Here we introduce four types of neuronal dynamics to post-process the sequential patterns generated from the spiking transformer.
We found that the DyTr-SNN handles the non-toy automatic speech recognition task well, achieving a lower phoneme error rate, lower computational cost, and higher robustness.
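The LIF neuron underlying these SNNs has simple dynamics: the membrane potential leaks toward rest, integrates input current, and emits a spike with a reset when it crosses a threshold. A minimal Euler-integration sketch, with illustrative parameter values:

```python
import numpy as np

def simulate_lif(current, dt=1.0, tau=20.0, v_th=1.0, v_reset=0.0):
    """Euler integration of a leaky integrate-and-fire neuron:
    tau * dv/dt = -v + I(t); spike and reset when v crosses v_th."""
    v, spike_times, trace = 0.0, [], []
    for step, i_t in enumerate(current):
        v += dt / tau * (-v + i_t)
        if v >= v_th:
            spike_times.append(step * dt)
            v = v_reset                    # hard reset after each spike
        trace.append(v)
    return np.array(trace), spike_times

# a constant suprathreshold current produces regular spiking
trace, spike_times = simulate_lif(np.full(200, 1.5))
```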
arXiv Detail & Related papers (2023-02-02T16:20:27Z)
- Constraints on the design of neuromorphic circuits set by the properties of neural population codes [61.15277741147157]
In the brain, information is encoded, transmitted and used to inform behaviour.
Neuromorphic circuits need to encode information in a way compatible with that used by populations of neurons in the brain.
arXiv Detail & Related papers (2022-12-08T15:16:04Z)
- Spiking neural network for nonlinear regression [68.8204255655161]
Spiking neural networks carry the potential for a massive reduction in memory and energy consumption.
They introduce temporal and neuronal sparsity, which can be exploited by next-generation neuromorphic hardware.
A framework for regression using spiking neural networks is proposed.
arXiv Detail & Related papers (2022-10-06T13:04:45Z)
- Low Power Neuromorphic EMG Gesture Classification [3.8761525368152725]
Spiking Neural Networks (SNNs) are promising for low-power, real-time EMG gesture recognition.
We present a low-power, high-accuracy demonstration of EMG-signal-based gesture recognition using neuromorphic Recurrent Spiking Neural Networks (RSNNs).
Our network achieves state-of-the-art classification accuracy (90%) while using 53% less than the best reported art on the Roshambo EMG dataset.
arXiv Detail & Related papers (2022-06-04T22:09:34Z)
- POPPINS: A Population-Based Digital Spiking Neuromorphic Processor with Integer Quadratic Integrate-and-Fire Neurons [50.591267188664666]
We propose a population-based digital spiking neuromorphic processor in 180 nm process technology with two hierarchical populations.
The proposed approach enables the development of biomimetic neuromorphic systems and various low-power, low-latency inference applications.
arXiv Detail & Related papers (2022-01-19T09:26:34Z)
- Spatiotemporal Spike-Pattern Selectivity in Single Mixed-Signal Neurons with Balanced Synapses [0.27998963147546135]
Mixed-signal neuromorphic processors could be used for inference and learning.
We show how inhomogeneous synaptic circuits could be utilized for resource-efficient implementation of network layers.
arXiv Detail & Related papers (2021-06-10T12:04:03Z)
- Neuromorphic Algorithm-hardware Codesign for Temporal Pattern Learning [11.781094547718595]
We derive an efficient training algorithm for leaky integrate-and-fire neurons, which is capable of training an SNN to learn complex spatiotemporal patterns.
We have developed a CMOS circuit implementation for a memristor-based network of neurons and synapses which retains critical neural dynamics with reduced complexity.
arXiv Detail & Related papers (2021-04-21T18:23:31Z)
- Non-linear Neurons with Human-like Apical Dendrite Activations [81.18416067005538]
We show that a standard neuron followed by our novel apical dendrite activation (ADA) can learn the XOR logical function with 100% accuracy.
We conduct experiments on six benchmark data sets from computer vision, signal processing and natural language processing.
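The key idea is that a non-monotonic activation lets a single neuron separate XOR, which no monotonic activation (such as ReLU) can do. The sketch below uses an illustrative Gaussian-bump activation, not the paper's exact ADA definition:

```python
import math

def bump(z, center=1.0, width=0.1):
    """Illustrative non-monotonic activation (not the paper's exact ADA):
    responds strongly only when the pre-activation is near `center`."""
    return math.exp(-((z - center) ** 2) / width)

def xor_neuron(x1, x2):
    """A single neuron with fixed unit weights; the non-monotonic activation
    fires only when exactly one input is active, implementing XOR."""
    z = 1.0 * x1 + 1.0 * x2          # weighted sum, both weights set to 1
    return 1 if bump(z) > 0.5 else 0
```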
arXiv Detail & Related papers (2020-02-02T21:09:39Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information and is not responsible for any consequences.