Efficient Neuromorphic Signal Processing with Loihi 2
- URL: http://arxiv.org/abs/2111.03746v1
- Date: Fri, 5 Nov 2021 22:37:05 GMT
- Title: Efficient Neuromorphic Signal Processing with Loihi 2
- Authors: Garrick Orchard, E. Paxon Frady, Daniel Ben Dayan Rubin, Sophia
Sanborn, Sumit Bam Shrestha, Friedrich T. Sommer, and Mike Davies
- Abstract summary: We show how Resonate-and-Fire (RF) neurons can be used to compute the Short Time Fourier Transform (STFT) with similar computational complexity but 47x less output bandwidth than the conventional STFT.
We also demonstrate promising preliminary results using backpropagation to train RF neurons for audio classification tasks.
- Score: 6.32784133039548
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: The biologically inspired spiking neurons used in neuromorphic computing are
nonlinear filters with dynamic state variables -- very different from the
stateless neuron models used in deep learning. The next version of Intel's
neuromorphic research processor, Loihi 2, supports a wide range of stateful
spiking neuron models with fully programmable dynamics. Here we showcase
advanced spiking neuron models that can be used to efficiently process
streaming data in simulation experiments on emulated Loihi 2 hardware. In one
example, Resonate-and-Fire (RF) neurons are used to compute the Short Time
Fourier Transform (STFT) with similar computational complexity but 47x less
output bandwidth than the conventional STFT. In another example, we describe an
algorithm for optical flow estimation using spatiotemporal RF neurons that
requires over 90x fewer operations than a conventional DNN-based solution. We
also demonstrate promising preliminary results using backpropagation to train
RF neurons for audio classification tasks. Finally, we show that a cascade of
Hopf resonators - a variant of the RF neuron - replicates novel properties of
the cochlea and motivates an efficient spike-based spectrogram encoder.
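For intuition, an RF neuron behaves like a damped complex oscillator: its state rotates at a preferred frequency while integrating input, so a bank of such neurons accumulates exponentially windowed Fourier coefficients and only emits spikes when a coefficient becomes significant. The sketch below illustrates this view; the decay factor, frequency bank, and magnitude-threshold firing rule are illustrative assumptions, not the paper's Loihi 2 implementation.

```python
import numpy as np

def rf_bank(x, freqs, fs, decay=0.999, threshold=50.0):
    """Sketch of a Resonate-and-Fire (RF) neuron bank.

    Each neuron keeps a complex state z that rotates at its own
    frequency while leaking, so |z| tracks one coefficient of an
    exponentially windowed STFT. Emitting spikes only on threshold
    crossings is what reduces output bandwidth versus a dense STFT.
    """
    # One complex rotation factor per neuron/frequency bin (assumed values).
    rot = decay * np.exp(2j * np.pi * np.asarray(freqs) / fs)
    z = np.zeros(len(freqs), dtype=complex)
    events = []
    for t, sample in enumerate(x):
        z = rot * z + sample                    # leaky complex integration
        fired = np.flatnonzero(np.abs(z) > threshold)
        if fired.size:                          # sparse, event-based output
            events.append((t, fired))
    return events

# Example: a 440 Hz tone mostly excites the matching neuron.
fs = 16000
t = np.arange(fs) / fs
events = rf_bank(np.sin(2 * np.pi * 440 * t), [220, 440, 880], fs)
```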
Related papers
- Addressing the speed-accuracy simulation trade-off for adaptive spiking neurons [0.0]
We present an algorithmically reinterpreted adaptive leaky integrate-and-fire (ALIF) model.
We obtain over a 50x training speedup using small simulation time steps (DTs) on synthetic benchmarks.
We also showcase how our model makes it possible to quickly and accurately fit real electrophysiological recordings of cortical neurons.
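For context, a conventional discrete-time ALIF update, the kind of model being reinterpreted, fits in a few lines; the constants below are illustrative rather than the paper's fitted values.

```python
import numpy as np

def alif_step(v, a, x, alpha=0.95, rho=0.99, v_th=1.0, beta=1.6):
    """One step of a conventional adaptive LIF (ALIF) neuron.

    v: membrane potential, a: threshold adaptation, x: input current.
    The effective threshold v_th + beta*a rises after each spike and
    decays back, producing spike-frequency adaptation.
    """
    v = alpha * v + x                      # leaky integration
    spike = v > v_th + beta * a            # adaptive threshold test
    v = np.where(spike, 0.0, v)            # reset fired neurons
    a = rho * a + spike.astype(float)      # adaptation accumulates spikes
    return v, a, spike
```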
arXiv Detail & Related papers (2023-11-19T18:21:45Z)
- WaLiN-GUI: a graphical and auditory tool for neuron-based encoding [73.88751967207419]
Neuromorphic computing relies on spike-based, energy-efficient communication.
We develop a tool to identify suitable configurations for neuron-based encoding of sample-based data into spike trains.
The WaLiN-GUI is provided open source and with documentation.
arXiv Detail & Related papers (2023-10-25T20:34:08Z)
- The Expressive Leaky Memory Neuron: an Efficient and Expressive Phenomenological Neuron Model Can Solve Long-Horizon Tasks [64.08042492426992]
We introduce the Expressive Leaky Memory (ELM) neuron model, a biologically inspired model of a cortical neuron.
Our ELM neuron can accurately match the input-output relationship of a biophysically detailed cortical neuron model with under ten thousand trainable parameters.
We evaluate it on various tasks with demanding temporal structures, including the Long Range Arena (LRA) datasets.
arXiv Detail & Related papers (2023-06-14T13:34:13Z)
- An accurate and flexible analog emulation of AdEx neuron dynamics in silicon [0.0]
This manuscript presents the analog neuron circuits of the mixed-signal accelerated neuromorphic system BrainScaleS-2.
They are capable of flexibly and accurately emulating the adaptive exponential integrate-and-fire model equations in combination with both current- and conductance-based synapses.
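The AdEx model these circuits emulate is compact enough to state directly. A minimal Euler integration step using the standard textbook parameters of Brette and Gerstner (not BrainScaleS-2 calibration values) might look like:

```python
import numpy as np

def adex_step(V, w, I, dt=1e-4):
    """One Euler step of the adaptive exponential integrate-and-fire
    (AdEx) model. Constants are generic textbook values in SI units,
    not BrainScaleS-2 calibration data."""
    C, g_L, E_L = 281e-12, 30e-9, -70.6e-3       # capacitance, leak
    V_T, Delta_T = -50.4e-3, 2e-3                # exponential spike term
    a, tau_w, b = 4e-9, 144e-3, 80.5e-12         # adaptation dynamics
    V_peak, V_reset = 20e-3, -70.6e-3

    dV = (-g_L * (V - E_L)
          + g_L * Delta_T * np.exp((V - V_T) / Delta_T)
          + I - w) / C
    dw = (a * (V - E_L) - w) / tau_w
    V, w = V + dt * dV, w + dt * dw
    if V > V_peak:                   # spike: reset V, increment adaptation
        V, w = V_reset, w + b
    return V, w
```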
arXiv Detail & Related papers (2022-09-19T18:08:23Z)
- Training Feedback Spiking Neural Networks by Implicit Differentiation on the Equilibrium State [66.2457134675891]
Spiking neural networks (SNNs) are brain-inspired models that enable energy-efficient implementation on neuromorphic hardware.
Most existing methods imitate the backpropagation framework and feedforward architectures for artificial neural networks.
We propose a novel training method that does not rely on the exact reverse of the forward computation.
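The underlying idea, differentiating through the equilibrium via the implicit function theorem instead of replaying the forward computation, can be sketched for a generic fixed-point model; this is an illustration of the principle, not the paper's SNN-specific formulation.

```python
import numpy as np

def equilibrium_grad(f, df_dz, df_dtheta, dL_dz, z0, iters=100):
    """Gradient of a loss at the fixed point z* = f(z*) w.r.t. theta.

    Instead of backpropagating through every forward iteration, solve
    the linear system (I - df/dz)^T v = dL/dz once at equilibrium and
    apply dL/dtheta = v^T df/dtheta. All callables are assumed given.
    """
    z = z0
    for _ in range(iters):          # forward: iterate to the fixed point
        z = f(z)
    J = df_dz(z)                    # Jacobian of f at z* (n x n)
    v = np.linalg.solve(np.eye(len(z)) - J.T, dL_dz(z))
    return v @ df_dtheta(z)         # implicit-function-theorem chain rule
```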
arXiv Detail & Related papers (2021-09-29T07:46:54Z)
- Mapping and Validating a Point Neuron Model on Intel's Neuromorphic Hardware Loihi [77.34726150561087]
We investigate the potential of Intel's fifth-generation neuromorphic chip, Loihi.
Loihi is based on Spiking Neural Networks (SNNs), which emulate the neurons of the brain.
We find that Loihi replicates classical simulations very efficiently and scales notably well in terms of both time and energy performance as the networks get larger.
arXiv Detail & Related papers (2021-09-22T16:52:51Z)
- Spatiotemporal Spike-Pattern Selectivity in Single Mixed-Signal Neurons with Balanced Synapses [0.27998963147546135]
Mixed-signal neuromorphic processors could be used for inference and learning.
We show how inhomogeneous synaptic circuits could be utilized for resource-efficient implementation of network layers.
arXiv Detail & Related papers (2021-06-10T12:04:03Z)
- Neuromorphic Algorithm-hardware Codesign for Temporal Pattern Learning [11.781094547718595]
We derive an efficient training algorithm for Leaky Integrate-and-Fire (LIF) neurons, capable of training an SNN to learn complex spatiotemporal patterns.
We have developed a CMOS circuit implementation for a memristor-based network of neurons and synapses that retains critical neural dynamics with reduced complexity.
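The derived algorithm itself is not reproduced here, but any gradient-based SNN training scheme must work around the non-differentiable spike. A common surrogate-gradient recipe for LIF neurons (a generic construction, assumed rather than taken from the paper) looks like:

```python
import numpy as np

def lif_forward(x, alpha=0.9, v_th=1.0):
    """Forward pass of a LIF neuron over a length-T input current."""
    v, spikes, vs = 0.0, [], []
    for x_t in x:
        v = alpha * v + x_t            # leaky integration
        s = float(v > v_th)            # hard threshold (non-differentiable)
        v = v - s * v_th               # soft reset by subtraction
        spikes.append(s)
        vs.append(v)
    return np.array(spikes), np.array(vs)

def surrogate_spike_grad(v, v_th=1.0, slope=5.0):
    """Surrogate derivative of the spike w.r.t. the membrane potential:
    a fast-sigmoid pseudo-derivative stands in for the Dirac delta so
    that backpropagation through time has a usable gradient."""
    return 1.0 / (slope * np.abs(v - v_th) + 1.0) ** 2
```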
arXiv Detail & Related papers (2021-04-21T18:23:31Z)
- Flexible Transmitter Network [84.90891046882213]
Current neural networks are mostly built upon the McCulloch-Pitts (MP) model, which formulates the neuron as applying an activation function to a real-valued weighted aggregation of signals received from other neurons.
We propose the Flexible Transmitter (FT) model, a novel bio-plausible neuron model with flexible synaptic plasticity.
We present the Flexible Transmitter Network (FTNet), which is built on the most common fully-connected feed-forward architecture.
arXiv Detail & Related papers (2020-04-08T06:55:12Z)
- Non-linear Neurons with Human-like Apical Dendrite Activations [81.18416067005538]
We show that a standard neuron followed by our novel apical dendrite activation (ADA) can learn the XOR logical function with 100% accuracy.
We conduct experiments on six benchmark data sets from computer vision, signal processing and natural language processing.
arXiv Detail & Related papers (2020-02-02T21:09:39Z)
This list is automatically generated from the titles and abstracts of the papers on this site.