Spatiotemporal Spike-Pattern Selectivity in Single Mixed-Signal Neurons
with Balanced Synapses
- URL: http://arxiv.org/abs/2106.05686v1
- Date: Thu, 10 Jun 2021 12:04:03 GMT
- Title: Spatiotemporal Spike-Pattern Selectivity in Single Mixed-Signal Neurons
with Balanced Synapses
- Authors: Mattias Nilsson, Foteini Liwicki, and Fredrik Sandin
- Abstract summary: Mixed-signal neuromorphic processors could be used for inference and learning.
We show how inhomogeneous synaptic circuits could be utilized for resource-efficient implementation of network layers.
- Score: 0.27998963147546135
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Realizing the potential of mixed-signal neuromorphic processors for
ultra-low-power inference and learning requires efficient use of their
inhomogeneous analog circuitry as well as sparse, time-based information
encoding and processing. Here, we investigate spike-timing-based spatiotemporal
receptive fields of output-neurons in the Spatiotemporal Correlator (STC)
network, for which we used excitatory-inhibitory balanced disynaptic inputs
instead of dedicated axonal or neuronal delays. We present hardware-in-the-loop
experiments with a mixed-signal DYNAP-SE neuromorphic processor, in which
five-dimensional receptive fields of hardware neurons were mapped by randomly
sampling input spike-patterns from a uniform distribution. We find that, when
the balanced disynaptic elements are randomly programmed, some of the neurons
display distinct receptive fields. Furthermore, we demonstrate how a neuron was
tuned to detect a particular spatiotemporal feature, to which it initially was
non-selective, by activating a different subset of the inhomogeneous analog
synaptic circuits. The energy dissipation of the balanced synaptic elements is
one order of magnitude lower per lateral connection (0.65 nJ vs 9.3 nJ per
spike) than former delay-based neuromorphic hardware implementations. Thus, we
show how the inhomogeneous synaptic circuits could be utilized for
resource-efficient implementation of STC network layers, in a way that enables
synapse-address reprogramming as a discrete mechanism for feature tuning.
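The receptive-field mapping procedure described in the abstract can be illustrated with a toy simulation. This is a minimal sketch, not the authors' method: it assumes each of the five input channels reaches a point neuron through a balanced excitatory-inhibitory disynaptic element whose EPSP-IPSP offset acts as an effective delay, and it maps the receptive field by uniformly sampling five-dimensional spike-time patterns and keeping those that drive the summed response above a threshold. All parameters and kernel shapes are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in for balanced disynaptic elements: each channel's
# EPSP-IPSP pair produces an effective per-channel delay, so the
# neuron responds most strongly when input spikes, after their
# delays, coincide.  Offsets and time constants are arbitrary.
n_channels = 5
offsets = rng.uniform(0.0, 50.0, n_channels)   # effective delays (ms)
tau = 5.0                                       # PSP time constant (ms)

def psp(t):
    """Difference of a fast EPSP and a slightly later IPSP (toy kernel)."""
    e = np.where(t >= 0.0, np.exp(-t / tau), 0.0)
    i = np.where(t >= 2.0, np.exp(-(t - 2.0) / tau), 0.0)
    return e - i

def peak_response(spike_times, t_grid):
    """Peak summed membrane deflection for one 5-D input spike pattern."""
    v = np.zeros_like(t_grid)
    for ch, ts in enumerate(spike_times):
        v += psp(t_grid - ts - offsets[ch])
    return v.max()

t_grid = np.arange(0.0, 200.0, 0.1)

# Map the receptive field: sample spike patterns uniformly, as in the
# paper's protocol, and keep the supra-threshold ones.
patterns = rng.uniform(0.0, 50.0, size=(2000, n_channels))
responses = np.array([peak_response(p, t_grid) for p in patterns])
threshold = np.percentile(responses, 95)
selective = patterns[responses > threshold]
print(f"{len(selective)} of {len(patterns)} patterns are supra-threshold")
```

The supra-threshold patterns cluster around spike-time vectors that compensate the per-channel delays, which is the sense in which the neuron is selective to a particular spatiotemporal feature; reprogramming `offsets` corresponds to activating a different subset of the analog synaptic circuits.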
Related papers
- Contrastive Learning in Memristor-based Neuromorphic Systems [55.11642177631929]
Spiking neural networks have become an important family of neuron-based models that sidestep many of the key limitations facing modern-day backpropagation-trained deep networks.
In this work, we design and investigate a proof-of-concept instantiation of contrastive-signal-dependent plasticity (CSDP), a neuromorphic form of forward-forward-based, backpropagation-free learning.
arXiv Detail & Related papers (2024-09-17T04:48:45Z)
- Synchronized Stepwise Control of Firing and Learning Thresholds in a Spiking Randomly Connected Neural Network toward Hardware Implementation [0.0]
We propose hardware-oriented models of intrinsic plasticity (IP) and synaptic plasticity (SP) for a spiking randomly connected neural network (RNN).
We demonstrate the effectiveness of our model through simulations of temporal data learning and anomaly detection with a spiking RNN using publicly available electrocardiograms.
arXiv Detail & Related papers (2024-04-26T08:26:10Z)
- Neuromorphic Hebbian learning with magnetic tunnel junction synapses [41.92764939721262]
We propose and experimentally demonstrate neuromorphic networks that provide high-accuracy inference thanks to the binary resistance states of magnetic tunnel junctions (MTJs).
We performed the first demonstration of a neuromorphic network directly implemented with MTJ synapses, for both inference and spike-timing-dependent plasticity learning.
We also demonstrated through simulation that the proposed system for unsupervised Hebbian learning with STT-MTJ synapses can achieve competitive accuracies for MNIST handwritten digit recognition.
arXiv Detail & Related papers (2023-08-21T19:58:44Z)
- The Expressive Leaky Memory Neuron: an Efficient and Expressive Phenomenological Neuron Model Can Solve Long-Horizon Tasks [64.08042492426992]
We introduce the Expressive Leaky Memory (ELM) neuron model, a biologically inspired model of a cortical neuron.
Our ELM neuron can accurately match the aforementioned input-output relationship with under ten thousand trainable parameters.
We evaluate it on various tasks with demanding temporal structures, including the Long Range Arena (LRA) datasets.
arXiv Detail & Related papers (2023-06-14T13:34:13Z)
- Efficient Neuromorphic Signal Processing with Loihi 2 [6.32784133039548]
We show how Resonate-and-Fire (RF) neurons can be used to compute the Short Time Fourier Transform (STFT) with similar computational complexity but 47x less output bandwidth than the conventional STFT.
We also demonstrate promising preliminary results using backpropagation to train RF neurons for audio classification tasks.
arXiv Detail & Related papers (2021-11-05T22:37:05Z)
- Deep Metric Learning with Locality Sensitive Angular Loss for Self-Correcting Source Separation of Neural Spiking Signals [77.34726150561087]
We propose a methodology based on deep metric learning to address the need for automated post-hoc cleaning and robust separation filters.
We validate this method with an artificially corrupted label set based on source-separated high-density surface electromyography recordings.
This approach enables a neural network to learn to accurately decode neurophysiological time series using any imperfect method of labelling the signal.
arXiv Detail & Related papers (2021-10-13T21:51:56Z)
- Astrocytes mediate analogous memory in a multi-layer neuron-astrocytic network [52.77024349608834]
We show how a piece of information can be maintained as a robust activity pattern for several seconds then completely disappear if no other stimuli come.
This kind of short-term memory can keep operative information for seconds, then completely forget it to avoid overlapping with forthcoming patterns.
We show how arbitrary patterns can be loaded, then stored for a certain interval of time, and retrieved if the appropriate clue pattern is applied to the input.
arXiv Detail & Related papers (2021-08-31T16:13:15Z)
- Multi-Tones' Phase Coding (MTPC) of Interaural Time Difference by Spiking Neural Network [68.43026108936029]
We propose a pure spiking neural network (SNN) based computational model for precise sound localization in the noisy real-world environment.
We implement this algorithm in a real-time robotic system with a microphone array.
The experiment results show a mean error azimuth of 13 degrees, which surpasses the accuracy of the other biologically plausible neuromorphic approach for sound source localization.
arXiv Detail & Related papers (2020-07-07T08:22:56Z)
- Optimal Learning with Excitatory and Inhibitory synapses [91.3755431537592]
I study the problem of storing associations between analog signals in the presence of correlations.
I characterize the typical learning performance in terms of the power spectrum of random input and output processes.
arXiv Detail & Related papers (2020-05-25T18:25:54Z)
- Synaptic Integration of Spatiotemporal Features with a Dynamic Neuromorphic Processor [0.1529342790344802]
We show that a single point-neuron with dynamic synapses in the DYNAP-SE can respond selectively to presynaptic spikes with a particular spatiotemporal structure.
This structure enables, for instance, visual feature tuning of single neurons.
arXiv Detail & Related papers (2020-02-12T11:26:35Z)
- Structural plasticity on an accelerated analog neuromorphic hardware system [0.46180371154032884]
We present a strategy to achieve structural plasticity by constantly rewiring the pre- and postsynaptic partners.
We implemented this algorithm on the analog neuromorphic system BrainScaleS-2.
We evaluated our implementation in a simple supervised learning scenario, showing its ability to optimize the network topology.
arXiv Detail & Related papers (2019-12-27T10:15:58Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.