Event-Driven Tactile Learning with Various Location Spiking Neurons
- URL: http://arxiv.org/abs/2210.04277v2
- Date: Tue, 11 Oct 2022 02:34:25 GMT
- Title: Event-Driven Tactile Learning with Various Location Spiking Neurons
- Authors: Peng Kang, Srutarshi Banerjee, Henry Chopp, Aggelos Katsaggelos,
Oliver Cossairt
- Abstract summary: Event-driven learning is still in its infancy due to the limited representation abilities of existing spiking neurons.
We propose a novel "location spiking neuron" model, which enables us to extract features of event-based data in a new way.
By exploiting these location spiking neurons, we propose several models to capture the complex spatio-temporal dependencies in event-driven tactile data.
- Score: 5.822511654546528
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Tactile sensing is essential for a variety of daily tasks. New advances in
event-driven tactile sensors and Spiking Neural Networks (SNNs) spur the
research in related fields. However, SNN-enabled event-driven tactile learning
is still in its infancy due to the limited representation abilities of existing
spiking neurons and high spatio-temporal complexity in the data. In this paper,
to improve the representation capability of existing spiking neurons, we
propose a novel neuron model called "location spiking neuron", which enables us
to extract features of event-based data in a novel way. Specifically, based on
the classical Time Spike Response Model (TSRM), we develop the Location Spike
Response Model (LSRM). In addition, based on the most commonly-used Time Leaky
Integrate-and-Fire (TLIF) model, we develop the Location Leaky
Integrate-and-Fire (LLIF) model. By exploiting the novel location spiking
neurons, we propose several models to capture the complex spatio-temporal
dependencies in the event-driven tactile data. Extensive experiments
demonstrate the significant improvements of our models over other works on
event-driven tactile learning and show the superior energy efficiency of our
models and location spiking neurons, which may unlock their potential on
neuromorphic hardware.
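For intuition, here is a minimal sketch (not the authors' implementation; all constants, shapes, and names are illustrative) contrasting a time-domain LIF update with the location-domain variant the paper proposes: a TLIF neuron leak-integrates its input over time steps at a fixed taxel, while an LLIF neuron leak-integrates over taxel locations at a fixed time step.

```python
import numpy as np

def tlif_spikes(x, tau=0.9, v_th=1.0):
    """Time LIF (TLIF): for each taxel, leak-integrate over time steps.
    x: array of shape (T, N) -- T time steps, N taxel locations."""
    T, N = x.shape
    v = np.zeros(N)
    spikes = np.zeros((T, N))
    for t in range(T):
        v = tau * v + x[t]          # leak, then integrate this time step
        fired = v >= v_th
        spikes[t] = fired
        v[fired] = 0.0              # hard reset after a spike
    return spikes

def llif_spikes(x, tau=0.9, v_th=1.0):
    """Location LIF (LLIF, sketched): for each time step, leak-integrate
    over a fixed ordering of taxel locations instead of over time."""
    T, N = x.shape
    v = np.zeros(T)
    spikes = np.zeros((T, N))
    for n in range(N):              # 'location' plays the role of 'time'
        v = tau * v + x[:, n]
        fired = v >= v_th
        spikes[:, n] = fired
        v[fired] = 0.0
    return spikes

events = (np.random.rand(50, 16) < 0.2).astype(float)  # toy tactile events
print(tlif_spikes(events).sum(), llif_spikes(events).sum())
```

Swapping the roles of the temporal and spatial axes is what lets a location-spiking pathway extract per-location features that a purely time-driven neuron would miss.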
Related papers
- Autaptic Synaptic Circuit Enhances Spatio-temporal Predictive Learning of Spiking Neural Networks [23.613277062707844]
Spiking Neural Networks (SNNs) emulate the leaky integrate-and-fire mechanism found in biological neurons.
Existing SNNs predominantly rely on the Leaky Integrate-and-Fire (LIF) model.
This paper proposes a novel SpatioTemporal Circuit (STC) model.
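For reference, the textbook leaky integrate-and-fire dynamics behind the LIF model (standard form, not quoted from this paper) are:

```latex
\tau_m \frac{dV(t)}{dt} = -\bigl(V(t) - V_{\mathrm{rest}}\bigr) + R\,I(t),
\qquad V(t) \ge V_{\mathrm{th}} \;\Rightarrow\; \text{spike, then } V(t) \leftarrow V_{\mathrm{reset}}.
```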
arXiv Detail & Related papers (2024-06-01T11:17:27Z)
- WaLiN-GUI: a graphical and auditory tool for neuron-based encoding [73.88751967207419]
Neuromorphic computing relies on spike-based, energy-efficient communication.
We develop a tool to identify suitable configurations for neuron-based encoding of sample-based data into spike trains.
The WaLiN-GUI is provided open source and with documentation.
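As one concrete example of what neuron-based encoding of sample-based data can look like (a current-driven LIF rate encoder; the neuron model, gain, and constants below are assumptions, not the GUI's defaults):

```python
import numpy as np

def lif_encode(samples, tau=0.95, gain=0.5, v_th=1.0, steps=100):
    """Encode each real-valued sample into a spike train by driving a
    LIF neuron with a constant current proportional to the sample."""
    samples = np.asarray(samples, dtype=float)
    v = np.zeros_like(samples)
    train = np.zeros((steps, samples.size))
    for t in range(steps):
        v = tau * v + gain * samples   # leak, then inject the sample current
        fired = v >= v_th
        train[t] = fired
        v[fired] = 0.0                 # reset after a spike
    return train

print(lif_encode([0.2, 0.5, 1.0]).sum(axis=0))  # spike counts per sample
```

Larger sample values reach threshold sooner, so the spike count grows monotonically with input magnitude; that monotone map is what makes this a rate encoding.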
arXiv Detail & Related papers (2023-10-25T20:34:08Z)
- The Expressive Leaky Memory Neuron: an Efficient and Expressive Phenomenological Neuron Model Can Solve Long-Horizon Tasks [64.08042492426992]
We introduce the Expressive Leaky Memory (ELM) neuron model, a biologically inspired model of a cortical neuron.
Our ELM neuron can accurately match the aforementioned input-output relationship with under ten thousand trainable parameters.
We evaluate it on various tasks with demanding temporal structures, including the Long Range Arena (LRA) datasets.
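The exact ELM architecture is specified in the paper; the following is only a loose sketch of the idea its name suggests, i.e. several leaky memory traces with individual decay rates feeding a small nonlinearity (all shapes, names, and constants here are assumptions):

```python
import numpy as np

def elm_like_step(x_t, m, decay, W_in, W_rec, W_out):
    """One step of a leaky-memory sketch: memory traces decay at their
    own rates and are nonlinearly mixed with the current input."""
    pre = W_in @ x_t + W_rec @ m                  # mix input and memories
    m = decay * m + (1.0 - decay) * np.tanh(pre)  # per-trace leaky update
    return m, W_out @ m                           # new memories and readout

rng = np.random.default_rng(0)
d_in, d_mem = 4, 16
m = np.zeros(d_mem)
decay = rng.uniform(0.5, 0.99, d_mem)             # per-memory time constants
W_in = rng.normal(size=(d_mem, d_in))
W_rec = rng.normal(size=(d_mem, d_mem))
W_out = rng.normal(size=(1, d_mem))
for _ in range(10):                               # run on a toy input stream
    m, y = elm_like_step(rng.normal(size=d_in), m, decay, W_in, W_rec, W_out)
print(y)
```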
arXiv Detail & Related papers (2023-06-14T13:34:13Z)
- Event-Driven Tactile Learning with Location Spiking Neurons [5.822511654546528]
Spiking Neural Networks (SNNs) enable event-driven tactile learning.
We develop a novel neuron model called the "location spiking neuron".
We show the superior energy efficiency of our models over other works on event-driven learning.
arXiv Detail & Related papers (2022-07-23T12:15:43Z)
- Simple and complex spiking neurons: perspectives and analysis in a simple STDP scenario [0.7829352305480283]
Spiking neural networks (SNNs) are inspired by biology and neuroscience to create fast and efficient learning systems.
This work surveys neuron models in the literature and selects computational models that are single-variable, efficient, and exhibit different degrees of complexity.
We make a comparative study of three simple I&F neuron models, namely the LIF, the Quadratic I&F (QIF) and the Exponential I&F (EIF), to understand whether the use of more complex models increases the performance of the system.
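For reference, the three single-variable membrane equations under comparison, in their textbook forms (not quoted from the paper):

```latex
\text{LIF:} \quad \tau \frac{dV}{dt} = -(V - V_{\mathrm{rest}}) + R\,I(t) \\
\text{QIF:} \quad \tau \frac{dV}{dt} = a\,(V - V_{\mathrm{rest}})(V - V_c) + R\,I(t) \\
\text{EIF:} \quad \tau \frac{dV}{dt} = -(V - V_{\mathrm{rest}})
  + \Delta_T \exp\!\left(\frac{V - V_T}{\Delta_T}\right) + R\,I(t)
```

QIF and EIF differ from LIF only in the nonlinearity that drives the spike upswing, which is precisely the extra complexity whose benefit the study measures.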
arXiv Detail & Related papers (2022-06-28T10:01:51Z)
- STNDT: Modeling Neural Population Activity with a Spatiotemporal Transformer [19.329190789275565]
We introduce SpatioTemporal Neural Data Transformer (STNDT), an NDT-based architecture that explicitly models responses of individual neurons.
We show that our model achieves state-of-the-art ensemble-level performance in estimating neural activity across four neural datasets.
arXiv Detail & Related papers (2022-06-09T18:54:23Z)
- Event-based Video Reconstruction via Potential-assisted Spiking Neural Network [48.88510552931186]
Bio-inspired neural networks can potentially lead to greater computational efficiency on event-driven hardware.
We propose a novel Event-based Video reconstruction framework based on a fully Spiking Neural Network (EVSNN).
We find that the spiking neurons have the potential to store useful temporal information (memory) to complete such time-dependent tasks.
arXiv Detail & Related papers (2022-01-25T02:05:20Z)
- Continuous Learning and Adaptation with Membrane Potential and Activation Threshold Homeostasis [91.3755431537592]
This paper presents the Membrane Potential and Activation Threshold Homeostasis (MPATH) neuron model.
The model allows neurons to maintain a form of dynamic equilibrium by automatically regulating their activity when presented with input.
Experiments demonstrate the model's ability to adapt to and continually learn from its input.
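MPATH's precise dynamics are defined in the paper; as a generic, hedged sketch of one homeostatic ingredient its title names, here is a LIF neuron whose threshold rises after each spike and relaxes back to baseline (all constants illustrative):

```python
import numpy as np

def homeostatic_lif(inputs, tau_v=0.9, tau_th=0.99, th0=1.0, bump=0.5):
    """LIF neuron with an adaptive threshold: each spike raises the
    threshold, which then decays back toward its baseline th0."""
    v, th, spikes = 0.0, th0, []
    for x in inputs:
        v = tau_v * v + x
        if v >= th:
            spikes.append(1)
            v = 0.0
            th += bump                   # spiking makes further firing harder
        else:
            spikes.append(0)
        th = th0 + tau_th * (th - th0)   # threshold relaxes toward baseline
    return spikes

print(sum(homeostatic_lif(np.random.rand(200))))  # regulated spike count
```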
arXiv Detail & Related papers (2021-04-22T04:01:32Z)
- Neuromorphic Algorithm-hardware Codesign for Temporal Pattern Learning [11.781094547718595]
We derive an efficient training algorithm for Leaky Integrate-and-Fire neurons, which is capable of training an SNN to learn complex spatio-temporal patterns.
We have developed a CMOS circuit implementation for a memristor-based network of neurons and synapses which retains critical neural dynamics with reduced complexity.
arXiv Detail & Related papers (2021-04-21T18:23:31Z)
- Flexible Transmitter Network [84.90891046882213]
Current neural networks are mostly built upon the McCulloch-Pitts (MP) model, which usually formulates the neuron as executing an activation function on the real-valued weighted aggregation of signals received from other neurons.
We propose the Flexible Transmitter (FT) model, a novel bio-plausible neuron model with flexible synaptic plasticity.
We present the Flexible Transmitter Network (FTNet), which is built on the most common fully-connected feed-forward architecture.
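For reference, the MP template described above is simply

```latex
y = f\Bigl(\sum_i w_i x_i + b\Bigr),
```

where the x_i are incoming signals, the w_i real-valued weights, b a bias, and f the activation function; the FT model departs from this single real-valued aggregation with a more flexible, plasticity-aware parameterization (see the paper for its exact form).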
arXiv Detail & Related papers (2020-04-08T06:55:12Z)
- Non-linear Neurons with Human-like Apical Dendrite Activations [81.18416067005538]
We show that a standard neuron followed by our novel apical dendrite activation (ADA) can learn the XOR logical function with 100% accuracy.
We conduct experiments on six benchmark data sets from computer vision, signal processing and natural language processing.
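The ADA function itself is defined in the paper; as a hedged illustration of why a single neuron with a non-monotonic activation can represent XOR at all, here is a toy neuron using a Gaussian bump in place of ADA (not the authors' function):

```python
import numpy as np

def xor_neuron(x1, x2):
    """One 'neuron': z = x1 + x2, then a bump activation peaking at z = 1.
    XOR is true exactly when one input is on, i.e. when z == 1."""
    z = x1 + x2                            # weights (1, 1), bias 0
    return np.exp(-(z - 1.0) ** 2) > 0.5   # non-monotonic activation

for a in (0, 1):
    for b in (0, 1):
        print(a, b, int(xor_neuron(a, b)))  # prints the XOR truth table
```

A monotonic activation (sigmoid, ReLU) cannot do this with a single unit, which is why XOR is the classic linear-separability counterexample.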
arXiv Detail & Related papers (2020-02-02T21:09:39Z)
This list is automatically generated from the titles and abstracts of the papers on this site.