Event-Driven Tactile Learning with Location Spiking Neurons
- URL: http://arxiv.org/abs/2209.01080v1
- Date: Sat, 23 Jul 2022 12:15:43 GMT
- Title: Event-Driven Tactile Learning with Location Spiking Neurons
- Authors: Peng Kang, Srutarshi Banerjee, Henry Chopp, Aggelos Katsaggelos,
Oliver Cossairt
- Abstract summary: Spiking Neural Networks (SNNs) enable event-driven tactile learning.
We develop a novel neuron model called the "location spiking neuron".
We show the superior energy efficiency of our models over other works on event-driven learning.
- Score: 5.822511654546528
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: The sense of touch is essential for a variety of daily tasks. New advances in
event-based tactile sensors and Spiking Neural Networks (SNNs) spur the
research in event-driven tactile learning. However, SNN-enabled event-driven
tactile learning is still in its infancy due to the limited representative
abilities of existing spiking neurons and high spatio-temporal complexity in
the data. In this paper, to improve the representative capabilities of existing
spiking neurons, we propose a novel neuron model called "location spiking
neuron", which enables us to extract features of event-based data in a novel
way. Moreover, based on the classical Time Spike Response Model (TSRM), we
develop a specific location spiking neuron model - Location Spike Response
Model (LSRM) that serves as a new building block of SNNs. Furthermore, we
propose a hybrid model which combines an SNN with TSRM neurons and an SNN with
LSRM neurons to capture the complex spatio-temporal dependencies in the data.
Extensive experiments demonstrate the significant improvements of our models
over other works on event-driven tactile learning and show the superior energy
efficiency of our models and location spiking neurons, which may unlock their
potential on neuromorphic hardware.
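The abstract leaves the neuron dynamics implicit, but the contrast between time-domain (TSRM) and location-domain (LSRM) spiking can be made concrete. The Python sketch below is an illustrative approximation under assumed details, not the paper's implementation: the exponential response kernel, the trace reset in place of a refractory term, the 39-taxel toy tensor, and all weights are assumptions. A TSRM-style neuron sweeps the event data over time with one weight per taxel location, while an LSRM-style neuron sweeps the same data over taxel locations with one weight per time step.

    import numpy as np

    def srm_sweep(spike_train, weights, tau=3.0, theta=1.0):
        # Sweep a simplified spike-response neuron along axis 0 of a binary spike array.
        # spike_train: (steps, channels) 0/1 array; weights: (channels,).
        steps, channels = spike_train.shape
        out = np.zeros(steps, dtype=int)
        trace = np.zeros(channels)            # per-channel exponential kernel state
        decay = np.exp(-1.0 / tau)
        for s in range(steps):
            trace = trace * decay + spike_train[s]
            u = weights @ trace               # membrane potential at this step
            if u >= theta:                    # fire; trace reset stands in for refractoriness
                out[s] = 1
                trace[:] = 0.0
        return out

    # Toy event-based tactile sample: 39 taxel locations x 100 time steps (assumed sizes).
    rng = np.random.default_rng(0)
    events = (rng.random((39, 100)) < 0.05).astype(int)

    w_time = rng.normal(0.0, 0.3, size=39)      # TSRM-style: one weight per taxel
    w_loc = rng.normal(0.0, 0.3, size=100)      # LSRM-style: one weight per time step

    tsrm_spikes = srm_sweep(events.T, w_time)   # sweep over time (channels = locations)
    lsrm_spikes = srm_sweep(events, w_loc)      # sweep over locations (channels = time steps)
    print(tsrm_spikes.sum(), lsrm_spikes.sum())

The paper's hybrid model combines a network of TSRM neurons with a network of LSRM neurons; here the two sweep directions are simply run side by side and their spike counts printed.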
Related papers
- Enhancing learning in spiking neural networks through neuronal heterogeneity and neuromodulatory signaling [52.06722364186432]
We propose a biologically-informed framework for enhancing artificial neural networks (ANNs).
Our proposed dual-framework approach highlights the potential of spiking neural networks (SNNs) for emulating diverse spiking behaviors.
We outline how the proposed approach integrates brain-inspired compartmental models and task-driven SNNs, balancing bioinspiration and complexity.
arXiv Detail & Related papers (2024-07-05T14:11:28Z) - Co-learning synaptic delays, weights and adaptation in spiking neural
networks [0.0]
Spiking neural networks (SNNs) distinguish themselves from artificial neural networks (ANNs) because of their inherent temporal processing and spike-based computations.
We show that data processing with spiking neurons can be enhanced by co-learning the connection weights with two other biologically inspired neuronal features.
arXiv Detail & Related papers (2023-09-12T09:13:26Z) - The Expressive Leaky Memory Neuron: an Efficient and Expressive Phenomenological Neuron Model Can Solve Long-Horizon Tasks [64.08042492426992]
We introduce the Expressive Leaky Memory (ELM) neuron model, a biologically inspired model of a cortical neuron.
Our ELM neuron can accurately match the input-output relationship of a detailed cortical neuron model with under ten thousand trainable parameters.
We evaluate it on various tasks with demanding temporal structures, including the Long Range Arena (LRA) datasets.
arXiv Detail & Related papers (2023-06-14T13:34:13Z) - Event-Driven Tactile Learning with Various Location Spiking Neurons [5.822511654546528]
Event-driven learning is still in its infancy due to the limited representation abilities of existing spiking neurons.
We propose a novel "location spiking neuron" model, which enables us to extract features of event-based data in a novel way.
By exploiting the novel location spiking neurons, we propose several models to capture complex tactile-temporal dependencies in the event-driven data.
arXiv Detail & Related papers (2022-10-09T14:49:27Z) - A Synapse-Threshold Synergistic Learning Approach for Spiking Neural
Networks [1.8556712517882232]
Spiking neural networks (SNNs) have demonstrated excellent capabilities in various intelligent scenarios.
In this study, we develop a novel synergistic learning approach that involves simultaneously training synaptic weights and spike thresholds in SNNs.
arXiv Detail & Related papers (2022-06-10T06:41:36Z) - SIT: A Bionic and Non-Linear Neuron for Spiking Neural Network [12.237928453571636]
Spiking Neural Networks (SNNs) have piqued researchers' interest because of their capacity to process temporal information and low power consumption.
Current state-of-the-art methods limit their biological plausibility and performance because their neurons are generally built on the simple Leaky Integrate-and-Fire (LIF) model.
Due to the high level of dynamic complexity, modern neuron models have seldom been implemented in SNN practice.
arXiv Detail & Related papers (2022-03-30T07:50:44Z) - Event-based Video Reconstruction via Potential-assisted Spiking Neural
Network [48.88510552931186]
Bio-inspired neural networks can potentially lead to greater computational efficiency on event-driven hardware.
We propose a novel Event-based Video reconstruction framework based on a fully Spiking Neural Network (EVSNN).
We find that the spiking neurons have the potential to store useful temporal information (memory) to complete such time-dependent tasks.
arXiv Detail & Related papers (2022-01-25T02:05:20Z) - BackEISNN: A Deep Spiking Neural Network with Adaptive Self-Feedback and
Balanced Excitatory-Inhibitory Neurons [8.956708722109415]
Spiking neural networks (SNNs) transmit information through discrete spikes, which makes them well suited to processing spatio-temporal information.
We propose a deep spiking neural network with adaptive self-feedback and balanced excitatory and inhibitory neurons (BackEISNN).
For the MNIST, FashionMNIST, and N-MNIST datasets, our model has achieved state-of-the-art performance.
arXiv Detail & Related papers (2021-05-27T08:38:31Z) - Neuroevolution of a Recurrent Neural Network for Spatial and Working
Memory in a Simulated Robotic Environment [57.91534223695695]
We evolved weights in a biologically plausible recurrent neural network (RNN) using an evolutionary algorithm to replicate the behavior and neural activity observed in rats.
Our method demonstrates how the dynamic activity in evolved RNNs can capture interesting and complex cognitive behavior.
arXiv Detail & Related papers (2021-02-25T02:13:52Z) - Flexible Transmitter Network [84.90891046882213]
Current neural networks are mostly built upon the McCulloch-Pitts (MP) model, which usually formulates the neuron as executing an activation function on the real-valued weighted aggregation of signals received from other neurons.
We propose the Flexible Transmitter (FT) model, a novel bio-plausible neuron model with flexible synaptic plasticity.
We present the Flexible Transmitter Network (FTNet), which is built on the most common fully-connected feed-forward architecture.
arXiv Detail & Related papers (2020-04-08T06:55:12Z) - Non-linear Neurons with Human-like Apical Dendrite Activations [81.18416067005538]
We show that a standard artificial neuron followed by our novel apical dendrite activation (ADA) can learn the XOR logical function with 100% accuracy; a minimal single-neuron sketch of this idea follows the list below.
We conduct experiments on six benchmark data sets from computer vision, signal processing and natural language processing.
arXiv Detail & Related papers (2020-02-02T21:09:39Z)
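To make the single-neuron XOR claim above concrete, here is a minimal Python sketch. The Gaussian bump used below is a generic non-monotonic stand-in, not the paper's actual ADA definition, and the weights are hand-picked rather than learned; it only shows why a non-monotonic activation lets one neuron separate XOR.

    import numpy as np

    def bump(z, width=0.1):
        # Generic non-monotonic activation (an assumed stand-in for the paper's ADA).
        return np.exp(-(z ** 2) / width)

    X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
    y = np.array([0, 1, 1, 0])          # XOR targets

    w = np.array([1.0, 1.0])            # hand-picked weights for a single neuron
    b = -1.0                            # pre-activations become -1, 0, 0, 1

    pred = (bump(X @ w + b) > 0.5).astype(int)
    print(pred, bool((pred == y).all()))   # [0 1 1 0] True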
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.