Event-Based Angular Velocity Regression with Spiking Networks
- URL: http://arxiv.org/abs/2003.02790v1
- Date: Thu, 5 Mar 2020 17:37:16 GMT
- Title: Event-Based Angular Velocity Regression with Spiking Networks
- Authors: Mathias Gehrig, Sumit Bam Shrestha, Daniel Mouritzen and Davide
Scaramuzza
- Abstract summary: Spiking Neural Networks (SNNs) process information conveyed as temporal spikes rather than numeric values.
We propose, for the first time, a temporal regression problem of numerical values given events from an event camera.
We show that we can successfully train an SNN to perform angular velocity regression.
- Score: 51.145071093099396
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Spiking Neural Networks (SNNs) are bio-inspired networks that process
information conveyed as temporal spikes rather than numeric values. A spiking
neuron of an SNN only produces a spike whenever a significant number of spikes
occur within a short period of time. Due to their spike-based computational
model, SNNs can process output from event-based, asynchronous sensors without
any pre-processing at extremely low power, unlike standard artificial neural
networks. This is possible due to specialized neuromorphic hardware that
implements the highly-parallelizable concept of SNNs in silicon. Yet, SNNs have
not enjoyed the same rise of popularity as artificial neural networks. This not
only stems from the fact that their input format is rather unconventional but
also from the challenges in training spiking networks. Despite their temporal
nature and recent algorithmic advances, they have been mostly evaluated on
classification problems. We propose, for the first time, a temporal regression
problem of numerical values given events from an event camera. We specifically
investigate the prediction of the 3-DOF angular velocity of a rotating event
camera with an SNN. The difficulty of this problem arises from the prediction
of angular velocities continuously in time directly from irregular,
asynchronous event-based input. Directly utilising the output of event cameras
without any pre-processing ensures that we inherit all the benefits that they
provide over conventional cameras: high temporal resolution, high dynamic
range, and no motion blur. To assess the performance of SNNs on
this task, we introduce a synthetic event camera dataset generated from
real-world panoramic images and show that we can successfully train an SNN to
perform angular velocity regression.
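To make the mechanism described above concrete, the sketch below shows, in plain Python/NumPy, the core idea: leaky integrate-and-fire (LIF) neurons fire only when enough input spikes arrive within a short window, and a linear readout over their spike traces produces a continuous 3-DOF angular-velocity estimate from binned event input. This is an illustrative toy under stated assumptions, not the network, training method, or dataset used in the paper; all names, shapes, and constants are hypothetical.

```python
# Minimal, hypothetical sketch (not the paper's architecture): one layer of
# leaky integrate-and-fire (LIF) neurons driven by binned event-camera spikes,
# plus a linear readout of their spike traces regressing 3-DOF angular velocity.
import numpy as np

rng = np.random.default_rng(0)

N_IN = 64 * 64        # flattened event-pixel inputs (assumed resolution)
N_HID = 128           # hidden LIF neurons
DT = 1e-3             # simulation step [s]
TAU_MEM = 20e-3       # membrane time constant [s]
V_TH = 1.0            # spike threshold

W_in = rng.normal(0, 0.1, (N_HID, N_IN))   # input -> hidden weights (random here)
W_out = rng.normal(0, 0.1, (3, N_HID))     # hidden trace -> angular velocity

def run_snn(event_frames):
    """event_frames: (T, N_IN) binary spike tensor built by binning raw events.
    Returns a (T, 3) sequence of angular-velocity estimates, one per time step."""
    decay = np.exp(-DT / TAU_MEM)
    v = np.zeros(N_HID)          # membrane potentials
    trace = np.zeros(N_HID)      # low-pass filtered spike trace used for readout
    omega = np.zeros((len(event_frames), 3))
    for t, x in enumerate(event_frames):
        # Leaky integration: a neuron spikes only if enough input spikes
        # arrive within a short window, before the potential leaks away.
        v = decay * v + W_in @ x
        spikes = (v >= V_TH).astype(float)
        v = np.where(spikes > 0, 0.0, v)     # reset after spiking
        trace = decay * trace + spikes
        omega[t] = W_out @ trace             # continuous-time 3-DOF readout
    return omega

# Toy usage: 100 ms of random events binned at 1 kHz.
events = (rng.random((100, N_IN)) < 0.02).astype(float)
print(run_snn(events)[-1])   # latest angular-velocity estimate [rad/s]
```

In the paper the weights would be learned end-to-end through the spiking dynamics; here they are random only to keep the example self-contained.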
Related papers
- SkipSNN: Efficiently Classifying Spike Trains with Event-attention [29.639889737632842]
Spike train classification has recently become an important topic in the machine learning community.
A promising model for it should follow the design principle of performing intensive computation only when signals of interest appear.
This paper introduces an event-attention mechanism that enables SNNs to dynamically highlight useful signals of the original spike trains.
arXiv Detail & Related papers (2024-10-29T03:19:25Z)
- Fast-SNN: Fast Spiking Neural Network by Converting Quantized ANN [38.18008827711246]
Spiking neural networks (SNNs) have shown advantages in computation and energy efficiency.
It remains a challenge to train deep SNNs due to the discrete spike function.
This paper proposes Fast-SNN that achieves high performance with low latency.
arXiv Detail & Related papers (2023-05-31T14:04:41Z)
- MSS-DepthNet: Depth Prediction with Multi-Step Spiking Neural Network [8.53512216864715]
Spiking neural networks are a novel event-based computational paradigm considered well suited for processing event camera tasks.
This work proposes a spiking neural network architecture that combines a novel residual block design with multi-dimensional attention modules.
The model outperforms previous ANNs of the same size on the MVSEC dataset and shows great computational efficiency.
arXiv Detail & Related papers (2022-11-22T10:35:36Z)
- Training High-Performance Low-Latency Spiking Neural Networks by Differentiation on Spike Representation [70.75043144299168]
Spiking Neural Network (SNN) is a promising energy-efficient AI model when implemented on neuromorphic hardware.
It is a challenge to efficiently train SNNs due to their non-differentiability.
We propose the Differentiation on Spike Representation (DSR) method, which could achieve high performance.
arXiv Detail & Related papers (2022-05-01T12:44:49Z)
- AEGNN: Asynchronous Event-based Graph Neural Networks [54.528926463775946]
Event-based Graph Neural Networks generalize standard GNNs to process events as "evolving" spatio-temporal graphs.
AEGNNs are easily trained on synchronous inputs and can be converted to efficient, "asynchronous" networks at test time.
arXiv Detail & Related papers (2022-03-31T16:21:12Z)
- Event-based Video Reconstruction via Potential-assisted Spiking Neural Network [48.88510552931186]
Bio-inspired neural networks can potentially lead to greater computational efficiency on event-driven hardware.
We propose a novel Event-based Video reconstruction framework based on a fully Spiking Neural Network (EVSNN).
We find that the spiking neurons have the potential to store useful temporal information (memory) to complete such time-dependent tasks.
arXiv Detail & Related papers (2022-01-25T02:05:20Z)
- Training Feedback Spiking Neural Networks by Implicit Differentiation on the Equilibrium State [66.2457134675891]
Spiking neural networks (SNNs) are brain-inspired models that enable energy-efficient implementation on neuromorphic hardware.
Most existing methods imitate the backpropagation framework and feedforward architectures for artificial neural networks.
We propose a novel training method that does not rely on the exact reverse of the forward computation.
arXiv Detail & Related papers (2021-09-29T07:46:54Z)
- Fully Spiking Variational Autoencoder [66.58310094608002]
Spiking neural networks (SNNs) can be run on neuromorphic devices with ultra-high speed and ultra-low energy consumption.
In this study, we build a variational autoencoder (VAE) with SNN to enable image generation.
arXiv Detail & Related papers (2021-09-26T06:10:14Z)
- SpikeMS: Deep Spiking Neural Network for Motion Segmentation [7.491944503744111]
SpikeMS is the first deep encoder-decoder SNN architecture for the real-world large-scale problem of motion segmentation.
We show that SpikeMS is capable of incremental predictions, or predictions from smaller amounts of test data than it is trained on.
arXiv Detail & Related papers (2021-05-13T21:34:55Z)
- Spiking Neural Networks with Single-Spike Temporal-Coded Neurons for Network Intrusion Detection [6.980076213134383]
Spiking neural networks (SNNs) are interesting due to their strong bio-plausibility and high energy efficiency.
However, their performance falls far behind that of conventional deep neural networks (DNNs).
arXiv Detail & Related papers (2020-10-15T14:46:18Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the content (including all information) and is not responsible for any consequences.