SpikeMS: Deep Spiking Neural Network for Motion Segmentation
- URL: http://arxiv.org/abs/2105.06562v1
- Date: Thu, 13 May 2021 21:34:55 GMT
- Title: SpikeMS: Deep Spiking Neural Network for Motion Segmentation
- Authors: Chethan M. Parameshwara, Simin Li, Cornelia Fermüller, Nitin J.
Sanket, Matthew S. Evanusa, Yiannis Aloimonos
- Abstract summary: SpikeMS is the first deep encoder-decoder SNN architecture for the real-world large-scale problem of motion segmentation.
We show that SpikeMS is capable of incremental predictions, or predictions from smaller amounts of test data than it is trained on.
- Score: 7.491944503744111
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Spiking Neural Networks (SNN) are the so-called third generation of neural
networks which attempt to more closely match the functioning of the biological
brain. They inherently encode temporal data, allow for training with lower
energy usage, and can be extremely energy efficient when implemented on neuromorphic
hardware. In addition, they are well suited for tasks involving event-based
sensors, which match the event-based nature of the SNN. However, SNNs have not
been as effectively applied to real-world, large-scale tasks as standard
Artificial Neural Networks (ANNs) due to the algorithmic and training
complexity. To exacerbate the situation further, the input representation is
unconventional and requires careful analysis and deep understanding. In this
paper, we propose \textit{SpikeMS}, the first deep encoder-decoder SNN
architecture for the real-world large-scale problem of motion segmentation
using the event-based DVS camera as input. To accomplish this, we introduce a
novel spatio-temporal loss formulation that includes both spike counts and
classification labels in conjunction with the use of new techniques for SNN
backpropagation. In addition, we show that \textit{SpikeMS} is capable of
\textit{incremental predictions}, or predictions from smaller amounts of test
data than it is trained on. This is invaluable for providing outputs even with
partial input data for low-latency applications and those requiring fast
predictions. We evaluated \textit{SpikeMS} on challenging synthetic and
real-world sequences from the EV-IMO, EED and MOD datasets, achieving results
on par with a comparable ANN method while using potentially 50 times less power.
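The abstract's spatio-temporal loss combines a spike-count term with a classification term. A minimal NumPy sketch of one way such a combined loss could look; the weighting `alpha`, the MSE form of the count term, and the per-pixel cross-entropy are illustrative assumptions, not the paper's actual formulation:

```python
import numpy as np

def spatio_temporal_loss(spikes_pred, spikes_target, logits, labels, alpha=0.5):
    """Hedged sketch: spike-count MSE plus per-pixel cross-entropy.

    spikes_pred/spikes_target: (T, H, W) binary spike trains.
    logits: (C, H, W) class scores decoded from the output spikes.
    labels: (H, W) integer class labels (e.g. 0 = background, 1 = moving).
    alpha: illustrative weighting between the two terms.
    """
    # Spike-count term: match total spikes per pixel over the time window.
    count_loss = np.mean((spikes_pred.sum(0) - spikes_target.sum(0)) ** 2)

    # Classification term: softmax cross-entropy over per-pixel labels.
    z = logits - logits.max(axis=0, keepdims=True)  # numerical stability
    log_probs = z - np.log(np.exp(z).sum(axis=0, keepdims=True))
    h, w = labels.shape
    ce_loss = -np.mean(log_probs[labels, np.arange(h)[:, None], np.arange(w)])

    return alpha * count_loss + (1 - alpha) * ce_loss
```

In practice the classification logits would themselves be decoded from output spike activity, so both terms drive the network toward the target spatio-temporal spike pattern.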
Related papers
- EvSegSNN: Neuromorphic Semantic Segmentation for Event Data [0.6138671548064356]
EvSegSNN is a biologically plausible encoder-decoder U-shaped architecture relying on Parametric Leaky Integrate and Fire neurons.
We introduce an end-to-end biologically inspired semantic segmentation approach by combining Spiking Neural Networks with event cameras.
Experiments conducted on DDD17 demonstrate that EvSegSNN outperforms the closest state-of-the-art model in terms of MIoU.
arXiv Detail & Related papers (2024-06-20T10:36:24Z)
- An Automata-Theoretic Approach to Synthesizing Binarized Neural Networks [13.271286153792058]
Quantized neural networks (QNNs) have been developed, with binarized neural networks (BNNs) restricted to binary values as a special case.
This paper presents an automata-theoretic approach to synthesizing BNNs that meet designated properties.
arXiv Detail & Related papers (2023-07-29T06:27:28Z)
- Intelligence Processing Units Accelerate Neuromorphic Learning [52.952192990802345]
Spiking neural networks (SNNs) have achieved orders of magnitude improvement in terms of energy consumption and latency.
We present an IPU-optimized release of our custom SNN Python package, snnTorch.
arXiv Detail & Related papers (2022-11-19T15:44:08Z)
- Adaptive-SpikeNet: Event-based Optical Flow Estimation using Spiking Neural Networks with Learnable Neuronal Dynamics [6.309365332210523]
Spiking Neural Networks (SNNs) with their neuro-inspired event-driven processing can efficiently handle asynchronous data.
We propose an adaptive fully-spiking framework with learnable neuronal dynamics to alleviate the spike vanishing problem.
Our experiments on datasets show an average reduction of 13% in average endpoint error (AEE) compared to state-of-the-art ANNs.
arXiv Detail & Related papers (2022-09-21T21:17:56Z)
- Object Detection with Spiking Neural Networks on Automotive Event Data [0.0]
We propose to train spiking neural networks (SNNs) directly on data coming from event cameras to design fast and efficient automotive embedded applications.
In this paper, we conducted experiments on two automotive event datasets, establishing new state-of-the-art classification results for spiking neural networks.
arXiv Detail & Related papers (2022-05-09T14:39:47Z)
- Training High-Performance Low-Latency Spiking Neural Networks by Differentiation on Spike Representation [70.75043144299168]
Spiking Neural Network (SNN) is a promising energy-efficient AI model when implemented on neuromorphic hardware.
It is a challenge to efficiently train SNNs due to their non-differentiability.
We propose the Differentiation on Spike Representation (DSR) method, which could achieve high performance.
arXiv Detail & Related papers (2022-05-01T12:44:49Z)
- Hybrid SNN-ANN: Energy-Efficient Classification and Object Detection for Event-Based Vision [64.71260357476602]
Event-based vision sensors encode local pixel-wise brightness changes in streams of events rather than image frames.
Recent progress in object recognition from event-based sensors has come from conversions of deep neural networks.
We propose a hybrid architecture for end-to-end training of deep neural networks for event-based pattern recognition and object detection.
arXiv Detail & Related papers (2021-12-06T23:45:58Z)
- Self-Supervised Learning of Event-Based Optical Flow with Spiking Neural Networks [3.7384509727711923]
A major challenge for neuromorphic computing is that learning algorithms for traditional artificial neural networks (ANNs) do not transfer directly to spiking neural networks (SNNs).
In this article, we focus on the self-supervised learning problem of optical flow estimation from event-based camera inputs.
We show that the performance of the proposed ANNs and SNNs is on par with that of the current state-of-the-art ANNs trained in a self-supervised manner.
arXiv Detail & Related papers (2021-06-03T14:03:41Z)
- Binary Graph Neural Networks [69.51765073772226]
Graph Neural Networks (GNNs) have emerged as a powerful and flexible framework for representation learning on irregular data.
In this paper, we present and evaluate different strategies for the binarization of graph neural networks.
We show that through careful design of the models, and control of the training process, binary graph neural networks can be trained at only a moderate cost in accuracy on challenging benchmarks.
arXiv Detail & Related papers (2020-12-31T18:48:58Z)
- Progressive Tandem Learning for Pattern Recognition with Deep Spiking Neural Networks [80.15411508088522]
Spiking neural networks (SNNs) have shown advantages over traditional artificial neural networks (ANNs) for low latency and high computational efficiency.
We propose a novel ANN-to-SNN conversion and layer-wise learning framework for rapid and efficient pattern recognition.
arXiv Detail & Related papers (2020-07-02T15:38:44Z)
- Rectified Linear Postsynaptic Potential Function for Backpropagation in Deep Spiking Neural Networks [55.0627904986664]
Spiking Neural Networks (SNNs) use temporal spike patterns to represent and transmit information, which is not only biologically realistic but also suitable for ultra-low-power event-driven neuromorphic implementation.
This paper investigates the contribution of spike timing dynamics to information encoding, synaptic plasticity and decision making, providing a new perspective on the design of future deep SNNs and neuromorphic hardware systems.
arXiv Detail & Related papers (2020-03-26T11:13:07Z)
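A recurring theme across these papers is training spiking networks despite the non-differentiable spike function, typically by pairing a leaky integrate-and-fire (LIF) neuron with a surrogate derivative used only in the backward pass. A minimal NumPy sketch; the decay factor, threshold, soft reset, and fast-sigmoid surrogate shape are illustrative assumptions, not taken from any specific paper above:

```python
import numpy as np

def lif_step(v, x, beta=0.9, v_th=1.0):
    """One leaky integrate-and-fire step: decay, integrate, spike, reset."""
    v = beta * v + x                    # leaky integration of input current
    spike = (v >= v_th).astype(float)   # hard threshold (non-differentiable)
    v = v - spike * v_th                # soft reset by subtraction
    return v, spike

def surrogate_grad(v, v_th=1.0, k=10.0):
    """Fast-sigmoid surrogate for d(spike)/d(v), used only when backpropagating."""
    return 1.0 / (1.0 + k * np.abs(v - v_th)) ** 2

# Drive one neuron with a short input sequence.
v, spikes = np.array(0.0), []
for x in [0.4, 0.4, 0.4, 0.0, 1.2]:
    v, s = lif_step(v, np.array(x))
    spikes.append(float(s))
# spikes == [0.0, 0.0, 1.0, 0.0, 1.0]
```

The surrogate peaks at the threshold and decays away from it, so gradient flows mainly through neurons near firing, which is the usual remedy for the spike-vanishing problem mentioned in the Adaptive-SpikeNet entry.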
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.