Event-driven Vision and Control for UAVs on a Neuromorphic Chip
- URL: http://arxiv.org/abs/2108.03694v1
- Date: Sun, 8 Aug 2021 17:46:52 GMT
- Title: Event-driven Vision and Control for UAVs on a Neuromorphic Chip
- Authors: Antonio Vitale, Alpha Renner, Celine Nauer, Davide Scaramuzza, and
Yulia Sandamirskaya
- Abstract summary: Event-based cameras produce a sparse stream of events that can be processed more efficiently and with a lower latency than images.
We show how an event-based vision algorithm can be implemented as a spiking neuronal network on a neuromorphic chip and used in a drone controller.
Our spiking neuronal network on chip is the first example of a neuromorphic vision-based controller solving a high-speed UAV control task.
- Score: 41.733091458634874
- License: http://creativecommons.org/publicdomain/zero/1.0/
- Abstract: Event-based vision sensors achieve an up to three orders of magnitude better
speed vs. power consumption trade-off in high-speed control of UAVs compared to
conventional image sensors. Event-based cameras produce a sparse stream of
events that can be processed more efficiently and with a lower latency than
images, enabling ultra-fast vision-driven control. Here, we explore how an
event-based vision algorithm can be implemented as a spiking neuronal network
on a neuromorphic chip and used in a drone controller. We show how seamless
integration of event-based perception on chip leads to even faster control
rates and lower latency. In addition, we demonstrate how online adaptation of
the SNN controller can be realised using on-chip learning. Our spiking neuronal
network on chip is the first example of a neuromorphic vision-based controller
solving a high-speed UAV control task. The excellent scalability of processing
in neuromorphic hardware opens the possibility to solve more challenging visual
tasks in the future and integrate visual perception in fast control loops.
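To make the pipeline concrete, here is a minimal sketch (not the authors' implementation) of an event-driven spiking controller: event counts from the two halves of the sensor drive small leaky integrate-and-fire populations, and the spike-rate difference becomes a steering command. The population sizes, time constants, and the left/right steering rule are illustrative assumptions.

```python
# Minimal sketch of an event-driven spiking controller; all parameters
# and the left/right steering rule are illustrative assumptions.
import numpy as np

class LIFPopulation:
    """Leaky integrate-and-fire neurons driven by event counts."""
    def __init__(self, n, tau=0.02, v_th=1.0, dt=0.001):
        self.v = np.zeros(n)
        self.decay = np.exp(-dt / tau)
        self.v_th = v_th

    def step(self, input_current):
        self.v = self.v * self.decay + input_current
        spikes = self.v >= self.v_th
        self.v[spikes] = 0.0                   # reset after spiking
        return spikes.astype(float)

def control_step(events_xy, width, left_pop, right_pop, gain=0.5):
    """Map events falling left/right of the image centre to a yaw command."""
    x = events_xy[:, 0]
    left = np.sum(x < width // 2) * 0.01       # scale event counts to currents
    right = np.sum(x >= width // 2) * 0.01
    s_l = left_pop.step(np.full(left_pop.v.shape, left)).mean()
    s_r = right_pop.step(np.full(right_pop.v.shape, right)).mean()
    return gain * (s_r - s_l)                  # spike-rate difference -> command

left_pop, right_pop = LIFPopulation(32), LIFPopulation(32)
events = np.random.randint(0, 240, size=(500, 2))  # fake (x, y) events
print("yaw command:", control_step(events, 240, left_pop, right_pop))
```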
Related papers
- Integration of Communication and Computational Imaging [49.2442836992307]
We propose a novel framework that integrates communication and computational imaging (ICCI) for remote perception.
The ICCI framework performs a full-link information transfer optimization, aiming to minimize information loss from the generation of the information source to the execution of the final vision tasks.
27-band hyperspectral video perception over an 80 km link at 30 fps is experimentally achieved.
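One hedged reading of "full-link" optimization is that the encoder, the noisy channel, and the downstream vision task are trained jointly, so the task's gradients shape how bits are spent. The toy model below illustrates that idea only; the tensor shapes, the Gaussian channel, and the loss weights are our assumptions, not details from the paper.

```python
# Hedged sketch of end-to-end (encoder -> channel -> task) optimisation.
import torch
import torch.nn as nn

encoder = nn.Sequential(nn.Conv2d(27, 16, 3, padding=1), nn.ReLU(),
                        nn.Conv2d(16, 8, 3, padding=1))    # 27 spectral bands -> code
task_head = nn.Sequential(nn.Conv2d(8, 32, 3, padding=1), nn.ReLU(),
                          nn.Conv2d(32, 1, 1))             # e.g. a detection map

def channel(z, snr_db=10.0):
    """Additive white Gaussian noise channel (a stand-in for the real link)."""
    noise_power = z.pow(2).mean() / (10 ** (snr_db / 10))
    return z + noise_power.sqrt() * torch.randn_like(z)

x = torch.randn(2, 27, 64, 64)           # fake hyperspectral frames
target = torch.rand(2, 1, 64, 64)
z = encoder(x)
pred = task_head(channel(z))
loss = nn.functional.mse_loss(pred, target) + 1e-3 * z.abs().mean()  # task + rate proxy
loss.backward()                           # gradients flow through the whole link
```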
arXiv Detail & Related papers (2024-10-25T09:19:59Z)
- EvGNN: An Event-driven Graph Neural Network Accelerator for Edge Vision [0.06752396542927405]
Event-driven graph neural networks (GNNs) have emerged as a promising solution for sparse event-based vision.
We propose EvGNN, the first event-driven GNN accelerator for low-footprint, ultra-low-latency, and high-accuracy edge vision.
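The data structure behind event-driven GNNs is a graph in which each event is a node connected to recent, spatially nearby events. A minimal sketch of that construction follows; the spatial and temporal radii and the (x, y, t, polarity) layout are our assumptions, not EvGNN's exact design.

```python
# Hedged sketch of building a spatiotemporal event graph.
import numpy as np

def event_graph(events, r_xy=3, r_t=0.005):
    """events: (N, 4) array of (x, y, t, polarity), t sorted ascending.
    Returns directed edges (j -> i) from past neighbours."""
    edges = []
    for i, (x, y, t, _) in enumerate(events):
        j = i - 1
        while j >= 0 and t - events[j, 2] <= r_t:   # short time window only
            dx, dy = x - events[j, 0], y - events[j, 1]
            if dx * dx + dy * dy <= r_xy * r_xy:    # spatial neighbourhood
                edges.append((j, i))
            j -= 1
    return edges

events = np.array([[10, 10, 0.000, 1],
                   [11, 10, 0.001, 1],
                   [40, 40, 0.002, 0],
                   [12, 11, 0.003, 1]], dtype=float)
print(event_graph(events))   # [(0, 1), (1, 3), (0, 3)] within radius 3 and 5 ms
```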
arXiv Detail & Related papers (2024-04-30T12:18:47Z)
- EV-Catcher: High-Speed Object Catching Using Low-latency Event-based Neural Networks [107.62975594230687]
We demonstrate an application where event cameras excel: accurately estimating the impact location of fast-moving objects.
We introduce a lightweight event representation called Binary Event History Image (BEHI) to encode event data at low latency.
We show that the system achieves an 81% success rate in catching balls aimed at different locations and moving at up to 13 m/s, even on compute-constrained embedded platforms.
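As described in the abstract, a BEHI stores one bit per pixel recording whether any event occurred there during the history window. A minimal sketch under that reading (sensor resolution is an assumption):

```python
# Minimal sketch of a Binary Event History Image (BEHI).
import numpy as np

def behi(events_xy, height=180, width=240):
    """events_xy: (N, 2) integer (x, y) coordinates within the window."""
    img = np.zeros((height, width), dtype=np.uint8)
    img[events_xy[:, 1], events_xy[:, 0]] = 1   # set the bit, ignore counts
    return img

events = np.array([[5, 7], [5, 7], [100, 90]])  # duplicates collapse to one bit
print(behi(events).sum())                        # -> 2
```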
arXiv Detail & Related papers (2023-04-14T15:23:28Z)
- Neuromorphic Optical Flow and Real-time Implementation with Event Cameras [47.11134388304464]
We build on the latest developments in event-based vision and spiking neural networks.
We propose a new network architecture that improves the state-of-the-art self-supervised optical flow accuracy.
We demonstrate high-speed optical flow prediction with almost two orders of magnitude lower complexity.
arXiv Detail & Related papers (2023-04-14T14:03:35Z)
- EDeNN: Event Decay Neural Networks for low latency vision [26.784944204163363]
We develop a new type of neural network which operates closer to the original event data stream.
We demonstrate state-of-the-art performance in angular velocity regression and competitive optical flow estimation.
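The title suggests per-pixel activations that are bumped by incoming events and decay exponentially between them, keeping the network state close to the raw asynchronous stream. The sketch below is our inference from the title and abstract, not the paper's architecture; the decay constant is an illustrative assumption.

```python
# Hedged sketch of an exponentially decaying event surface.
import numpy as np

class DecaySurface:
    def __init__(self, shape, tau=0.03):
        self.a = np.zeros(shape)
        self.t_last = 0.0
        self.tau = tau

    def update(self, x, y, t):
        self.a *= np.exp(-(t - self.t_last) / self.tau)  # decay everything
        self.a[y, x] += 1.0                              # bump the event pixel
        self.t_last = t
        return self.a

surface = DecaySurface((180, 240))
for x, y, t in [(10, 10, 0.001), (10, 10, 0.002), (50, 20, 0.010)]:
    state = surface.update(x, y, t)
print(state[10, 10], state[20, 50])   # older activity has decayed more
```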
arXiv Detail & Related papers (2022-09-09T15:51:39Z)
- Visual Attention Network [90.0753726786985]
We propose a novel large kernel attention (LKA) module to enable self-adaptive and long-range correlations in self-attention.
We also introduce a novel neural network based on LKA, namely the Visual Attention Network (VAN).
VAN outperforms state-of-the-art vision transformers and convolutional neural networks by a large margin in extensive experiments.
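A sketch of the LKA module: a large receptive field is decomposed into a depthwise convolution, a depthwise dilated convolution, and a 1x1 convolution, and the result multiplicatively gates the input. Kernel sizes follow the commonly cited VAN configuration; treat the details as a reading of the paper rather than its reference code.

```python
# Sketch of large kernel attention (LKA) gating.
import torch
import torch.nn as nn

class LKA(nn.Module):
    def __init__(self, dim):
        super().__init__()
        self.dw = nn.Conv2d(dim, dim, 5, padding=2, groups=dim)          # local
        self.dw_dilated = nn.Conv2d(dim, dim, 7, padding=9, groups=dim,
                                    dilation=3)                          # long range
        self.pw = nn.Conv2d(dim, dim, 1)                                 # channel mix

    def forward(self, x):
        attn = self.pw(self.dw_dilated(self.dw(x)))
        return x * attn                  # self-adaptive gating, no softmax

x = torch.randn(1, 64, 56, 56)
print(LKA(64)(x).shape)                  # torch.Size([1, 64, 56, 56])
```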
arXiv Detail & Related papers (2022-02-20T06:35:18Z)
- 1000x Faster Camera and Machine Vision with Ordinary Devices [76.46540270145698]
We present vidar, a bit sequence array where each bit represents whether the accumulation of photons has reached a threshold.
We have developed a vidar camera that is 1,000x faster than conventional cameras.
We have also developed a spiking neural network-based machine vision system that combines the speed of machines with the mechanisms of biological vision.
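The sampling principle described in the abstract (each bit records whether accumulated photons crossed a threshold) can be simulated directly; the threshold and the Poisson photon model below are illustrative assumptions.

```python
# Sketch of vidar-style threshold sampling of photon accumulation.
import numpy as np

def vidar_bits(intensity, n_steps, threshold=1.0, rng=None):
    """intensity: (H, W) photon rate per step. Returns (n_steps, H, W) bits."""
    rng = rng or np.random.default_rng(0)
    acc = np.zeros_like(intensity, dtype=float)
    bits = np.zeros((n_steps, *intensity.shape), dtype=np.uint8)
    for k in range(n_steps):
        acc += rng.poisson(intensity)     # photon accumulation
        fired = acc >= threshold
        bits[k] = fired
        acc[fired] -= threshold           # reset by subtraction, keep residue
    return bits

scene = np.array([[0.1, 0.9]])            # dark vs. bright pixel
print(vidar_bits(scene, 100).sum(axis=0)) # bright pixel fires far more often
```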
arXiv Detail & Related papers (2022-01-23T16:10:11Z)
- Adversarial Attacks on Spiking Convolutional Networks for Event-based Vision [0.6999740786886537]
We show how white-box adversarial attack algorithms can be adapted to the discrete and sparse nature of event-based visual data.
We also verify, for the first time, the effectiveness of these perturbations directly on neuromorphic hardware.
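One way such an adaptation can work is to compute gradients on a dense event-count tensor and then realise the perturbation by flipping whole events at the most loss-sensitive locations, rather than adding continuous noise. The toy model and event budget below are our assumptions, not the paper's attack.

```python
# Hedged sketch of a gradient-guided discrete event attack.
import torch
import torch.nn as nn

model = nn.Sequential(nn.Flatten(), nn.Linear(2 * 32 * 32, 10))  # toy classifier
events = torch.zeros(1, 2, 32, 32)       # (batch, polarity, H, W) event counts
events[0, 1, 10:20, 10:20] = 1.0
label = torch.tensor([3])

x = events.clone().requires_grad_(True)
loss = nn.functional.cross_entropy(model(x), label)
loss.backward()

budget = 5                                # flip at most 5 events
idx = x.grad.abs().flatten().topk(budget).indices  # most sensitive event slots
adv = events.flatten().clone()
adv[idx] = 1.0 - adv[idx]                 # inject or delete a discrete event
adv = adv.view_as(events)
print("changed events:", int((adv != events).sum()))
```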
arXiv Detail & Related papers (2021-10-06T17:20:05Z)
- Evolved neuromorphic radar-based altitude controller for an autonomous open-source blimp [4.350434044677268]
In this paper, we propose an evolved altitude controller based on an SNN for a robotic airship.
We also present an SNN-based controller architecture, an evolutionary framework for training the network in a simulated environment, and a control strategy for reducing the reality gap.
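The evolutionary recipe can be sketched as: mutate candidate weight vectors, roll each out in a simulator, and select by fitness. In the sketch below a two-gain controller and a trivial altitude plant stand in for the paper's SNN and blimp simulation; all hyperparameters are assumptions.

```python
# Hedged sketch of evolutionary training for an altitude controller.
import numpy as np

rng = np.random.default_rng(0)

def rollout(w, target=5.0, steps=200, dt=0.05):
    """Tiny altitude plant; w are gains on (height error, velocity)."""
    h, v, cost = 0.0, 0.0, 0.0
    for _ in range(steps):
        thrust = np.clip(w[0] * (target - h) + w[1] * (-v), -1, 1)
        v += (thrust - 0.1 * v) * dt      # damped vertical dynamics
        h += v * dt
        cost += (target - h) ** 2
    return -cost                          # higher fitness = better tracking

pop = rng.normal(0, 1, size=(32, 2))      # population of weight vectors
for gen in range(50):
    fitness = np.array([rollout(w) for w in pop])
    elite = pop[np.argsort(fitness)[-8:]]                  # keep the best 8
    children = elite[rng.integers(0, 8, 24)] + rng.normal(0, 0.1, (24, 2))
    pop = np.vstack([elite, children])                     # next generation
print("best fitness:", max(rollout(w) for w in pop))
```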
arXiv Detail & Related papers (2021-10-01T20:48:43Z)
- Evolved Neuromorphic Control for High Speed Divergence-based Landings of MAVs [0.0]
We develop spiking neural networks for controlling landings of micro air vehicles.
We demonstrate that the resulting neuromorphic controllers transfer robustly from a simulation to the real world.
To the best of our knowledge, this work is the first to integrate spiking neural networks in the control loop of a real-world flying robot.
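Divergence-based landing rests on holding the visual divergence D ≈ 2v/h (observable from ventral optic flow) at a constant set-point, which yields an exponentially decelerating descent. The sketch below closes that loop with a plain proportional controller; the paper closes it with a spiking network, and the set-point and gain here are assumptions.

```python
# Hedged sketch of constant-divergence landing.
import numpy as np

def land(D_set=-0.5, h=10.0, v=0.0, dt=0.01, k=2.0):
    traj = []
    while h > 0.05:
        D = 2.0 * v / h                   # divergence estimate (ventral flow)
        thrust = k * (D_set - D)          # simple proportional loop on D
        v += thrust * dt                  # unit mass, gravity folded into thrust
        h += v * dt
        traj.append((h, v))
    return traj

traj = land()
print("touchdown speed: %.3f m/s" % traj[-1][1])   # gentle, near-zero contact
```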
arXiv Detail & Related papers (2020-03-06T10:19:02Z)