ASAP: Adaptive Scheme for Asynchronous Processing of Event-based Vision
Algorithms
- URL: http://arxiv.org/abs/2209.08597v1
- Date: Sun, 18 Sep 2022 16:28:29 GMT
- Title: ASAP: Adaptive Scheme for Asynchronous Processing of Event-based Vision
Algorithms
- Authors: Raul Tapia, Augusto Gómez Eguíluz, José Ramiro Martínez-de Dios,
Anibal Ollero
- Abstract summary: Event cameras can capture pixel-level illumination changes with very high temporal resolution and dynamic range.
Two main approaches exist to feed event-based processing algorithms: packaging the triggered events in event packages, or sending them one-by-one as single events.
This paper presents ASAP, an adaptive scheme to manage the event stream through variable-size packages that accommodate the event package processing times.
- License: http://creativecommons.org/licenses/by-nc-nd/4.0/
- Abstract: Event cameras can capture pixel-level illumination changes with very high
temporal resolution and dynamic range. They have received increasing research
interest due to their robustness to lighting conditions and motion blur. Two
main approaches exist in the literature to feed the event-based processing
algorithms: packaging the triggered events in event packages and sending them
one-by-one as single events. These approaches suffer from either processing
overflow or a lack of responsivity. Processing overflow is caused by
high event generation rates when the algorithm cannot process all the events in
real-time. Conversely, lack of responsivity happens in cases of low event
generation rates when the event packages are sent at too low frequencies. This
paper presents ASAP, an adaptive scheme to manage the event stream through
variable-size packages that adapt to the event package processing times.
The experimental results show that ASAP is capable of feeding an asynchronous
event-by-event clustering algorithm in a responsive and efficient manner and at
the same time prevents overflow.
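The adaptive idea in the abstract can be illustrated in a few lines: buffer incoming events, emit them in packages, and resize the package based on how long the consumer took to process the last one. This is a minimal sketch with an illustrative doubling/halving rule; the class, method names, and adaptation heuristic are assumptions for exposition, not the authors' exact ASAP scheme.

```python
from collections import deque


class AdaptivePackager:
    """Illustrative sketch of adaptive variable-size event packaging.

    Grows the package size when the consumer falls behind (to avoid
    processing overflow) and shrinks it when there is slack (to stay
    responsive). The doubling/halving rule is a simple placeholder
    heuristic, not the adaptation law from the ASAP paper.
    """

    def __init__(self, min_size=1, max_size=10_000, initial_size=100):
        self.min_size = min_size
        self.max_size = max_size
        self.package_size = initial_size
        self.buffer = deque()

    def push(self, event):
        """Receive one event from the camera driver."""
        self.buffer.append(event)

    def pop_package(self):
        """Return the next package, or None if too few events are buffered."""
        if len(self.buffer) < self.package_size:
            return None
        return [self.buffer.popleft() for _ in range(self.package_size)]

    def feedback(self, processing_time, package_period):
        """Adapt the package size from the consumer's measured cost.

        If processing one package took longer than the time span the
        package covers (overflow risk), send fewer, larger packages;
        otherwise shrink packages to reduce latency.
        """
        if processing_time > package_period:
            self.package_size = min(self.max_size, self.package_size * 2)
        else:
            self.package_size = max(self.min_size, self.package_size // 2)
```

A consumer loop would call `pop_package()` whenever it is idle and report its measured processing time back through `feedback()`, closing the loop between event rate and algorithm throughput.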
Related papers
- Event-ECC: Asynchronous Tracking of Events with Continuous Optimization [1.9446776999250501]
We propose a tracking algorithm that computes a 2D motion warp per single event, called event-ECC (eECC).
The computational burden of event-wise processing is alleviated through a lightweight version that benefits from an incremental processing and updating scheme.
We report improvements in tracking accuracy and feature age over state-of-the-art event-based asynchronous trackers.
arXiv Detail & Related papers (2024-09-22T19:03:19Z) - Scalable Event-by-event Processing of Neuromorphic Sensory Signals With Deep State-Space Models [2.551844666707809]
Event-based sensors are well suited for real-time processing.
Current methods either collapse events into frames or cannot scale up when processing the event data directly event-by-event.
arXiv Detail & Related papers (2024-04-29T08:50:27Z) - Representation Learning on Event Stream via an Elastic Net-incorporated
Tensor Network [1.9515859963221267]
We present a novel representation method which can capture global correlations of all events in the event stream simultaneously.
Our method can achieve effective results in applications like filtering noise compared with the state-of-the-art methods.
arXiv Detail & Related papers (2024-01-16T02:51:47Z) - Implicit Event-RGBD Neural SLAM [54.74363487009845]
Implicit neural SLAM has achieved remarkable progress recently.
Existing methods face significant challenges in non-ideal scenarios.
We propose EN-SLAM, the first event-RGBD implicit neural SLAM framework.
arXiv Detail & Related papers (2023-11-18T08:48:58Z) - Graph-based Asynchronous Event Processing for Rapid Object Recognition [59.112755601918074]
Event cameras capture an asynchronous event stream in which each event encodes pixel location, trigger time, and the polarity of the brightness change.
We introduce a novel graph-based framework for event cameras, namely SlideGCN.
Our approach can efficiently process data event-by-event, unlocking the low-latency nature of event data while still maintaining the graph's structure internally.
arXiv Detail & Related papers (2023-08-28T08:59:57Z) - Learning to Super-Resolve Blurry Images with Events [62.61911224564196]
Super-Resolution from a single motion Blurred image (SRB) is a severely ill-posed problem due to the joint degradation of motion blurs and low spatial resolution.
We employ events to alleviate the burden of SRB and propose an Event-enhanced SRB (E-SRB) algorithm.
We show that the proposed eSL-Net++ outperforms state-of-the-art methods by a large margin.
arXiv Detail & Related papers (2023-02-27T13:46:42Z) - Temporal Up-Sampling for Asynchronous Events [0.0]
In low-brightness or slow-moving scenes, events are often sparse and accompanied by noise.
We propose an event temporal up-sampling algorithm to generate more effective and reliable events.
Experimental results show that up-sampling events can provide more effective information and improve the performance of downstream tasks.
arXiv Detail & Related papers (2022-08-18T09:12:08Z) - Unifying Event Detection and Captioning as Sequence Generation via
Pre-Training [53.613265415703815]
We propose a unified pre-training and fine-tuning framework to enhance the inter-task association between event detection and captioning.
Our model outperforms the state-of-the-art methods, and can be further boosted when pre-trained on extra large-scale video-text data.
arXiv Detail & Related papers (2022-07-18T14:18:13Z) - Event Transformer [43.193463048148374]
The event camera's low power consumption and ability to capture microsecond brightness changes make it attractive for various computer vision tasks.
Existing event representation methods typically convert events into frames, voxel grids, or spikes for deep neural networks (DNNs).
This work introduces a novel token-based event representation, where each event is considered a fundamental processing unit termed an event-token.
arXiv Detail & Related papers (2022-04-11T15:05:06Z) - AEGNN: Asynchronous Event-based Graph Neural Networks [54.528926463775946]
Event-based Graph Neural Networks generalize standard GNNs to process events as "evolving" spatio-temporal graphs.
AEGNNs are easily trained on synchronous inputs and can be converted to efficient, "asynchronous" networks at test time.
arXiv Detail & Related papers (2022-03-31T16:21:12Z) - Asynchronous Optimisation for Event-based Visual Odometry [53.59879499700895]
Event cameras open up new possibilities for robotic perception due to their low latency and high dynamic range.
We focus on event-based visual odometry (VO).
We propose an asynchronous structure-from-motion optimisation back-end.
arXiv Detail & Related papers (2022-03-02T11:28:47Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of this information and is not responsible for any consequences of its use.