Feature-based Event Stereo Visual Odometry
- URL: http://arxiv.org/abs/2107.04921v1
- Date: Sat, 10 Jul 2021 22:36:49 GMT
- Title: Feature-based Event Stereo Visual Odometry
- Authors: Antea Hadviger, Igor Cvišić, Ivan Marković, Sacha Vražić, Ivan Petrović
- Abstract summary: We propose a novel stereo visual odometry method for event cameras based on feature detection and matching with careful feature management.
We evaluate the performance of the proposed method on two publicly available datasets: MVSEC sequences captured by an indoor flying drone and DSEC outdoor driving sequences.
- Score: 2.7298989068857487
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Event-based cameras are biologically inspired sensors that output events,
i.e., asynchronous pixel-wise brightness changes in the scene. Their high
dynamic range and microsecond temporal resolution make them more reliable
than standard cameras under challenging illumination and in high-speed
scenarios; odometry algorithms based solely on event cameras therefore offer
exciting new possibilities for autonomous systems and robots. In this paper,
we propose a novel stereo visual odometry method for event cameras based on
feature detection and matching with careful feature management, while pose
estimation is done by reprojection error minimization. We evaluate the
performance of the proposed method on two publicly available datasets: MVSEC
sequences captured by an indoor flying drone and DSEC outdoor driving
sequences. MVSEC offers accurate ground truth from motion capture, while DSEC
does not, so to obtain a reference trajectory we ran our SOFT visual odometry,
one of the highest-ranking algorithms on the KITTI scoreboards, on the
standard camera frames. We compared our method to ESVO, the first and still
the only other stereo event odometry method, showing on-par performance on the
MVSEC sequences, while on the DSEC dataset ESVO, unlike our method, was unable
to handle the outdoor driving scenario with default parameters. Furthermore,
two important advantages of our method over ESVO are that it adapts its
tracking frequency to the asynchronous event rate and does not require
initialization.
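The abstract states that pose estimation is done by reprojection error minimization. A minimal sketch of that step, assuming a simple pinhole camera model and an axis-angle pose parameterization (the function names, optimizer choice, and model here are illustrative assumptions, not the paper's implementation):

```python
import numpy as np
from scipy.optimize import least_squares
from scipy.spatial.transform import Rotation


def project(points_3d, rvec, tvec, fx, fy, cx, cy):
    """Project 3D landmarks into the image with a pinhole model."""
    R = Rotation.from_rotvec(rvec).as_matrix()
    p_cam = points_3d @ R.T + tvec          # world -> camera frame
    u = fx * p_cam[:, 0] / p_cam[:, 2] + cx
    v = fy * p_cam[:, 1] / p_cam[:, 2] + cy
    return np.stack([u, v], axis=1)


def estimate_pose(points_3d, observations, intrinsics, pose0=np.zeros(6)):
    """Find the 6-DOF pose minimizing the squared reprojection error
    between projected landmarks and matched feature observations."""
    fx, fy, cx, cy = intrinsics

    def residuals(pose):
        uv = project(points_3d, pose[:3], pose[3:], fx, fy, cx, cy)
        return (uv - observations).ravel()

    # Levenberg-Marquardt on the stacked pixel residuals.
    sol = least_squares(residuals, pose0, method="lm")
    return sol.x  # [rotation vector | translation]
```

In a stereo event pipeline, `points_3d` would come from triangulated stereo feature matches and `observations` from the tracked feature positions in the current view.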
Related papers
- ESVO2: Direct Visual-Inertial Odometry with Stereo Event Cameras [33.81592783496106]
Event-based visual odometry aims at solving tracking and mapping sub-problems in parallel.
We build an event-based stereo visual-inertial odometry system on top of our previous direct pipeline Event-based Stereo Visual Odometry.
arXiv Detail & Related papers (2024-10-12T05:35:27Z)
- ES-PTAM: Event-based Stereo Parallel Tracking and Mapping [11.801511288805225]
Event cameras offer advantages to overcome the limitations of standard cameras.
We propose a novel event-based stereo VO system by combining two ideas.
We evaluate the system on five real-world datasets.
arXiv Detail & Related papers (2024-08-28T07:56:28Z)
- Event-based Stereo Visual Odometry with Native Temporal Resolution via Continuous-time Gaussian Process Regression [3.4447129363520332]
Event-based cameras capture individual visual changes in a scene at unique times.
This asynchronous timing is often addressed in visual odometry pipelines by approximating temporally close measurements as occurring at one common time.
This paper presents a complete stereo VO pipeline that estimates directly with individual event-measurement times without requiring any grouping or approximation in the estimation state.
arXiv Detail & Related papers (2023-06-01T22:57:32Z)
- PL-EVIO: Robust Monocular Event-based Visual Inertial Odometry with Point and Line Features [3.6355269783970394]
Event cameras are motion-activated sensors that capture pixel-level illumination changes instead of the intensity image with a fixed frame rate.
We propose a robust, highly accurate, and real-time optimization-based monocular event-based visual-inertial odometry (VIO) method.
arXiv Detail & Related papers (2022-09-25T06:14:12Z)
- Event-aided Direct Sparse Odometry [54.602311491827805]
We introduce EDS, a direct monocular visual odometry using events and frames.
Our algorithm leverages the event generation model to track the camera motion in the blind time between frames.
EDS is the first method to perform 6-DOF VO using events and frames with a direct approach.
arXiv Detail & Related papers (2022-04-15T20:40:29Z)
- Asynchronous Optimisation for Event-based Visual Odometry [53.59879499700895]
Event cameras open up new possibilities for robotic perception due to their low latency and high dynamic range.
We focus on event-based visual odometry (VO)
We propose an asynchronous structure-from-motion optimisation back-end.
arXiv Detail & Related papers (2022-03-02T11:28:47Z)
- Bridging the Gap between Events and Frames through Unsupervised Domain Adaptation [57.22705137545853]
We propose a task transfer method that allows models to be trained directly with labeled images and unlabeled event data.
We leverage the generative event model to split event features into content and motion features.
Our approach unlocks the vast amount of existing image datasets for the training of event-based neural networks.
arXiv Detail & Related papers (2021-09-06T17:31:37Z)
- Learning Monocular Dense Depth from Events [53.078665310545745]
Event cameras report brightness changes as a stream of asynchronous events instead of intensity frames.
Recent learning-based approaches have been applied to event-based data, such as monocular depth prediction.
We propose a recurrent architecture to solve this task and show significant improvement over standard feed-forward methods.
arXiv Detail & Related papers (2020-10-16T12:36:23Z)
- Unsupervised Feature Learning for Event Data: Direct vs Inverse Problem Formulation [53.850686395708905]
Event-based cameras record an asynchronous stream of per-pixel brightness changes.
In this paper, we focus on single-layer architectures for representation learning from event data.
We show improvements of up to 9 % in the recognition accuracy compared to the state-of-the-art methods.
arXiv Detail & Related papers (2020-09-23T10:40:03Z)
- Event-based Stereo Visual Odometry [42.77238738150496]
We present a solution to the problem of visual odometry from the data acquired by a stereo event-based camera rig.
We seek to maximize the temporal consistency of stereo event-based data while using a simple and efficient representation.
arXiv Detail & Related papers (2020-07-30T15:53:28Z)
- Asynchronous Tracking-by-Detection on Adaptive Time Surfaces for Event-based Object Tracking [87.0297771292994]
We propose an Event-based Tracking-by-Detection (ETD) method for generic bounding box-based object tracking.
To achieve this goal, we present an Adaptive Time-Surface with Linear Time Decay (ATSLTD) event-to-frame conversion algorithm.
We compare the proposed ETD method with seven popular object tracking methods, that are based on conventional cameras or event cameras, and two variants of ETD.
arXiv Detail & Related papers (2020-02-13T15:58:31Z)
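The last entry above relies on a time-surface event-to-frame conversion with linear time decay. A hedged sketch of the general idea, under the assumption that each pixel keeps the weight of its most recent event within a fixed time window (the exact ATSLTD formulation differs; names and formula here are illustrative only):

```python
import numpy as np


def linear_time_surface(events, t_now, window, height, width):
    """Render events (t, x, y, polarity) into a frame: a pixel's value is
    1.0 for an event at t_now and decays linearly to 0.0 at age == window."""
    surface = np.zeros((height, width), dtype=np.float32)
    for t, x, y, _pol in events:
        age = t_now - t
        if 0.0 <= age <= window:
            # Keep the strongest (i.e., most recent) contribution per pixel.
            surface[y, x] = max(surface[y, x], 1.0 - age / window)
    return surface
```

Frames produced this way make recent motion edges bright and stale pixels dark, which is what lets frame-based detectors and trackers operate on asynchronous event data.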
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed content (including all information) and is not responsible for any consequences.