Event-based Stereo Visual Odometry
- URL: http://arxiv.org/abs/2007.15548v2
- Date: Mon, 22 Feb 2021 14:52:21 GMT
- Title: Event-based Stereo Visual Odometry
- Authors: Yi Zhou, Guillermo Gallego, Shaojie Shen
- Abstract summary: We present a solution to the problem of visual odometry from the data acquired by a stereo event-based camera rig.
We seek to maximize the spatio-temporal consistency of stereo event-based data while using a simple and efficient representation.
- Score: 42.77238738150496
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Event-based cameras are bio-inspired vision sensors whose pixels work
independently from each other and respond asynchronously to brightness changes,
with microsecond resolution. Their advantages make it possible to tackle
challenging scenarios in robotics, such as high-speed and high dynamic range
scenes. We present a solution to the problem of visual odometry from the data
acquired by a stereo event-based camera rig. Our system follows a parallel
tracking-and-mapping approach, where novel solutions to each subproblem (3D
reconstruction and camera pose estimation) are developed with two objectives in
mind: being principled and efficient, for real-time operation with commodity
hardware. To this end, we seek to maximize the spatio-temporal consistency of
stereo event-based data while using a simple and efficient representation.
Specifically, the mapping module builds a semi-dense 3D map of the scene by
fusing depth estimates from multiple local viewpoints (obtained by
spatio-temporal consistency) in a probabilistic fashion. The tracking module
recovers the pose of the stereo rig by solving a registration problem that
naturally arises due to the chosen map and event data representation.
Experiments on publicly available datasets and on our own recordings
demonstrate the versatility of the proposed method in natural scenes with
general 6-DoF motion. The system successfully leverages the advantages of
event-based cameras to perform visual odometry in challenging illumination
conditions, such as low-light and high dynamic range, while running in
real-time on a standard CPU. We release the software and dataset under an open
source licence to foster research in the emerging topic of event-based SLAM.
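The abstract describes two cooperating modules: a mapping module that fuses per-viewpoint depth estimates in a probabilistic fashion, and a tracking module that recovers the pose by registering the map against the chosen event data representation. The Python sketch below is only a rough illustration of those two ideas under simplifying assumptions (independent Gaussian inverse-depth estimates, a distance-field stand-in for the event representation, and made-up function names); it is not the authors' released implementation.

```python
# A minimal sketch (not the paper's code) of two ideas from the abstract:
# (1) probabilistic fusion of per-viewpoint inverse-depth estimates, modeled
#     here as Gaussian fusion, and
# (2) tracking posed as registration of projected map points against an
#     edge-like event representation (a distance field is assumed here).
import numpy as np


def fuse_inverse_depth(estimates):
    """Fuse independent Gaussian inverse-depth estimates given as (mean, variance).

    Under the Gaussian assumption, the fused variance is the harmonic
    combination and the fused mean is the variance-weighted average.
    """
    means = np.array([m for m, _ in estimates], dtype=float)
    variances = np.array([v for _, v in estimates], dtype=float)
    fused_var = 1.0 / np.sum(1.0 / variances)
    fused_mean = fused_var * np.sum(means / variances)
    return fused_mean, fused_var


def registration_cost(pose, points_3d, distance_field, K):
    """Sum of squared distances between projected map points and event edges.

    `distance_field` stands in for an event-based edge representation (e.g. a
    distance transform of recent events); minimizing this cost over the pose
    is one way to phrase tracking as a registration problem.
    """
    R, t = pose                         # rotation (3x3) and translation (3,)
    p_cam = (R @ points_3d.T).T + t     # map points in the camera frame
    uv = (K @ p_cam.T).T
    uv = uv[:, :2] / uv[:, 2:3]         # perspective projection to pixels
    u = np.clip(uv[:, 0].astype(int), 0, distance_field.shape[1] - 1)
    v = np.clip(uv[:, 1].astype(int), 0, distance_field.shape[0] - 1)
    return float(np.sum(distance_field[v, u] ** 2))


if __name__ == "__main__":
    # Three noisy inverse-depth observations of the same map point.
    print(fuse_inverse_depth([(0.50, 0.04), (0.55, 0.09), (0.48, 0.02)]))
    # Evaluate the registration cost for an identity pose on random data.
    K = np.array([[200.0, 0.0, 120.0], [0.0, 200.0, 90.0], [0.0, 0.0, 1.0]])
    pts = np.random.default_rng(0).uniform([-1, -1, 2], [1, 1, 4], (100, 3))
    field = np.random.default_rng(1).random((180, 240))
    print(registration_cost((np.eye(3), np.zeros(3)), pts, field, K))
```

In the released system the map is semi-dense and the registration is solved over the full 6-DoF pose; the sketch only evaluates the cost for a fixed pose.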
Related papers
- EF-3DGS: Event-Aided Free-Trajectory 3D Gaussian Splatting [76.02450110026747]
Event cameras, inspired by biological vision, record pixel-wise intensity changes asynchronously with high temporal resolution.
We propose Event-Aided Free-Trajectory 3DGS, which seamlessly integrates the advantages of event cameras into 3DGS.
We evaluate our method on the public Tanks and Temples benchmark and a newly collected real-world dataset, RealEv-DAVIS.
arXiv Detail & Related papers (2024-10-20T13:44:24Z)
- ESVO2: Direct Visual-Inertial Odometry with Stereo Event Cameras [33.81592783496106]
Event-based visual odometry aims to solve the tracking and mapping sub-problems in parallel.
We build an event-based stereo visual-inertial odometry system on top of our previous direct pipeline Event-based Stereo Visual Odometry.
arXiv Detail & Related papers (2024-10-12T05:35:27Z)
- MonST3R: A Simple Approach for Estimating Geometry in the Presence of Motion [118.74385965694694]
We present Motion DUSt3R (MonST3R), a novel geometry-first approach that directly estimates per-timestep geometry from dynamic scenes.
By simply estimating a pointmap for each timestep, we can effectively adapt DUSt3R's representation, previously only used for static scenes, to dynamic scenes.
We show that by posing the problem as a fine-tuning task, identifying several suitable datasets, and strategically training the model on this limited data, we can surprisingly enable the model to handle dynamics.
arXiv Detail & Related papers (2024-10-04T18:00:07Z)
- IMU-Aided Event-based Stereo Visual Odometry [7.280676899773076]
We improve our previous direct pipeline, Event-based Stereo Visual Odometry, in terms of accuracy and efficiency.
To speed up the mapping operation, we propose an efficient strategy of edge-pixel sampling according to the local dynamics of events.
We release our pipeline as an open-source software for future research in this field.
arXiv Detail & Related papers (2024-05-07T07:19:25Z)
- Cross-Modal Semi-Dense 6-DoF Tracking of an Event Camera in Challenging Conditions [29.608665442108727]
Event-based cameras are bio-inspired visual sensors that perform well in HDR conditions and have high temporal resolution.
The present work demonstrates the feasibility of purely event-based tracking if an alternative sensor is permitted for mapping.
The method relies on geometric 3D-2D registration of semi-dense maps and events, and achieves highly reliable and accurate cross-modal tracking results.
arXiv Detail & Related papers (2024-01-16T01:48:45Z)
- On the Generation of a Synthetic Event-Based Vision Dataset for Navigation and Landing [69.34740063574921]
This paper presents a methodology for generating event-based vision datasets from optimal landing trajectories.
We construct sequences of photorealistic images of the lunar surface with the Planet and Asteroid Natural Scene Generation Utility.
We demonstrate that the pipeline can generate realistic event-based representations of surface features by constructing a dataset of 500 trajectories.
arXiv Detail & Related papers (2023-08-01T09:14:20Z)
- Video Frame Interpolation with Stereo Event and Intensity Camera [40.07341828127157]
We propose a novel Stereo Event-based VFI network (SE-VFI-Net) to generate high-quality intermediate frames.
We exploit the fused features accomplishing accurate optical flow and disparity estimation.
Our proposed SE-VFI-Net outperforms state-of-the-art methods by a large margin.
arXiv Detail & Related papers (2023-07-17T04:02:00Z)
- PL-EVIO: Robust Monocular Event-based Visual Inertial Odometry with Point and Line Features [3.6355269783970394]
Event cameras are motion-activated sensors that capture pixel-level illumination changes instead of the intensity image with a fixed frame rate.
We propose a robust, highly accurate, and real-time optimization-based monocular event-based visual-inertial odometry (VIO) method.
arXiv Detail & Related papers (2022-09-25T06:14:12Z)
- Asynchronous Optimisation for Event-based Visual Odometry [53.59879499700895]
Event cameras open up new possibilities for robotic perception due to their low latency and high dynamic range.
We focus on event-based visual odometry (VO).
We propose an asynchronous structure-from-motion optimisation back-end.
arXiv Detail & Related papers (2022-03-02T11:28:47Z)
- Multi-View Photometric Stereo: A Robust Solution and Benchmark Dataset for Spatially Varying Isotropic Materials [65.95928593628128]
We present a method to capture both 3D shape and spatially varying reflectance with a multi-view photometric stereo technique.
Our algorithm is suitable for perspective cameras and nearby point light sources.
arXiv Detail & Related papers (2020-01-18T12:26:22Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information it presents and is not responsible for any consequences arising from its use.