CMax-SLAM: Event-based Rotational-Motion Bundle Adjustment and SLAM System using Contrast Maximization
- URL: http://arxiv.org/abs/2403.08119v1
- Date: Tue, 12 Mar 2024 23:05:10 GMT
- Title: CMax-SLAM: Event-based Rotational-Motion Bundle Adjustment and SLAM System using Contrast Maximization
- Authors: Shuang Guo and Guillermo Gallego
- Abstract summary: Event cameras are bio-inspired visual sensors that capture pixel-wise intensity changes and output asynchronous event streams.
This paper considers the problem of rotational motion estimation using event cameras.
Several event-based rotation estimation methods have been developed in the past decade, but their performance has not been evaluated and compared under unified criteria.
- Score: 14.771885020122062
- License: http://creativecommons.org/licenses/by-nc-nd/4.0/
- Abstract: Event cameras are bio-inspired visual sensors that capture pixel-wise
intensity changes and output asynchronous event streams. They show great
potential over conventional cameras to handle challenging scenarios in robotics
and computer vision, such as high-speed and high dynamic range. This paper
considers the problem of rotational motion estimation using event cameras.
Several event-based rotation estimation methods have been developed in the past
decade, but their performance has not been evaluated and compared under unified
criteria yet. In addition, these prior works do not consider a global
refinement step. To this end, we conduct a systematic study of this problem
with two objectives in mind: summarizing previous works and presenting our own
solution. First, we compare prior works both theoretically and experimentally.
Second, we propose the first event-based rotation-only bundle adjustment (BA)
approach. We formulate it leveraging the state-of-the-art Contrast Maximization
(CMax) framework, which is principled and avoids the need to convert events
into frames. Third, we use the proposed BA to build CMax-SLAM, the first
event-based rotation-only SLAM system comprising a front-end and a back-end.
Our BA is able to run both offline (trajectory smoothing) and online (CMax-SLAM
back-end). To demonstrate the performance and versatility of our method, we
present comprehensive experiments on synthetic and real-world datasets,
including indoor, outdoor and space scenarios. We discuss the pitfalls of
real-world evaluation and propose a proxy for the reprojection error as the
figure of merit to evaluate event-based rotation BA methods. We release the
source code and novel data sequences to benefit the community. We hope this
work leads to a better understanding and fosters further research on
event-based ego-motion estimation. Project page:
https://github.com/tub-rip/cmax_slam
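As a rough illustration of the CMax idea underlying the proposed BA (a toy sketch, not the paper's actual implementation, which optimizes continuous-time trajectories on SO(3)): warp events to a reference time under a candidate motion, accumulate them into an image, and pick the motion that maximizes the image variance (contrast). All names and data below are illustrative; a single in-plane angular velocity stands in for the full 3-D rotation model.

```python
import numpy as np

def warp_events(xy, t, omega, t_ref=0.0):
    """Rotate each event about the image center by -omega*(t - t_ref),
    i.e. transport it back to the reference time assuming a constant
    in-plane angular velocity (toy stand-in for the SO(3) warp)."""
    theta = -omega * (t - t_ref)
    c, s = np.cos(theta), np.sin(theta)
    xw = c * xy[:, 0] - s * xy[:, 1]
    yw = s * xy[:, 0] + c * xy[:, 1]
    return np.stack([xw, yw], axis=1)

def contrast(xy_warped, bins=64):
    """Accumulate warped events into an image and return its variance,
    the basic CMax objective: sharper alignment -> higher contrast."""
    H, _, _ = np.histogram2d(xy_warped[:, 0], xy_warped[:, 1],
                             bins=bins, range=[[-1.0, 1.0], [-1.0, 1.0]])
    return H.var()

# Synthetic events: 20 scene points rotating at omega_true, sampled 50 times.
rng = np.random.default_rng(0)
pts = rng.uniform(-0.8, 0.8, size=(20, 2))
omega_true = 0.5  # rad/s
xy_list, t_list = [], []
for tk in np.linspace(0.0, 1.0, 50):
    c, s = np.cos(omega_true * tk), np.sin(omega_true * tk)
    R = np.array([[c, -s], [s, c]])
    xy_list.append(pts @ R.T)
    t_list.append(np.full(len(pts), tk))
xy, t = np.concatenate(xy_list), np.concatenate(t_list)

# Grid search over candidate angular velocities (a real system would run
# a gradient-based optimizer on the contrast objective instead).
candidates = np.linspace(-1.0, 1.0, 21)
scores = [contrast(warp_events(xy, t, w)) for w in candidates]
omega_hat = candidates[int(np.argmax(scores))]
print(f"estimated omega = {omega_hat:.2f}")  # recovers 0.50
```

The same objective drives both the front-end (local motion estimation) and, over a whole trajectory, the proposed bundle adjustment; only the warp model and the parameterization change.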
Related papers
- Event-based Photometric Bundle Adjustment [12.504055397619727]
Event-based Photometric Bundle Adjustment (EPBA) is the first event-only photometric bundle adjustment method.
EPBA is effective in decreasing the photometric error (by up to 90%).
Experiments on modern high-resolution event cameras show the applicability of EPBA to panoramic imaging.
arXiv Detail & Related papers (2024-12-18T17:58:16Z)
- Motion-prior Contrast Maximization for Dense Continuous-Time Motion Estimation [34.529280562470746]
We introduce a novel self-supervised loss combining the Contrast Maximization framework with a non-linear motion prior in the form of pixel-level trajectories.
Their effectiveness is demonstrated in two scenarios: In dense continuous-time motion estimation, our method improves the zero-shot performance of a synthetically trained model by 29%.
arXiv Detail & Related papers (2024-07-15T15:18:28Z)
- Implicit Event-RGBD Neural SLAM [54.74363487009845]
Implicit neural SLAM has achieved remarkable progress recently.
Existing methods face significant challenges in non-ideal scenarios.
We propose EN-SLAM, the first event-RGBD implicit neural SLAM framework.
arXiv Detail & Related papers (2023-11-18T08:48:58Z)
- RD-VIO: Robust Visual-Inertial Odometry for Mobile Augmented Reality in Dynamic Environments [55.864869961717424]
It is typically challenging for visual or visual-inertial odometry systems to handle the problems of dynamic scenes and pure rotation.
We design a novel visual-inertial odometry (VIO) system called RD-VIO to handle both of these problems.
arXiv Detail & Related papers (2023-10-23T16:30:39Z)
- Event-based Simultaneous Localization and Mapping: A Comprehensive Survey [52.73728442921428]
Review of event-based vSLAM algorithms that exploit the benefits of asynchronous and irregular event streams for localization and mapping tasks.
Paper categorizes event-based vSLAM methods into four main categories: feature-based, direct, motion-compensation, and deep learning methods.
arXiv Detail & Related papers (2023-04-19T16:21:14Z)
- HumanMAC: Masked Motion Completion for Human Motion Prediction [62.279925754717674]
Human motion prediction is a classical problem in computer vision and computer graphics.
Previous efforts achieve great empirical performance based on an encoding-decoding style.
In this paper, we propose a novel framework from a new perspective.
arXiv Detail & Related papers (2023-02-07T18:34:59Z)
- A Fast Geometric Regularizer to Mitigate Event Collapse in the Contrast Maximization Framework [13.298845944779108]
We propose a novel, computationally efficient regularizer based on geometric principles to mitigate event collapse.
Experiments show that the proposed regularizer achieves state-of-the-art accuracy results, while its reduced computational complexity makes it two to four times faster than previous approaches.
arXiv Detail & Related papers (2022-12-14T17:22:48Z)
- Secrets of Event-Based Optical Flow [13.298845944779108]
Event cameras respond to scene dynamics and offer advantages to estimate motion.
We develop a principled method to extend the Contrast Maximization framework to estimate optical flow from events alone.
Our method ranks first among unsupervised methods on the MVSEC benchmark, and is competitive on the DSEC benchmark.
arXiv Detail & Related papers (2022-07-20T16:40:38Z)
- Event Collapse in Contrast Maximization Frameworks [13.298845944779108]
Contrast Maximization (CMax) is a framework that provides state-of-the-art results on several event-based computer vision tasks, such as ego-motion or optical flow estimation.
However, it may suffer from a problem called event collapse, which is an undesired solution where events are warped into too few pixels.
Our work demonstrates event collapse in its simplest form and proposes collapse metrics by using first principles of space-time deformation.
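A minimal numerical illustration of why collapse is a pitfall (illustrative code, not the paper's collapse metrics): the plain variance objective assigns a higher score to a degenerate warp that squeezes every event into a single pixel than to well-spread events, so unregularized maximization over an overly expressive warp can prefer the collapsed solution.

```python
import numpy as np

def contrast(xy, bins=64):
    """Variance of the image of accumulated events (the basic CMax objective)."""
    H, _, _ = np.histogram2d(xy[:, 0], xy[:, 1],
                             bins=bins, range=[[-1.0, 1.0], [-1.0, 1.0]])
    return H.var()

rng = np.random.default_rng(1)
events = rng.uniform(-0.9, 0.9, size=(1000, 2))  # well-spread events

# Degenerate "warp": push every event into one pixel at the origin.
collapsed = np.zeros_like(events)

print(contrast(events) < contrast(collapsed))  # prints True: collapse scores higher
```

This is why regularizers or collapse metrics (as in the papers above) are needed whenever the warp model has enough freedom to concentrate events.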
arXiv Detail & Related papers (2022-07-08T16:52:35Z)
- Asynchronous Optimisation for Event-based Visual Odometry [53.59879499700895]
Event cameras open up new possibilities for robotic perception due to their low latency and high dynamic range.
We focus on event-based visual odometry (VO).
We propose an asynchronous structure-from-motion optimisation back-end.
arXiv Detail & Related papers (2022-03-02T11:28:47Z)
- Learning Monocular Dense Depth from Events [53.078665310545745]
Event cameras produce brightness changes in the form of a stream of asynchronous events instead of intensity frames.
Recent learning-based approaches have been applied to event-based data, such as monocular depth prediction.
We propose a recurrent architecture to solve this task and show significant improvement over standard feed-forward methods.
arXiv Detail & Related papers (2020-10-16T12:36:23Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.