EROAM: Event-based Camera Rotational Odometry and Mapping in Real-time
- URL: http://arxiv.org/abs/2411.11004v1
- Date: Sun, 17 Nov 2024 08:50:47 GMT
- Title: EROAM: Event-based Camera Rotational Odometry and Mapping in Real-time
- Authors: Wanli Xing, Shijie Lin, Linhan Yang, Zeqing Zhang, Yanjun Du, Maolin Lei, Yipeng Pan, Jia Pan
- Abstract summary: EROAM is a novel event-based rotational odometry and mapping system that achieves real-time, accurate camera rotation estimation.
We show that EROAM significantly outperforms state-of-the-art methods in terms of accuracy, robustness, and computational efficiency.
- Score: 14.989905816510698
- Abstract: This paper presents EROAM, a novel event-based rotational odometry and mapping system that achieves real-time, accurate camera rotation estimation. Unlike existing approaches that rely on event generation models or contrast maximization, EROAM employs a spherical event representation by projecting events onto a unit sphere and introduces Event Spherical Iterative Closest Point (ES-ICP), a novel geometric optimization framework designed specifically for event camera data. The spherical representation simplifies rotational motion formulation while enabling continuous mapping for enhanced spatial resolution. Combined with parallel point-to-line optimization, EROAM achieves efficient computation without compromising accuracy. Extensive experiments on both synthetic and real-world datasets show that EROAM significantly outperforms state-of-the-art methods in terms of accuracy, robustness, and computational efficiency. Our method maintains consistent performance under challenging conditions, including high angular velocities and extended sequences, where other methods often fail or show significant drift. Additionally, EROAM produces high-quality panoramic reconstructions with preserved fine structural details.
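The two ingredients named in the abstract, projecting events onto a unit sphere and iteratively solving for the rotation, can be sketched as follows. This is a minimal illustrative version, not the authors' ES-ICP: it assumes known correspondences and substitutes a point-to-point (Kabsch) rotation solve for the paper's parallel point-to-line optimization, and the intrinsic matrix `K` and all function names are illustrative assumptions.

```python
import numpy as np

def events_to_sphere(uv, K):
    """Back-project pixel events onto the unit sphere.

    uv: (N, 2) pixel coordinates; K: 3x3 camera intrinsic matrix.
    Returns (N, 3) unit bearing vectors.
    """
    ones = np.ones((uv.shape[0], 1))
    pix = np.hstack([uv, ones])                 # homogeneous pixel coords
    rays = pix @ np.linalg.inv(K).T             # back-project to camera rays
    return rays / np.linalg.norm(rays, axis=1, keepdims=True)

def estimate_rotation(src, dst):
    """Least-squares rotation R with dst_i ~= R @ src_i (Kabsch algorithm).

    A stand-in for one ES-ICP iteration with known correspondences;
    the paper's point-to-line residual is replaced by point-to-point here.
    """
    H = src.T @ dst
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))      # guard against reflections
    return Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
```

Because points on the unit sphere are related purely by a rotation, the update step has a closed form; a full system would alternate correspondence search against the spherical map with this solve.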
Related papers
- ESVO2: Direct Visual-Inertial Odometry with Stereo Event Cameras [33.81592783496106]
Event-based visual odometry aims at solving tracking and mapping sub-problems in parallel.
We build an event-based stereo visual-inertial odometry system on top of our previous direct pipeline Event-based Stereo Visual Odometry.
arXiv Detail & Related papers (2024-10-12T05:35:27Z)
- IMU-Aided Event-based Stereo Visual Odometry [7.280676899773076]
We improve our previous direct pipeline Event-based Stereo Visual Odometry in terms of accuracy and efficiency.
To speed up the mapping operation, we propose an efficient strategy of edge-pixel sampling according to the local dynamics of events.
We release our pipeline as an open-source software for future research in this field.
arXiv Detail & Related papers (2024-05-07T07:19:25Z)
- Implicit Event-RGBD Neural SLAM [54.74363487009845]
Implicit neural SLAM has achieved remarkable progress recently.
Existing methods face significant challenges in non-ideal scenarios.
We propose EN-SLAM, the first event-RGBD implicit neural SLAM framework.
arXiv Detail & Related papers (2023-11-18T08:48:58Z)
- Generalizing Event-Based Motion Deblurring in Real-World Scenarios [62.995994797897424]
Event-based motion deblurring has shown promising results by exploiting low-latency events.
We propose a scale-aware network that allows flexible input spatial scales and enables learning from different temporal scales of motion blur.
A two-stage self-supervised learning scheme is then developed to fit real-world data distribution.
arXiv Detail & Related papers (2023-08-11T04:27:29Z)
- TransPoser: Transformer as an Optimizer for Joint Object Shape and Pose Estimation [25.395619346823715]
We propose a novel method for joint estimation of shape and pose of rigid objects from their sequentially observed RGB-D images.
We introduce Deep Directional Distance Function (DeepDDF), a neural network that directly outputs the depth image of an object given the camera viewpoint and viewing direction.
We formulate the joint estimation itself as a Transformer which we refer to as TransPoser.
arXiv Detail & Related papers (2023-03-23T17:46:54Z)
- Globally Optimal Event-Based Divergence Estimation for Ventral Landing [55.29096494880328]
Event sensing is a major component in bio-inspired flight guidance and control systems.
We explore the usage of event cameras for predicting time-to-contact with the surface during ventral landing.
This is achieved by estimating divergence (inverse TTC), which is the rate of radial optic flow, from the event stream generated during landing.
Our core contributions are a novel contrast maximisation formulation for event-based divergence estimation, and a branch-and-bound algorithm to exactly maximise contrast and find the optimal divergence value.
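The contrast-maximisation idea described above can be sketched in a toy form. This is a hedged illustration, not the paper's method: events are warped radially by a candidate divergence and a simple grid search stands in for the exact branch-and-bound maximisation; all names and parameters are assumptions.

```python
import numpy as np

def contrast(events_xy, events_t, rho, bins=64, extent=2.0):
    """Variance ('contrast') of the event image after warping by divergence rho.

    Each event at position x, time t is warped radially to t = 0 via
    x_w = x * exp(-rho * t); the correct rho collapses expanding structure
    back into sharp points, maximising the variance of the accumulated image.
    """
    scale = np.exp(-rho * events_t)[:, None]
    warped = events_xy * scale
    img, _, _ = np.histogram2d(
        warped[:, 0], warped[:, 1], bins=bins,
        range=[[-extent, extent], [-extent, extent]])
    return img.var()

def estimate_divergence(events_xy, events_t, candidates):
    """Pick the divergence maximising contrast (grid search stand-in
    for the paper's globally optimal branch-and-bound)."""
    scores = [contrast(events_xy, events_t, r) for r in candidates]
    return candidates[int(np.argmax(scores))]
```

With synthetic events generated from points expanding at a constant divergence, the contrast curve peaks near the true value; branch-and-bound replaces the grid search to certify the global maximum.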
arXiv Detail & Related papers (2022-09-27T06:00:52Z)
- Asynchronous Optimisation for Event-based Visual Odometry [53.59879499700895]
Event cameras open up new possibilities for robotic perception due to their low latency and high dynamic range.
We focus on event-based visual odometry (VO).
We propose an asynchronous structure-from-motion optimisation back-end.
arXiv Detail & Related papers (2022-03-02T11:28:47Z)
- Motion Deblurring with Real Events [50.441934496692376]
We propose an end-to-end learning framework for event-based motion deblurring in a self-supervised manner.
Real-world events are exploited to alleviate the performance degradation caused by data inconsistency.
arXiv Detail & Related papers (2021-09-28T13:11:44Z)
- Leveraging Spatial and Photometric Context for Calibrated Non-Lambertian Photometric Stereo [61.6260594326246]
We introduce an efficient fully-convolutional architecture that can leverage both spatial and photometric context simultaneously.
Using separable 4D convolutions and 2D heat-maps reduces the model size and makes it more efficient.
arXiv Detail & Related papers (2021-03-22T18:06:58Z)
- Spatiotemporal Registration for Event-based Visual Odometry [40.02502611087858]
A useful application of event sensing is visual odometry, especially in settings that require high-temporal resolution.
We propose spatiotemporal registration as a compelling technique for event-based rotational motion estimation.
We also contribute a new event dataset for visual odometry, where motion sequences with large velocity variations were acquired using a high-precision robot arm.
arXiv Detail & Related papers (2021-03-10T09:23:24Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.