Visual Odometry with an Event Camera Using Continuous Ray Warping and
Volumetric Contrast Maximization
- URL: http://arxiv.org/abs/2107.03011v1
- Date: Wed, 7 Jul 2021 04:32:57 GMT
- Title: Visual Odometry with an Event Camera Using Continuous Ray Warping and
Volumetric Contrast Maximization
- Authors: Yifu Wang, Jiaqi Yang, Xin Peng, Peng Wu, Ling Gao, Kun Huang, Jiaben
Chen, Laurent Kneip
- Abstract summary: We present a new solution to tracking and mapping with an event camera.
The motion of the camera contains both rotation and translation, and the displacements happen in an arbitrarily structured environment.
We introduce a new solution to this problem by performing contrast maximization in 3D.
The practical validity of our approach is supported by an application to AGV motion estimation and 3D reconstruction with a single vehicle-mounted event camera.
- Score: 31.627936023222052
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: We present a new solution to tracking and mapping with an event camera. The
motion of the camera contains both rotation and translation, and the
displacements happen in an arbitrarily structured environment. As a result, the
image matching may no longer be represented by a low-dimensional homographic
warping, thus complicating an application of the commonly used Image of Warped
Events (IWE). We introduce a new solution to this problem by performing
contrast maximization in 3D. The 3D location of the rays cast for each event is
smoothly varied as a function of a continuous-time motion parametrization, and
the optimal parameters are found by maximizing the contrast in a volumetric ray
density field. Our method thus performs joint optimization over motion and
structure. The practical validity of our approach is supported by an
application to AGV motion estimation and 3D reconstruction with a single
vehicle-mounted event camera. The method approaches the performance obtained
with regular cameras, and eventually outperforms in challenging visual
conditions.
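To make the volumetric contrast-maximization idea concrete, the following is a minimal, illustrative sketch rather than the authors' implementation. It assumes a constant-velocity motion model over the event window, a first-order (small-angle) warp back to the reference time, a fixed set of candidate depths per event ray, and the variance of a coarse voxel density field as the contrast objective; all function names, parameter values, and the sensor intrinsics are hypothetical.

```python
# Illustrative sketch of contrast maximization over a volumetric ray density
# field (not the authors' code). Assumptions: constant-velocity motion over
# the event window, first-order warp to t = 0, fixed candidate depths, and
# variance of a coarse voxel grid as the contrast score.
import numpy as np

def ray_density_contrast(events, K_inv, v, w,
                         depths=np.linspace(0.5, 5.0, 16),
                         grid_res=32, extent=10.0):
    """events: (N, 3) rows of (x_px, y_px, t_s); v, w: (3,) linear/angular velocity."""
    t = events[:, 2]

    # 1. Cast a ray for each event and sample candidate 3D points along it.
    pix = np.column_stack([events[:, 0], events[:, 1], np.ones(len(events))])
    rays = pix @ K_inv.T                                   # (N, 3) ray directions
    pts = rays[:, None, :] * depths[None, :, None]         # (N, D, 3) samples

    # 2. Continuous-time warp: move every sample back to the reference time
    #    t = 0 with a first-order rigid motion model, x0 = x - t * (w x x + v).
    pts0 = pts - t[:, None, None] * (np.cross(w, pts) + v)

    # 3. Splat the warped samples into a voxel grid (the ray density field).
    idx = np.floor((pts0 / extent + 0.5) * grid_res).astype(int)
    keep = np.all((idx >= 0) & (idx < grid_res), axis=-1)
    flat = np.ravel_multi_index(idx[keep].T, (grid_res,) * 3)
    density = np.bincount(flat, minlength=grid_res ** 3).astype(float)

    # 4. Contrast objective: a sharp, well-aligned density field has high variance.
    return density.var()

# Toy usage: sweep candidate forward velocities and keep the sharpest volume.
K_inv = np.linalg.inv(np.array([[200.0, 0.0, 120.0],
                                [0.0, 200.0, 90.0],
                                [0.0, 0.0, 1.0]]))
events = np.random.rand(2000, 3) * [240.0, 180.0, 0.05]   # synthetic (x, y, t)
best_score, best_vz = max(
    (ray_density_contrast(events, K_inv, np.array([0.0, 0.0, vz]), np.zeros(3)), vz)
    for vz in np.linspace(-2.0, 2.0, 9))
```

In practice the motion would be refined with a richer continuous-time parametrization and a proper optimizer over both motion and structure, rather than the exhaustive one-parameter sweep shown in the toy usage.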
Related papers
- EF-3DGS: Event-Aided Free-Trajectory 3D Gaussian Splatting [76.02450110026747]
Event cameras, inspired by biological vision, record pixel-wise intensity changes asynchronously with high temporal resolution.
We propose Event-Aided Free-Trajectory 3DGS, which seamlessly integrates the advantages of event cameras into 3DGS.
We evaluate our method on the public Tanks and Temples benchmark and a newly collected real-world dataset, RealEv-DAVIS.
arXiv Detail & Related papers (2024-10-20T13:44:24Z) - ESVO2: Direct Visual-Inertial Odometry with Stereo Event Cameras [33.81592783496106]
Event-based visual odometry aims at solving tracking and mapping sub-problems in parallel.
We build an event-based stereo visual-inertial odometry system on top of our previous direct pipeline Event-based Stereo Visual Odometry.
arXiv Detail & Related papers (2024-10-12T05:35:27Z) - Gaussian Splatting on the Move: Blur and Rolling Shutter Compensation for Natural Camera Motion [25.54868552979793]
We present a method that adapts to camera motion and allows high-quality scene reconstruction with handheld video data.
Our results with both synthetic and real data demonstrate superior performance in mitigating camera motion over existing methods.
arXiv Detail & Related papers (2024-03-20T06:19:41Z) - Density Invariant Contrast Maximization for Neuromorphic Earth
Observations [55.970609838687864]
Contrast maximization (CMax) techniques are widely used in event-based vision systems to estimate the motion parameters of the camera and generate high-contrast images.
These techniques are noise-intolerant and suffer from the multiple-extrema problem, which arises when the scene contains more noisy events than structure.
Our proposed solution overcomes the multiple-extrema and noise-intolerance problems by correcting the warped events before calculating the contrast (a minimal sketch of the classical 2D CMax baseline appears after this list).
arXiv Detail & Related papers (2023-04-27T12:17:40Z) - Globally-Optimal Contrast Maximisation for Event Cameras [30.79931004393174]
Event cameras are bio-inspired sensors that perform well under challenging illumination and offer high temporal resolution.
The pixels of an event camera operate independently and asynchronously.
The flow of events is modelled by a general homographic warping in a space-time volume.
arXiv Detail & Related papers (2022-06-10T14:06:46Z) - Event-aided Direct Sparse Odometry [54.602311491827805]
We introduce EDS, a direct monocular visual odometry approach that uses events and frames.
Our algorithm leverages the event generation model to track the camera motion in the blind time between frames.
EDS is the first method to perform 6-DOF VO using events and frames with a direct approach.
arXiv Detail & Related papers (2022-04-15T20:40:29Z) - Globally-Optimal Event Camera Motion Estimation [30.79931004393174]
Event cameras are bio-inspired sensors that perform well in HDR conditions and have high temporal resolution.
Event cameras measure asynchronous pixel-level changes and return them in a highly discretised format.
arXiv Detail & Related papers (2022-03-08T08:24:22Z) - Asynchronous Optimisation for Event-based Visual Odometry [53.59879499700895]
Event cameras open up new possibilities for robotic perception due to their low latency and high dynamic range.
We focus on event-based visual odometry (VO).
We propose an asynchronous structure-from-motion optimisation back-end.
arXiv Detail & Related papers (2022-03-02T11:28:47Z) - Motion-from-Blur: 3D Shape and Motion Estimation of Motion-blurred
Objects in Videos [115.71874459429381]
We propose a method for jointly estimating the 3D motion, 3D shape, and appearance of highly motion-blurred objects from a video.
Experiments on benchmark datasets demonstrate that our method outperforms previous methods for fast moving object deblurring and 3D reconstruction.
arXiv Detail & Related papers (2021-11-29T11:25:14Z) - Spatiotemporal Bundle Adjustment for Dynamic 3D Human Reconstruction in
the Wild [49.672487902268706]
We present a framework that jointly estimates camera temporal alignment and 3D point triangulation.
We reconstruct 3D motion trajectories of human bodies in events captured by multiple unsynchronized video cameras.
arXiv Detail & Related papers (2020-07-24T23:50:46Z)
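For reference against the entries above that discuss CMax over an Image of Warped Events (IWE), here is a minimal sketch of that classical 2D baseline. It is an illustrative, assumption-laden example rather than any paper's implementation: a single global optical-flow vector warps all events to the reference time, and the variance of the accumulated IWE serves as the contrast score; the sensor size and all names are hypothetical.

```python
# Minimal sketch of 2D contrast maximization over an Image of Warped Events
# (IWE). Assumptions: one global flow vector shared by all events and a
# variance-based contrast score; names and sensor size are illustrative.
import numpy as np

def iwe_contrast(events, flow, width=240, height=180):
    """events: (N, 3) rows of (x_px, y_px, t_s); flow: (2,) pixel velocity."""
    # Warp every event back to the reference time t = 0 along the flow.
    xy0 = events[:, :2] - events[:, 2:3] * flow
    ix = np.round(xy0).astype(int)
    keep = ((ix[:, 0] >= 0) & (ix[:, 0] < width) &
            (ix[:, 1] >= 0) & (ix[:, 1] < height))

    # Accumulate surviving events into the IWE and score its sharpness.
    iwe = np.zeros((height, width))
    np.add.at(iwe, (ix[keep, 1], ix[keep, 0]), 1.0)
    return iwe.var()

# Toy usage: grid-search the flow that maximizes the IWE contrast.
events = np.random.rand(5000, 3) * [240.0, 180.0, 0.05]   # synthetic (x, y, t)
best_score, best_flow = max(
    (iwe_contrast(events, np.array([u, v])), (u, v))
    for u in np.linspace(-200.0, 200.0, 9)
    for v in np.linspace(-200.0, 200.0, 9))
```

As the abstract above notes, such a low-dimensional image-plane warp no longer captures the event geometry once the camera translates through a 3D scene, which is what motivates lifting the contrast objective into a volumetric ray density field.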
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.