Tight Fusion of Events and Inertial Measurements for Direct Velocity Estimation
- URL: http://arxiv.org/abs/2401.09296v1
- Date: Wed, 17 Jan 2024 15:56:57 GMT
- Title: Tight Fusion of Events and Inertial Measurements for Direct Velocity Estimation
- Authors: Wanting Xu, Xin Peng and Laurent Kneip
- Abstract summary: We propose a novel solution to tight visual-inertial fusion directly at the level of first-order kinematics by employing a dynamic vision sensor instead of a normal camera.
We demonstrate how velocity estimates in highly dynamic situations can be obtained over short time intervals.
Experiments on both simulated and real data demonstrate that the proposed tight event-inertial fusion leads to continuous and reliable velocity estimation.
- Score: 20.002238735553792
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Traditional visual-inertial state estimation targets absolute camera poses
and spatial landmark locations, while first-order kinematics are typically
resolved only as an implicitly estimated sub-state. However, this poses a risk in
velocity-based control scenarios, as the quality of the kinematics estimate
depends on the stability of the absolute camera and landmark coordinate
estimates. To address this issue, we propose a novel solution for tight
visual-inertial fusion directly at the level of first-order kinematics by
employing a dynamic vision sensor instead of a normal camera. More
specifically, we leverage trifocal tensor geometry to establish an incidence
relation that directly depends on events and camera velocity, and demonstrate
how velocity estimates in highly dynamic situations can be obtained over short
time intervals. Noise and outliers are dealt with using a nested two-layer
RANSAC scheme. Additionally, smooth velocity signals are obtained from a tight
fusion with pre-integrated inertial signals using a sliding window optimizer.
Experiments on both simulated and real data demonstrate that the proposed tight
event-inertial fusion leads to continuous and reliable velocity estimation in
highly dynamic scenarios independently of absolute coordinates. Furthermore, in
extreme cases, it achieves more stable and more accurate estimation of
kinematics than traditional, point-position-based visual-inertial odometry.
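The abstract names a nested two-layer RANSAC scheme without detailing it. The Python sketch below shows one plausible reading under stated assumptions: an inner layer that removes outlier events within each event cluster, and an outer layer that robustly solves for the velocity from per-cluster linear constraints a_i · v = b_i, which merely stand in for the paper's trifocal incidence relation. All names and the constraint form are illustrative, not taken from the paper.

```python
# Minimal sketch of a nested two-layer RANSAC for velocity estimation.
# The linear constraint per cluster is a hypothetical stand-in for the
# paper's trifocal incidence relation.
import numpy as np

rng = np.random.default_rng(0)

def inner_ransac(points, iters=50, tol=0.05):
    """Layer 1: robustly fit a 2D line (n . p = d) to one event cluster,
    rejecting outlier events before the cluster is used for velocity."""
    points = np.asarray(points)              # shape (N, 2)
    best_inliers = np.zeros(len(points), dtype=bool)
    for _ in range(iters):
        i, j = rng.choice(len(points), size=2, replace=False)
        p, q = points[i], points[j]
        n = np.array([q[1] - p[1], p[0] - q[0]])   # normal to segment pq
        norm = np.linalg.norm(n)
        if norm < 1e-9:                      # degenerate sample, skip
            continue
        n /= norm
        d = n @ p
        inliers = np.abs(points @ n - d) < tol
        if inliers.sum() > best_inliers.sum():
            best_inliers = inliers
    return points[best_inliers]

def outer_ransac(constraints, iters=200, tol=0.02):
    """Layer 2: robustly solve for v in R^3 from per-cluster constraints
    (a_i, b_i) with a_i . v = b_i, sampling minimal sets of three."""
    A = np.stack([a for a, _ in constraints])      # (M, 3)
    b = np.array([c for _, c in constraints])      # (M,)
    best_v, best_count = None, -1
    for _ in range(iters):
        idx = rng.choice(len(constraints), size=3, replace=False)
        try:
            v = np.linalg.solve(A[idx], b[idx])    # minimal 3x3 solve
        except np.linalg.LinAlgError:
            continue                               # singular sample
        count = int((np.abs(A @ v - b) < tol).sum())
        if count > best_count:
            best_v, best_count = v, count
    if best_v is None:
        raise RuntimeError("no non-degenerate minimal sample found")
    inliers = np.abs(A @ best_v - b) < tol         # final refit on inliers
    v_refined, *_ = np.linalg.lstsq(A[inliers], b[inliers], rcond=None)
    return v_refined
```

The inner layer cleans each cluster before it contributes a constraint, which is what makes the outer velocity fit robust to both per-event noise and cluster-level outliers.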
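The abstract also mentions fusing the event-based velocities with pre-integrated inertial signals in a sliding-window optimizer. As a minimal, hedged illustration of the idea (not the paper's actual estimator), the following sketch pre-integrates gravity-compensated accelerometer samples into a velocity increment, in the spirit of standard IMU pre-integration, and blends it with an event-based velocity by fixed weights; a real sliding-window optimizer would jointly estimate biases and several past states. All names here are illustrative.

```python
# Hedged sketch: IMU velocity pre-integration between two event-based
# velocity estimates, with orientation and bias handling simplified away.
import numpy as np

GRAVITY = np.array([0.0, 0.0, -9.81])

def preintegrate_velocity(accels, rotations, dt):
    """Accumulate Delta v = sum_k (R_k a_k + g) * dt over one window.
    accels: (N, 3) body-frame accelerometer samples (bias removed),
    rotations: list of (3, 3) world-from-body rotations per sample."""
    dv = np.zeros(3)
    for a_k, R_k in zip(accels, rotations):
        dv += (R_k @ a_k + GRAVITY) * dt
    return dv

def fuse(v_event_prev, v_event_curr, dv_imu, w_event=1.0, w_imu=1.0):
    """One-step analogue of sliding-window fusion: blend the new
    event-based velocity with the IMU-propagated prediction."""
    v_pred = v_event_prev + dv_imu
    return (w_event * v_event_curr + w_imu * v_pred) / (w_event + w_imu)
```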
Related papers
- DATAP-SfM: Dynamic-Aware Tracking Any Point for Robust Structure from Motion in the Wild [85.03973683867797]
This paper proposes a concise, elegant, and robust pipeline to estimate smooth camera trajectories and obtain dense point clouds for casual videos in the wild.
We show that the proposed method achieves state-of-the-art performance in terms of camera pose estimation even in complex dynamic challenge scenes.
arXiv Detail & Related papers (2024-11-20T13:01:16Z)
- EVIT: Event-based Visual-Inertial Tracking in Semi-Dense Maps Using Windowed Nonlinear Optimization [19.915476815328294]
Event cameras are interesting visual exteroceptive sensors that react to brightness changes rather than integrating absolute image intensities.
This paper proposes the addition of inertial signals in order to robustify the estimation.
Our evaluation focuses on a diverse set of real world sequences and comprises a comparison of our proposed method against a purely event-based alternative running at different rates.
arXiv Detail & Related papers (2024-08-02T16:24:55Z) - Event-Aided Time-to-Collision Estimation for Autonomous Driving [28.13397992839372]
We present a novel method that estimates the time to collision using a neuromorphic event-based camera.
The proposed algorithm consists of a two-step approach for efficient and accurate geometric model fitting on event data.
Experiments on both synthetic and real data demonstrate the effectiveness of the proposed method.
arXiv Detail & Related papers (2024-07-10T02:37:36Z)
- Event-Based Visual Odometry on Non-Holonomic Ground Vehicles [20.847519645153337]
Event-based visual odometry is shown to be reliable and robust in challenging illumination scenarios.
Our algorithm achieves accurate estimates of the vehicle's rotational velocity, with results comparable to the delta rotations obtained by frame-based sensors under normal conditions.
arXiv Detail & Related papers (2024-01-17T16:52:20Z)
- A 5-Point Minimal Solver for Event Camera Relative Motion Estimation [47.45081895021988]
We introduce a novel minimal 5-point solver that estimates line parameters and linear camera velocity projections, which can be fused into a single, averaged linear velocity when considering multiple lines (a least-squares fusion of such projections is sketched after this list).
Our method consistently achieves a 100% success rate in estimating linear velocity where existing closed-form solvers only achieve between 23% and 70%.
arXiv Detail & Related papers (2023-09-29T08:30:18Z)
- Correlating sparse sensing for large-scale traffic speed estimation: A Laplacian-enhanced low-rank tensor kriging approach [76.45949280328838]
We propose a Laplacian-enhanced low-rank tensor (LETC) framework featuring both low-rankness and multi-temporal correlations for large-scale traffic speed kriging.
We then design an efficient solution algorithm via several effective numeric techniques to scale up the proposed model to network-wide kriging.
arXiv Detail & Related papers (2022-10-21T07:25:57Z)
- ParticleSfM: Exploiting Dense Point Trajectories for Localizing Moving Cameras in the Wild [57.37891682117178]
We present a robust dense indirect structure-from-motion method for videos that is based on dense correspondence from pairwise optical flow.
A novel neural network architecture is proposed for processing irregular point trajectory data.
Experiments on the MPI Sintel dataset show that our system produces significantly more accurate camera trajectories.
arXiv Detail & Related papers (2022-07-19T09:19:45Z)
- Asynchronous Optimisation for Event-based Visual Odometry [53.59879499700895]
Event cameras open up new possibilities for robotic perception due to their low latency and high dynamic range.
We focus on event-based visual odometry (VO).
We propose an asynchronous structure-from-motion optimisation back-end.
arXiv Detail & Related papers (2022-03-02T11:28:47Z)
- Continuous Event-Line Constraint for Closed-Form Velocity Initialization [0.0]
Event cameras trigger events asynchronously and independently upon a sufficient change of the logarithmic brightness level.
We propose the continuous event-line constraint, which relies on a constant-velocity motion assumption as well as trifocal geometry to express a relationship between line observations given by event clusters and first-order camera dynamics.
arXiv Detail & Related papers (2021-09-09T14:39:56Z)
- End-to-end Learning for Inter-Vehicle Distance and Relative Velocity Estimation in ADAS with a Monocular Camera [81.66569124029313]
We propose a camera-based inter-vehicle distance and relative velocity estimation method based on end-to-end training of a deep neural network.
The key novelty of our method is the integration of multiple visual clues provided by any two time-consecutive monocular frames.
We also propose a vehicle-centric sampling mechanism to alleviate the effect of perspective distortion in the motion field.
arXiv Detail & Related papers (2020-06-07T08:18:31Z)
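The 5-point solver entry above mentions fusing per-line linear-velocity projections into a single averaged velocity. The sketch below shows one plausible least-squares reading of that fusion, assuming (hypothetically) that each line i contributes the scalar projection s_i = d_i . v of the velocity v onto a known unit direction d_i; the function name and constraint form are illustrative, not taken from the paper.

```python
# Hedged sketch: recover a full linear velocity from per-line projections.
# Three or more non-degenerate directions determine v by least squares.
import numpy as np

def fuse_velocity(directions, projections):
    """Solve min_v sum_i (d_i . v - s_i)^2 for v in R^3."""
    D = np.asarray(directions)           # shape (N, 3), unit rows
    s = np.asarray(projections)          # shape (N,)
    v, *_ = np.linalg.lstsq(D, s, rcond=None)
    return v

# toy usage: three orthogonal observation directions recover v exactly
v_true = np.array([0.3, -0.1, 0.7])
D = np.eye(3)
print(fuse_velocity(D, D @ v_true))      # -> [ 0.3 -0.1  0.7]
```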
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented (including all information) and is not responsible for any consequences arising from its use.