AsynEIO: Asynchronous Monocular Event-Inertial Odometry Using Gaussian Process Regression
- URL: http://arxiv.org/abs/2411.12175v1
- Date: Tue, 19 Nov 2024 02:39:57 GMT
- Title: AsynEIO: Asynchronous Monocular Event-Inertial Odometry Using Gaussian Process Regression
- Authors: Zhixiang Wang, Xudong Li, Yizhai Zhang, Fan Zhang, Panfeng Huang
- Abstract summary: We introduce a monocular event-inertial odometry method called AsynEIO, designed to fuse asynchronous event and inertial data.
We show that AsynEIO outperforms existing methods, especially in high-speed and low-illumination scenarios.
- Score: 7.892365588256595
- Abstract: Event cameras, when combined with inertial sensors, show significant potential for motion estimation in challenging scenarios, such as high-speed maneuvers and low-light environments. Many methods exist for producing such estimates, but most reduce to a synchronous, discrete-time fusion problem. However, the asynchronous nature of event cameras and their unique fusion mechanism with inertial sensors remain underexplored. In this paper, we introduce a monocular event-inertial odometry method called AsynEIO, designed to fuse asynchronous event and inertial data within a unified Gaussian Process (GP) regression framework. Our approach incorporates an event-driven frontend that tracks feature trajectories directly from raw event streams at high temporal resolution. These tracked feature trajectories, along with various inertial factors, are integrated into the same GP regression framework to enable asynchronous fusion. By deriving analytical residual Jacobians and noise models, our method constructs a factor graph that is iteratively optimized and pruned using a sliding-window optimizer. Comparative assessments highlight the performance of different inertial fusion strategies, suggesting optimal choices for varying conditions. Experimental results on both public datasets and our own event-inertial sequences indicate that AsynEIO outperforms existing methods, especially in high-speed and low-illumination scenarios.
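To make the continuous-time formulation concrete, here is a minimal sketch of Gaussian Process trajectory interpolation under a white-noise-on-acceleration (WNOA) prior, the standard construction behind exactly sparse GP regression (Barfoot et al.). It shows how a state can be queried at an arbitrary timestamp between two estimated states, which is what enables fusing asynchronous event and inertial measurements. The scalar state, the `Qc` value, and the Euclidean formulation are illustrative assumptions; AsynEIO's actual on-manifold residuals and Jacobians are more involved.

```python
import numpy as np

# Minimal sketch (illustrative assumptions throughout) of GP trajectory
# interpolation under a white-noise-on-acceleration (WNOA) prior. Given
# two estimated states, the posterior mean can be queried at any time in
# between -- e.g. at the timestamp of an asynchronous event.

Qc = 1.0  # power-spectral density of the acceleration noise (assumed value)

def Phi(dt):
    """Constant-velocity transition matrix for the state x = [position, velocity]."""
    return np.array([[1.0, dt],
                     [0.0, 1.0]])

def Q(dt):
    """Process-noise covariance accumulated over dt under the WNOA prior."""
    return Qc * np.array([[dt**3 / 3.0, dt**2 / 2.0],
                          [dt**2 / 2.0, dt]])

def interpolate(x_i, t_i, x_j, t_j, tau):
    """Posterior-mean state at time tau, with t_i < tau < t_j."""
    Psi = Q(tau - t_i) @ Phi(t_j - tau).T @ np.linalg.inv(Q(t_j - t_i))
    Lam = Phi(tau - t_i) - Psi @ Phi(t_j - t_i)
    return Lam @ x_i + Psi @ x_j

# Example: two states one second apart, queried at an event time tau = 0.37 s.
x_i = np.array([0.0, 1.0])   # position 0.0 m, velocity 1.0 m/s
x_j = np.array([1.2, 1.4])   # position 1.2 m, velocity 1.4 m/s
print(interpolate(x_i, 0.0, x_j, 1.0, 0.37))
```

At tau = t_i the interpolation returns x_i exactly, and at tau = t_j it returns x_j, so queried states remain consistent with the estimation nodes at the window boundaries.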
Related papers
- Event-Aided Time-to-Collision Estimation for Autonomous Driving [28.13397992839372]
We present a novel method that estimates the time to collision using a neuromorphic event-based camera.
The proposed algorithm consists of a two-step approach for efficient and accurate geometric model fitting on event data.
Experiments on both synthetic and real data demonstrate the effectiveness of the proposed method.
arXiv Detail & Related papers (2024-07-10T02:37:36Z)
- ALERT-Transformer: Bridging Asynchronous and Synchronous Machine Learning for Real-Time Event-based Spatio-Temporal Data [8.660721666999718]
We propose a hybrid pipeline composed of asynchronous sensing and synchronous processing.
We achieve state-of-the-art performance with lower latency than competing approaches.
arXiv Detail & Related papers (2024-02-02T13:17:19Z)
- Implicit Event-RGBD Neural SLAM [54.74363487009845]
Implicit neural SLAM has achieved remarkable progress recently.
Existing methods face significant challenges in non-ideal scenarios.
We propose EN-SLAM, the first event-RGBD implicit neural SLAM framework.
arXiv Detail & Related papers (2023-11-18T08:48:58Z)
- A Diffusion-Model of Joint Interactive Navigation [14.689298253430568]
We present DJINN - a diffusion based method of generating traffic scenarios.
Our approach jointly diffuses the trajectories of all agents, conditioned on a flexible set of state observations from the past, present, or future.
We show how DJINN flexibly enables direct test-time sampling from a variety of valuable conditional distributions.
arXiv Detail & Related papers (2023-09-21T22:10:20Z)
- STGlow: A Flow-based Generative Framework with Dual Graphormer for Pedestrian Trajectory Prediction [22.553356096143734]
We propose a novel flow-based generative framework with a dual graphormer for pedestrian trajectory prediction (STGlow).
Our method can more precisely model the underlying data distribution by optimizing the exact log-likelihood of motion behaviors.
Experimental results on several benchmarks demonstrate that our method achieves much better performance compared to previous state-of-the-art approaches.
arXiv Detail & Related papers (2022-11-21T07:29:24Z)
- FaDIn: Fast Discretized Inference for Hawkes Processes with General Parametric Kernels [82.53569355337586]
This work offers an efficient solution to temporal point processes inference using general parametric kernels with finite support.
The method's effectiveness is evaluated by modeling the occurrence of stimuli-induced patterns in brain signals recorded with magnetoencephalography (MEG).
Results show that the proposed approach estimates pattern latency more accurately than the state of the art.
arXiv Detail & Related papers (2022-10-10T12:35:02Z)
- Globally Optimal Event-Based Divergence Estimation for Ventral Landing [55.29096494880328]
Event sensing is a major component in bio-inspired flight guidance and control systems.
We explore the usage of event cameras for predicting time-to-contact with the surface during ventral landing.
This is achieved by estimating divergence (inverse TTC), which is the rate of radial optic flow, from the event stream generated during landing.
Our core contributions are a novel contrast maximisation formulation for event-based divergence estimation, and a branch-and-bound algorithm to exactly maximise contrast and find the optimal divergence value (a toy sketch of the contrast idea follows this entry).
arXiv Detail & Related papers (2022-09-27T06:00:52Z)
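As a toy illustration of the contrast-maximisation idea in the entry above, the sketch below warps synthetic events by candidate divergences and keeps the one yielding the sharpest accumulated event image. A simple grid sweep stands in for the paper's branch-and-bound search, and the synthetic data and names are assumptions for illustration.

```python
import numpy as np

# Toy contrast maximisation for divergence (inverse time-to-contact):
# warping events with the correct divergence collapses them back onto the
# underlying scene points, maximising the sharpness (variance) of the
# accumulated event image. Not the cited paper's exact formulation.

rng = np.random.default_rng(0)

def warp(xy, t, div):
    """Radially warp event coordinates xy at times t back to t = 0."""
    # Under pure divergent flow, x(t) = x(0) * exp(div * t); invert it.
    return xy * np.exp(-div * t)[:, None]

def contrast(xy, bins=64, extent=1.0):
    """Variance of the accumulated event image: a standard sharpness proxy."""
    H, _, _ = np.histogram2d(xy[:, 0], xy[:, 1],
                             bins=bins, range=[[-extent, extent]] * 2)
    return H.var()

# Synthesise events from scene points expanding with true divergence 2.0 /s.
true_div = 2.0
pts = rng.uniform(-0.5, 0.5, size=(50, 2))
t = rng.uniform(0.0, 0.1, size=500)
xy = pts[rng.integers(0, 50, size=500)] * np.exp(true_div * t)[:, None]

candidates = np.linspace(0.0, 4.0, 81)
best = max(candidates, key=lambda d: contrast(warp(xy, t, d)))
print(f"estimated divergence: {best:.2f} /s (true {true_div})")
```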
- Asynchronous Optimisation for Event-based Visual Odometry [53.59879499700895]
Event cameras open up new possibilities for robotic perception due to their low latency and high dynamic range.
We focus on event-based visual odometry (VO).
We propose an asynchronous structure-from-motion optimisation back-end.
arXiv Detail & Related papers (2022-03-02T11:28:47Z)
- TimeLens: Event-based Video Frame Interpolation [54.28139783383213]
We introduce Time Lens, a novel method that leverages the advantages of both synthesis-based and flow-based approaches.
We show an improvement of up to 5.21 dB in PSNR over state-of-the-art frame-based and event-based methods.
arXiv Detail & Related papers (2021-06-14T10:33:47Z)
- Event-based Asynchronous Sparse Convolutional Networks [54.094244806123235]
Event cameras are bio-inspired sensors that respond to per-pixel brightness changes in the form of asynchronous and sparse "events".
We present a general framework for converting models trained on synchronous image-like event representations into asynchronous models with identical output.
We show both theoretically and experimentally that this drastically reduces the computational complexity and latency of high-capacity, synchronous neural networks (a single-layer toy sketch of this update follows the list).
arXiv Detail & Related papers (2020-03-20T08:39:49Z)
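As a single-layer toy sketch of the asynchronous update idea in the last entry above (the cited framework generalises it to deep networks with nonlinearities while keeping outputs identical), the code below incrementally updates a dense convolution output when a single input pixel changes, instead of recomputing the whole layer. The helper names are hypothetical.

```python
import numpy as np

# For a linear convolution, one changed input pixel (one "event") affects
# only a kernel-sized window of the output, so the dense result can be
# patched incrementally with output identical to full recomputation.

def conv2d_dense(img, k):
    """Valid 2-D convolution (correlation), reference implementation."""
    H, W = img.shape
    kh, kw = k.shape
    out = np.zeros((H - kh + 1, W - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = (img[i:i+kh, j:j+kw] * k).sum()
    return out

def conv2d_event_update(out, k, y, x, delta, H, W):
    """Patch the dense output in place after input pixel (y, x) changes by delta."""
    kh, kw = k.shape
    for i in range(max(0, y - kh + 1), min(H - kh, y) + 1):
        for j in range(max(0, x - kw + 1), min(W - kw, x) + 1):
            out[i, j] += k[y - i, x - j] * delta
    return out

rng = np.random.default_rng(1)
img = rng.standard_normal((8, 8))
k = rng.standard_normal((3, 3))
out = conv2d_dense(img, k)

# One event: pixel (4, 5) changes brightness by +1.
img2 = img.copy()
img2[4, 5] += 1.0
out = conv2d_event_update(out, k, 4, 5, 1.0, 8, 8)
assert np.allclose(out, conv2d_dense(img2, k))  # identical output, sparse work
```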
This list is automatically generated from the titles and abstracts of the papers on this site.