An Event-based Algorithm for Simultaneous 6-DOF Camera Pose Tracking and Mapping
- URL: http://arxiv.org/abs/2301.00618v3
- Date: Wed, 26 Jun 2024 02:42:30 GMT
- Title: An Event-based Algorithm for Simultaneous 6-DOF Camera Pose Tracking and Mapping
- Authors: Masoud Dayani Najafabadi, Mohammad Reza Ahmadzadeh
- Abstract summary: Event cameras asynchronously output compact visual data driven by intensity changes at each pixel location.
We propose an inertial version of the event-only pipeline to assess its capabilities.
We show it can produce comparable or more accurate results provided the map estimate is reliable.
- Score: 0.0
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Compared to regular cameras, Dynamic Vision Sensors, or Event Cameras, asynchronously output compact visual data based on intensity changes at each pixel location. In this paper, we study the application of current image-based SLAM techniques to these novel sensors. To this end, the information in adaptively selected event windows is processed to form motion-compensated images. These images are then used to reconstruct the scene and estimate the 6-DOF pose of the camera. We also propose an inertial version of the event-only pipeline to assess its capabilities. We compare the results of different configurations of the proposed algorithm against ground truth for sequences from two publicly available event datasets. We also compare the results of the proposed event-inertial pipeline with the state of the art and show it can produce comparable or more accurate results, provided the map estimate is reliable.
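The window-to-image step described in the abstract can be sketched as follows. This is a simplified, hypothetical illustration of warping an event window to a common reference time under an assumed constant optical flow; the function name, signature, and flow model are illustrative, not the paper's actual implementation.

```python
import numpy as np

def motion_compensated_image(events, t_ref, flow, shape):
    """Warp events in a window to a reference time and accumulate them.

    events: (N, 4) array of (x, y, t, polarity) tuples.
    t_ref:  reference timestamp for the window.
    flow:   (vx, vy) assumed constant optical flow in pixels per second.
    shape:  (height, width) of the sensor.
    """
    x, y, t, p = events[:, 0], events[:, 1], events[:, 2], events[:, 3]
    vx, vy = flow
    # Warp each event back to t_ref along the assumed constant flow.
    xw = np.round(x - vx * (t - t_ref)).astype(int)
    yw = np.round(y - vy * (t - t_ref)).astype(int)
    img = np.zeros(shape)
    # Keep only events that land inside the sensor after warping.
    ok = (xw >= 0) & (xw < shape[1]) & (yw >= 0) & (yw < shape[0])
    # Accumulate signed polarities at the warped locations; with a good
    # flow estimate, events from the same edge pile up into sharp contours.
    np.add.at(img, (yw[ok], xw[ok]), p[ok])
    return img
```

With a well-chosen flow, events triggered by the same scene edge at different times collapse onto the same pixels, which is what makes the resulting image usable by conventional image-based SLAM front-ends.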
Related papers
- ESVO2: Direct Visual-Inertial Odometry with Stereo Event Cameras [33.81592783496106]
Event-based visual odometry aims at solving tracking and mapping sub-problems in parallel.
We build an event-based stereo visual-inertial odometry system on top of our previous direct pipeline Event-based Stereo Visual Odometry.
arXiv Detail & Related papers (2024-10-12T05:35:27Z)
- Evaluating Image-Based Face and Eye Tracking with Event Cameras [9.677797822200965]
Event cameras, also known as neuromorphic sensors, capture changes in local light intensity at the pixel level, producing asynchronously generated data termed "events".
This data format mitigates common issues observed in conventional cameras, like under-sampling when capturing fast-moving objects.
We evaluate the viability of integrating conventional algorithms with event-based data, transformed into a frame format.
arXiv Detail & Related papers (2024-08-19T20:27:08Z)
- EventAid: Benchmarking Event-aided Image/Video Enhancement Algorithms with Real-captured Hybrid Dataset [55.12137324648253]
Event cameras are emerging imaging technology that offers advantages over conventional frame-based imaging sensors in dynamic range and sensing speed.
This paper focuses on five event-aided image and video enhancement tasks.
arXiv Detail & Related papers (2023-12-13T15:42:04Z)
- Implicit Event-RGBD Neural SLAM [54.74363487009845]
Implicit neural SLAM has achieved remarkable progress recently.
Existing methods face significant challenges in non-ideal scenarios.
We propose EN-SLAM, the first event-RGBD implicit neural SLAM framework.
arXiv Detail & Related papers (2023-11-18T08:48:58Z)
- On the Generation of a Synthetic Event-Based Vision Dataset for Navigation and Landing [69.34740063574921]
This paper presents a methodology for generating event-based vision datasets from optimal landing trajectories.
We construct sequences of photorealistic images of the lunar surface with the Planet and Asteroid Natural Scene Generation Utility.
We demonstrate that the pipeline can generate realistic event-based representations of surface features by constructing a dataset of 500 trajectories.
arXiv Detail & Related papers (2023-08-01T09:14:20Z)
- Ego-motion Estimation Based on Fusion of Images and Events [0.0]
The event camera is a novel bio-inspired vision sensor that outputs an event stream.
We propose a novel data fusion algorithm called EAS to fuse conventional intensity images with the event stream.
arXiv Detail & Related papers (2022-07-12T15:10:28Z)
- Asynchronous Optimisation for Event-based Visual Odometry [53.59879499700895]
Event cameras open up new possibilities for robotic perception due to their low latency and high dynamic range.
We focus on event-based visual odometry (VO).
We propose an asynchronous structure-from-motion optimisation back-end.
arXiv Detail & Related papers (2022-03-02T11:28:47Z)
- VisEvent: Reliable Object Tracking via Collaboration of Frame and Event Flows [93.54888104118822]
To address the lack of a realistic, large-scale dataset for this task, we propose a Visible-Event benchmark termed VisEvent.
Our dataset consists of 820 video pairs captured under low illumination, high speed, and background clutter scenarios.
Based on VisEvent, we transform the event flows into event images and construct more than 30 baseline methods.
arXiv Detail & Related papers (2021-08-11T03:55:12Z)
- Tracking 6-DoF Object Motion from Events and Frames [0.0]
We propose a novel approach for 6 degree-of-freedom (6-DoF) object motion tracking that combines measurements of event- and frame-based cameras.
arXiv Detail & Related papers (2021-03-29T12:39:38Z)
- Learning Monocular Dense Depth from Events [53.078665310545745]
Event cameras report brightness changes in the form of a stream of asynchronous events instead of intensity frames.
Recent learning-based approaches have been applied to event-based data, such as monocular depth prediction.
We propose a recurrent architecture to solve this task and show significant improvement over standard feed-forward methods.
arXiv Detail & Related papers (2020-10-16T12:36:23Z)
- Event-based visual place recognition with ensembles of temporal windows [29.6328152991222]
Event cameras are bio-inspired sensors capable of providing a continuous stream of events with low latency and high dynamic range.
We develop a novel ensemble-based scheme for combining temporal windows of varying lengths that are processed in parallel.
We show that our proposed ensemble scheme significantly outperforms all single-window baselines and conventional model-based ensembles.
arXiv Detail & Related papers (2020-05-22T05:33:35Z)
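The ensemble-over-window-lengths idea in the last entry can be sketched as follows. This is a hypothetical, simplified illustration: the descriptor type, the cosine-similarity matching, and the averaging fusion rule are assumptions for the sketch, not the paper's actual method.

```python
import numpy as np

def ensemble_similarity(query_seq, ref_seq, window_lengths):
    """Combine place-matching scores over several temporal window lengths.

    query_seq: (T, D) array of per-timestep feature descriptors for the query.
    ref_seq:   (M, D) array of reference-place descriptors.
    window_lengths: e.g. [1, 5, 10] timesteps, each processed independently.
    """
    scores = []
    for w in window_lengths:
        # Average the query descriptors over a trailing window of length w.
        q = query_seq[-w:].mean(axis=0)
        # Cosine similarity of the windowed query against each reference place.
        r = ref_seq / np.linalg.norm(ref_seq, axis=1, keepdims=True)
        scores.append(r @ (q / np.linalg.norm(q)))
    # Ensemble: average the per-window similarity profiles, so no single
    # window length dominates the final match decision.
    return np.mean(scores, axis=0)
```

Short windows react quickly but are noisy, while long windows integrate more evidence but lag; averaging their similarity profiles is one simple way such an ensemble could trade these off.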
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the content (including all information) and is not responsible for any consequences of its use.