Real-time 6-DoF Pose Estimation by an Event-based Camera using Active
LED Markers
- URL: http://arxiv.org/abs/2310.16618v1
- Date: Wed, 25 Oct 2023 13:14:12 GMT
- Title: Real-time 6-DoF Pose Estimation by an Event-based Camera using Active
LED Markers
- Authors: Gerald Ebmer, Adam Loch, Minh Nhat Vu, Germain Haessig, Roberto Mecca,
Markus Vincze, Christian Hartl-Nesic, and Andreas Kugi
- Abstract summary: This paper proposes an event-based pose estimation system using active LED markers (ALM) for fast and accurate pose estimation.
The proposed algorithm is able to operate in real time with a latency below 0.5 ms while maintaining output rates of 3 kHz.
- Score: 12.932177576177281
- License: http://creativecommons.org/licenses/by-nc-nd/4.0/
- Abstract: Real-time applications for autonomous operations depend largely on fast and
robust vision-based localization systems. Since image processing tasks require
processing large amounts of data, the computational resources often limit the
performance of other processes. To overcome this limitation, traditional
marker-based localization systems are widely used since they are easy to
integrate and achieve reliable accuracy. However, classical marker-based
localization systems significantly depend on standard cameras with low frame
rates, which often lack accuracy due to motion blur. In contrast, event-based
cameras provide high temporal resolution and a high dynamic range, which can be
utilized for fast localization tasks, even under challenging visual conditions.
This paper proposes a simple but effective event-based pose estimation system
using active LED markers (ALM) for fast and accurate pose estimation. The
proposed algorithm is able to operate in real time with a latency below
0.5 ms while maintaining output rates of 3 kHz.
Experimental results in static and dynamic scenarios are presented to
demonstrate the performance of the proposed approach in terms of computational
speed and absolute accuracy, using the OptiTrack system as the basis for
measurement.
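The abstract describes identifying active LED markers (ALMs) in the event stream and using them for pose estimation. The paper does not publish its algorithm here, but a common way to distinguish ALMs is to encode each marker with a distinct blink frequency and classify event clusters by their observed event rate; the 2-D marker positions can then feed a standard PnP solver for the 6-DoF pose. The sketch below is a minimal, illustrative take on the frequency-identification step only; the function name, the tolerance parameter, and the one-event-per-cycle assumption are all illustrative assumptions, not the authors' method.

```python
import numpy as np

def identify_marker(timestamps, marker_freqs, tol=0.1):
    """Assign a cluster of event timestamps to the ALM whose known blink
    frequency best matches the observed event rate (hypothetical helper).

    timestamps:   sorted 1-D array of event times (seconds) from one
                  spatial cluster, assuming roughly one event per blink cycle
    marker_freqs: known LED blink frequencies in Hz
    tol:          maximum relative frequency error to accept a match
    Returns (marker_index, estimated_freq), or (None, estimated_freq).
    """
    intervals = np.diff(timestamps)
    period = np.median(intervals)        # median is robust to timing jitter
    freq = 1.0 / period
    freqs = np.asarray(marker_freqs, dtype=float)
    errors = np.abs(freqs - freq) / freqs
    best = int(np.argmin(errors))
    if errors[best] <= tol:
        return best, freq
    return None, freq

# Example: a 10 ms burst of events from an LED blinking at ~2 kHz,
# with a few microseconds of timestamp jitter.
rng = np.random.default_rng(0)
t = np.arange(0.0, 0.01, 1.0 / 2000.0) + rng.normal(0.0, 2e-6, 20)
marker_id, f = identify_marker(np.sort(t), marker_freqs=[1000.0, 2000.0, 4000.0])
```

Here `marker_id` resolves to the 2 kHz marker. In a full pipeline, the matched markers' image coordinates and their known 3-D layout would go to a PnP solver (e.g. OpenCV's `cv2.solvePnP`) to recover the 6-DoF pose.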
Related papers
- ESVO2: Direct Visual-Inertial Odometry with Stereo Event Cameras [33.81592783496106]
Event-based visual odometry aims at solving tracking and mapping sub-problems in parallel.
We build an event-based stereo visual-inertial odometry system on top of our previous direct pipeline Event-based Stereo Visual Odometry.
arXiv Detail & Related papers (2024-10-12T05:35:27Z)
- Generalizing Event-Based Motion Deblurring in Real-World Scenarios [62.995994797897424]
Event-based motion deblurring has shown promising results by exploiting low-latency events.
We propose a scale-aware network that allows flexible input spatial scales and enables learning from different temporal scales of motion blur.
A two-stage self-supervised learning scheme is then developed to fit real-world data distribution.
arXiv Detail & Related papers (2023-08-11T04:27:29Z)
- Event-based Simultaneous Localization and Mapping: A Comprehensive Survey [52.73728442921428]
Review of event-based vSLAM algorithms that exploit the benefits of asynchronous and irregular event streams for localization and mapping tasks.
Paper categorizes event-based vSLAM methods into four main categories: feature-based, direct, motion-compensation, and deep learning methods.
arXiv Detail & Related papers (2023-04-19T16:21:14Z)
- Fast Event-based Optical Flow Estimation by Triplet Matching [13.298845944779108]
Event cameras offer advantages over traditional cameras (low latency, high dynamic range, low power, etc.)
Optical flow estimation methods that work on packets of events trade off speed for accuracy.
We propose a novel optical flow estimation scheme based on triplet matching.
arXiv Detail & Related papers (2022-12-23T09:12:16Z)
- Asynchronous Optimisation for Event-based Visual Odometry [53.59879499700895]
Event cameras open up new possibilities for robotic perception due to their low latency and high dynamic range.
We focus on event-based visual odometry (VO)
We propose an asynchronous structure-from-motion optimisation back-end.
arXiv Detail & Related papers (2022-03-02T11:28:47Z)
- Event-Based high-speed low-latency fiducial marker tracking [15.052022635853799]
We propose an end-to-end pipeline for real-time, low latency, 6 degrees-of-freedom pose estimation of fiducial markers.
We employ the high-speed abilities of event-based sensors to directly refine the spatial transformation.
This approach allows us to achieve pose estimation at a rate up to 156kHz, while only relying on CPU resources.
arXiv Detail & Related papers (2021-10-12T08:34:31Z)
- TimeLens: Event-based Video Frame Interpolation [54.28139783383213]
We introduce Time Lens, a novel method that leverages the advantages of both synthesis-based and flow-based approaches.
We show an up to 5.21 dB improvement in terms of PSNR over state-of-the-art frame-based and event-based methods.
arXiv Detail & Related papers (2021-06-14T10:33:47Z)
- Object-based Illumination Estimation with Rendering-aware Neural Networks [56.01734918693844]
We present a scheme for fast environment light estimation from the RGBD appearance of individual objects and their local image areas.
With the estimated lighting, virtual objects can be rendered in AR scenarios with shading that is consistent to the real scene.
arXiv Detail & Related papers (2020-08-06T08:23:19Z)
- Event-based Stereo Visual Odometry [42.77238738150496]
We present a solution to the problem of visual odometry from the data acquired by a stereo event-based camera rig.
We seek to maximize the temporal consistency of stereo event-based data while using a simple and efficient representation.
arXiv Detail & Related papers (2020-07-30T15:53:28Z)
- Event-based Asynchronous Sparse Convolutional Networks [54.094244806123235]
Event cameras are bio-inspired sensors that respond to per-pixel brightness changes in the form of asynchronous and sparse "events"
We present a general framework for converting models trained on synchronous image-like event representations into asynchronous models with identical output.
We show both theoretically and experimentally that this drastically reduces the computational complexity and latency of high-capacity, synchronous neural networks.
arXiv Detail & Related papers (2020-03-20T08:39:49Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information provided and is not responsible for any consequences arising from its use.