Lasers to Events: Automatic Extrinsic Calibration of Lidars and Event
Cameras
- URL: http://arxiv.org/abs/2207.01009v1
- Date: Sun, 3 Jul 2022 11:05:45 GMT
- Title: Lasers to Events: Automatic Extrinsic Calibration of Lidars and Event Cameras
- Authors: Kevin Ta, David Bruggemann, Tim Brödermann, Christos Sakaridis, Luc Van Gool
- Abstract summary: This paper presents the first direct calibration method between event cameras and lidars.
It removes dependencies on frame-based camera intermediaries and/or highly-accurate hand measurements.
- Score: 67.84498757689776
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Despite significant academic and corporate efforts, autonomous driving under
adverse visual conditions still proves challenging. As neuromorphic technology
has matured, its application to robotics and autonomous vehicle systems has
become an area of active research. Low-light and latency-demanding situations stand to
benefit in particular. To enable event cameras to operate alongside staple sensors like
lidar in perception tasks, we propose a direct, temporally-decoupled
calibration method between event cameras and lidars. The high dynamic range and
low-light operation of event cameras are exploited to directly register lidar
laser returns, allowing information-based correlation methods to optimize for
the 6-DoF extrinsic calibration between the two sensors. This paper presents
the first direct calibration method between event cameras and lidars, removing
dependencies on frame-based camera intermediaries and/or highly-accurate hand
measurements. Code will be made publicly available.
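The method's core mechanism, registering lidar laser returns directly in the event stream and then optimizing an information-based correlation over the 6-DoF extrinsics, can be illustrated compactly. Below is a minimal sketch assuming a pinhole camera model, an event-count image, and histogram-based mutual information with a derivative-free optimizer; all names and parameters are illustrative, not the authors' released implementation.

```python
# Hypothetical sketch: search for the 6-DoF lidar-to-event-camera extrinsics by
# maximizing mutual information (MI) between projected lidar intensities and an
# image of accumulated event counts. Illustrative only, not the paper's code.
import numpy as np
from scipy.optimize import minimize
from scipy.spatial.transform import Rotation

def accumulate_events(events, height, width):
    """Build a per-pixel count image from an (N, 2) array of event (x, y) coords."""
    img = np.zeros((height, width))
    np.add.at(img, (events[:, 1].astype(int), events[:, 0].astype(int)), 1.0)
    return img

def project(points, pose, K):
    """Apply the pose (rotvec rx, ry, rz, then tx, ty, tz) and pinhole intrinsics K.
    Returns (M, 2) pixel coords and the mask of points in front of the camera."""
    R = Rotation.from_rotvec(pose[:3]).as_matrix()
    pts_cam = points @ R.T + pose[3:]
    in_front = pts_cam[:, 2] > 0.1
    uvw = pts_cam[in_front] @ K.T
    return uvw[:, :2] / uvw[:, 2:3], in_front

def mutual_information(a, b, bins=32):
    """Histogram-based MI between two paired sample vectors."""
    joint, _, _ = np.histogram2d(a, b, bins=bins)
    pxy = joint / joint.sum()
    px = pxy.sum(axis=1, keepdims=True)
    py = pxy.sum(axis=0, keepdims=True)
    nz = pxy > 0
    return float((pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])).sum())

def neg_mi(pose, lidar_xyz, lidar_intensity, event_img, K):
    """Objective: negative MI between lidar intensity and event activity."""
    uv, in_front = project(lidar_xyz, pose, K)
    h, w = event_img.shape
    ok = (uv[:, 0] >= 0) & (uv[:, 0] < w) & (uv[:, 1] >= 0) & (uv[:, 1] < h)
    if ok.sum() < 100:  # degenerate pose: almost no points land in the image
        return 0.0
    samples = event_img[uv[ok, 1].astype(int), uv[ok, 0].astype(int)]
    return -mutual_information(lidar_intensity[in_front][ok], samples)

# Derivative-free refinement from a rough initial guess:
# result = minimize(neg_mi, x0=np.zeros(6), method="Nelder-Mead",
#                   args=(lidar_xyz, lidar_intensity, event_img, K))
```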
Related papers
- Microsaccade-inspired Event Camera for Robotics [42.27082276343167]
We design an event-based perception system capable of simultaneously maintaining low reaction time and stable texture.
The geometrical optics of the rotating wedge prism allows for algorithmic compensation of the added rotational motion (a toy de-rotation sketch follows this entry).
Various real-world experiments demonstrate the system's potential to facilitate robotic perception in both low-level and high-level vision tasks.
arXiv Detail & Related papers (2024-05-28T02:49:46Z)
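As a rough illustration of the compensation idea, assume the rotating wedge prism sweeps the line of sight along a circle of known radius and phase in the image plane; events can then be shifted back by the deflection at their timestamps. The parameters and geometry here are simplifying assumptions, not the paper's optical model.

```python
# Hypothetical sketch: undo the circular image shift induced by a rotating
# wedge prism. Assumes the deflection traces a circle of radius r_px at angular
# rate omega; all parameters are illustrative.
import numpy as np

def derotate_events(xy, t, omega, r_px, phase0=0.0):
    """xy: (N, 2) event pixel coords; t: (N,) timestamps in seconds.
    Returns coordinates with the known prism deflection removed."""
    phase = omega * t + phase0
    offset = r_px * np.stack([np.cos(phase), np.sin(phase)], axis=1)
    return xy - offset
```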
- E-Calib: A Fast, Robust and Accurate Calibration Toolbox for Event Cameras [34.71767308204867]
We present E-Calib, a novel, fast, robust, and accurate calibration toolbox for event cameras.
The proposed method is tested in a variety of rigorous experiments for different event camera models.
arXiv Detail & Related papers (2023-06-15T12:16:38Z)
- EasyHeC: Accurate and Automatic Hand-eye Calibration via Differentiable Rendering and Space Exploration [49.90228618894857]
We introduce a new approach to hand-eye calibration called EasyHeC, which is markerless, white-box, and delivers superior accuracy and robustness.
We propose two key technologies: differentiable rendering-based camera pose optimization and consistency-based joint space exploration (a toy pose-optimization sketch follows this entry).
Our evaluation demonstrates superior performance on synthetic and real-world datasets.
arXiv Detail & Related papers (2023-05-02T03:49:54Z)
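In the render-and-compare spirit of differentiable rendering-based pose optimization, a toy version can splat known 3D keypoints into a soft silhouette and descend on the pose with autograd. Everything below (the Gaussian splatting, the loss, all names) is an illustrative assumption rather than EasyHeC's actual renderer.

```python
# Hypothetical sketch: refine a camera pose by gradient descent through a soft,
# differentiable "renderer" that splats known 3D points as Gaussians and
# compares the result to an observed binary mask. Not EasyHeC's implementation.
import torch

def skew(k):
    z = torch.zeros((), dtype=k.dtype)
    return torch.stack([torch.stack([z, -k[2], k[1]]),
                        torch.stack([k[2], z, -k[0]]),
                        torch.stack([-k[1], k[0], z])])

def rotvec_to_matrix(r):
    """Rodrigues' formula: axis-angle (3,) -> rotation matrix (3, 3)."""
    theta = r.norm() + 1e-8
    K = skew(r / theta)
    return torch.eye(3) + torch.sin(theta) * K + (1 - torch.cos(theta)) * (K @ K)

def soft_render(points, rvec, tvec, fx, fy, cx, cy, h, w, sigma=2.0):
    """Differentiable soft silhouette of 3D points under the given pose."""
    cam = points @ rotvec_to_matrix(rvec).T + tvec
    u = cam[:, 0] / cam[:, 2] * fx + cx
    v = cam[:, 1] / cam[:, 2] * fy + cy
    ys, xs = torch.meshgrid(torch.arange(h, dtype=torch.float32),
                            torch.arange(w, dtype=torch.float32), indexing="ij")
    d2 = (xs[None] - u[:, None, None]) ** 2 + (ys[None] - v[:, None, None]) ** 2
    # Union of per-point Gaussians: 1 - prod_i(1 - g_i)
    return 1.0 - torch.prod(1.0 - torch.exp(-d2 / (2 * sigma ** 2)), dim=0)

# rvec = torch.zeros(3, requires_grad=True)
# tvec = torch.tensor([0.0, 0.0, 1.0], requires_grad=True)
# opt = torch.optim.Adam([rvec, tvec], lr=1e-2)
# for _ in range(200):
#     loss = ((soft_render(pts, rvec, tvec, fx, fy, cx, cy, H, W) - mask) ** 2).mean()
#     opt.zero_grad(); loss.backward(); opt.step()
```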
- LCE-Calib: Automatic LiDAR-Frame/Event Camera Extrinsic Calibration With A Globally Optimal Solution [10.117923901732743]
The combination of LiDARs and cameras enables a mobile robot to perceive environments with multi-modal data.
Traditional frame cameras are sensitive to changing illumination, which motivates the use of event cameras.
This paper proposes an automatic checkerboard-based approach to calibrating the extrinsics between a LiDAR and a frame/event camera (a minimal rigid-alignment sketch follows this entry).
arXiv Detail & Related papers (2023-03-17T08:07:56Z)
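The classical building block behind checkerboard-based extrinsic calibration is rigid alignment of corresponding 3D points, which has a closed-form least-squares solution. A minimal sketch follows; it is the textbook Kabsch/Procrustes step, not LCE-Calib's full globally optimal pipeline.

```python
# Hypothetical sketch: once corresponding 3D checkerboard points are known in
# the lidar frame (src) and the camera frame (dst), the rigid extrinsic
# transform minimizing squared error has a closed form via SVD.
import numpy as np

def rigid_align(src, dst):
    """Find R, t minimizing sum_i ||R @ src_i + t - dst_i||^2."""
    mu_s, mu_d = src.mean(axis=0), dst.mean(axis=0)
    H = (src - mu_s).T @ (dst - mu_d)
    U, _, Vt = np.linalg.svd(H)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])  # avoid reflections
    R = Vt.T @ D @ U.T
    return R, mu_d - R @ mu_s
```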
- Extrinsic Camera Calibration with Semantic Segmentation [60.330549990863624]
We present an extrinsic camera calibration approach that automates parameter estimation by utilizing semantic segmentation information.
Our approach relies on a coarse initial estimate of the camera pose and builds on lidar sensors mounted on a vehicle.
We evaluate the method on simulated and real-world data, demonstrating low calibration errors (a toy semantic-consistency score is sketched after this entry).
arXiv Detail & Related papers (2022-08-08T07:25:03Z)
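One simple way to exploit segmentation for extrinsic refinement is to score a pose candidate by how often projected lidar points agree with the semantic class of the pixel they land on. The sketch below shows such a consistency score under assumed inputs; it is a toy stand-in for the paper's actual formulation.

```python
# Hypothetical sketch: fraction of projected lidar points whose semantic class
# matches the segmentation mask at their pixel. A coarse initial pose could be
# refined by maximizing this score over pose candidates.
import numpy as np

def semantic_consistency(uv, point_labels, seg_mask):
    """uv: (N, 2) projected pixel coords; point_labels: (N,) class ids;
    seg_mask: (H, W) per-pixel class ids."""
    h, w = seg_mask.shape
    u, v = uv[:, 0].astype(int), uv[:, 1].astype(int)
    ok = (u >= 0) & (u < w) & (v >= 0) & (v < h)
    if ok.sum() == 0:
        return 0.0
    return float((seg_mask[v[ok], u[ok]] == point_labels[ok]).mean())
```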
- ESL: Event-based Structured Light [62.77144631509817]
Event cameras are bio-inspired sensors providing significant advantages over standard cameras.
We propose a novel structured-light system using an event camera to tackle the problem of accurate and high-speed depth sensing (the underlying triangulation is sketched after this entry).
arXiv Detail & Related papers (2021-11-30T15:47:39Z)
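Structured-light depth ultimately rests on triangulation: once each camera pixel is matched to a projector column, depth follows from rectified projector-camera geometry. A minimal sketch, with focal length, baseline, and the correspondence map as assumed inputs:

```python
# Hypothetical sketch: classic structured-light triangulation for a rectified
# projector-camera pair. The correspondence x_proj (however it is recovered)
# gives a disparity, and depth follows from similar triangles.
import numpy as np

def depth_from_correspondence(x_cam, x_proj, f_px, baseline_m):
    """Disparity in pixels -> depth in meters; non-positive disparity -> inf."""
    disparity = x_cam - x_proj
    with np.errstate(divide="ignore", invalid="ignore"):
        return np.where(disparity > 0, f_px * baseline_m / disparity, np.inf)
```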
- LIF-Seg: LiDAR and Camera Image Fusion for 3D LiDAR Semantic Segmentation [78.74202673902303]
We propose a coarse-to-fine LiDAR and camera fusion-based network (termed LIF-Seg) for LiDAR segmentation.
The proposed method fully utilizes the contextual information of images and introduces a simple but effective early-fusion strategy (a minimal early-fusion sketch follows this entry).
Together, these two components enable effective camera-LiDAR fusion.
arXiv Detail & Related papers (2021-08-17T08:53:11Z)
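Early fusion of camera and lidar typically means sampling image values (or features) at each point's projected pixel and concatenating them with the point's own features before the network runs. A minimal sketch under assumed shapes; the projection step and LIF-Seg's actual architecture are not shown.

```python
# Hypothetical sketch of early fusion: gather the image channels at each
# projected lidar point and concatenate them to the per-point features.
import torch

def early_fuse(point_feats, uv, image):
    """point_feats: (N, C_pt); uv: (N, 2) integer pixel coords inside the image;
    image: (C_img, H, W). Returns (N, C_pt + C_img) fused features."""
    sampled = image[:, uv[:, 1], uv[:, 0]].T  # (N, C_img)
    return torch.cat([point_feats, sampled], dim=1)
```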
- Learning Camera Miscalibration Detection [83.38916296044394]
This paper focuses on a data-driven approach to detecting miscalibration in vision sensors, specifically RGB cameras.
Our contributions include a proposed miscalibration metric for RGB cameras and a novel semi-synthetic dataset generation pipeline based on this metric.
By training a deep convolutional neural network, we demonstrate the effectiveness of our pipeline in identifying whether the camera's intrinsic parameters require recalibration (a minimal classifier sketch follows this entry).
arXiv Detail & Related papers (2020-05-24T10:32:49Z)
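A data-driven miscalibration detector of this kind boils down to a binary image classifier trained on calibrated versus (semi-synthetically) miscalibrated views. The small network below is an illustrative stand-in, not the paper's architecture.

```python
# Hypothetical sketch: tiny binary CNN labeling an image as "calibrated" vs.
# "miscalibrated". Architecture and training data are illustrative assumptions.
import torch
import torch.nn as nn

class MiscalibrationNet(nn.Module):
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(16, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(32, 64, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
        )
        self.head = nn.Linear(64, 1)  # logit for P(miscalibrated)

    def forward(self, x):  # x: (B, 3, H, W)
        return self.head(self.features(x).flatten(1))

# Training would use BCE on calibrated/miscalibrated pairs, e.g.:
# loss = nn.BCEWithLogitsLoss()(MiscalibrationNet()(imgs), labels.float())
```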