Continuous Online Extrinsic Calibration of Fisheye Camera and LiDAR
- URL: http://arxiv.org/abs/2306.13240v1
- Date: Thu, 22 Jun 2023 23:16:31 GMT
- Title: Continuous Online Extrinsic Calibration of Fisheye Camera and LiDAR
- Authors: Jack Borer, Jeremy Tschirner, Florian Ölsner, Stefan Milz
- Abstract summary: An accurate extrinsic calibration is required to fuse the camera and LiDAR data into a common spatial reference frame required by high-level perception functions.
There is a need for continuous online extrinsic calibration algorithms which can automatically update the value of the camera-LiDAR calibration during the life of the vehicle using only sensor data.
We propose using mutual information between the camera image's depth estimate, provided by commonly available monocular depth estimation networks, and the LiDAR pointcloud's geometric distance as an optimization metric for extrinsic calibration.
- Score: 7.906477322731106
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Automated driving systems use multi-modal sensor suites to ensure the
reliable, redundant and robust perception of the operating domain, for example
camera and LiDAR. An accurate extrinsic calibration is required to fuse the
camera and LiDAR data into a common spatial reference frame required by
high-level perception functions. Over the life of the vehicle the value of the
extrinsic calibration can change due to physical disturbances, introducing an
error into the high-level perception functions. Therefore, there is a need for
continuous online extrinsic calibration algorithms which can automatically
update the value of the camera-LiDAR calibration during the life of the vehicle
using only sensor data.
We propose using mutual information between the camera image's depth
estimate, provided by commonly available monocular depth estimation networks,
and the LiDAR pointcloud's geometric distance as an optimization metric for
extrinsic calibration. Our method requires no calibration target, no ground
truth training data and no expensive offline optimization. We demonstrate our
algorithm's accuracy, precision, speed and self-diagnosis capability on the
KITTI-360 data set.
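A minimal sketch of the core idea, not the authors' implementation: project the LiDAR points into the image under a candidate extrinsic, look up the monocular depth estimate at the projected pixels, and score the candidate by the histogram-based mutual information between the projected geometric distances and the depth estimates; a derivative-free optimizer then searches the 6-DoF extrinsic for the maximum. The pinhole intrinsics `K`, the synthetic `depth_map` and `points_lidar`, and the Nelder-Mead search below are illustrative assumptions only (the paper itself addresses a fisheye projection model and continuous online operation).

```python
import numpy as np
from scipy.optimize import minimize
from scipy.spatial.transform import Rotation


def project_points(points_lidar, extrinsic, K, image_shape):
    """Transform LiDAR points with a 6-DoF extrinsic (axis-angle + translation)
    and project them with a pinhole model; return valid pixel coordinates and
    the geometric distances of the corresponding points."""
    rvec, tvec = extrinsic[:3], extrinsic[3:]
    R = Rotation.from_rotvec(rvec).as_matrix()
    points_cam = points_lidar @ R.T + tvec            # points in camera frame
    points_cam = points_cam[points_cam[:, 2] > 0.1]   # keep points in front
    pixels = points_cam @ K.T
    pixels = pixels[:, :2] / pixels[:, 2:3]           # perspective division
    h, w = image_shape
    valid = ((pixels[:, 0] >= 0) & (pixels[:, 0] < w) &
             (pixels[:, 1] >= 0) & (pixels[:, 1] < h))
    distances = np.linalg.norm(points_cam[valid], axis=1)
    return pixels[valid].astype(int), distances


def mutual_information(a, b, bins=64):
    """Histogram-based mutual information I(A;B) between two 1-D samples."""
    joint, _, _ = np.histogram2d(a, b, bins=bins)
    pxy = joint / joint.sum()
    px = pxy.sum(axis=1, keepdims=True)
    py = pxy.sum(axis=0, keepdims=True)
    nz = pxy > 0                                      # avoid log(0)
    return float(np.sum(pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])))


def negative_mi(extrinsic, points_lidar, depth_map, K):
    """Objective: negated MI so that a minimizer maximizes the alignment."""
    pixels, distances = project_points(points_lidar, extrinsic, K, depth_map.shape)
    if len(distances) < 100:                          # too little overlap to score
        return 0.0
    depth_at_pixels = depth_map[pixels[:, 1], pixels[:, 0]]
    return -mutual_information(depth_at_pixels, distances)


# Usage with synthetic stand-ins for a real frame:
K = np.array([[500.0, 0.0, 320.0], [0.0, 500.0, 240.0], [0.0, 0.0, 1.0]])
depth_map = np.random.rand(480, 640) * 50.0           # monocular depth estimate
points_lidar = np.random.rand(5000, 3) * [10.0, 8.0, 30.0] - [5.0, 4.0, 0.0]
initial_guess = np.zeros(6)                           # start at identity extrinsic
result = minimize(negative_mi, initial_guess,
                  args=(points_lidar, depth_map, K), method="Nelder-Mead")
print("refined extrinsic (rotation vector, translation):", result.x)
```

The sketch only illustrates the shape of the objective; the paper maximizes the metric continuously online from sensor data alone rather than with a one-shot batch search.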
Related papers
- UniCal: Unified Neural Sensor Calibration [32.7372115947273]
Self-driving vehicles (SDVs) require accurate calibration of LiDARs and cameras to fuse sensor data accurately for autonomy.
Traditional calibration methods leverage fiducials captured in a controlled and structured scene and compute correspondences to optimize over.
We propose UniCal, a unified framework for effortlessly calibrating SDVs equipped with multiple LiDARs and cameras.
arXiv Detail & Related papers (2024-09-27T17:56:04Z)
- YOCO: You Only Calibrate Once for Accurate Extrinsic Parameter in LiDAR-Camera Systems [0.5999777817331317]
In a multi-sensor fusion system composed of cameras and LiDAR, precise extrinsic calibration contributes to the system's long-term stability and accurate perception of the environment.
This paper proposes a novel fully automatic extrinsic calibration method for LiDAR-camera systems that circumvents the need for corresponding point registration.
arXiv Detail & Related papers (2024-07-25T13:44:49Z)
- From Chaos to Calibration: A Geometric Mutual Information Approach to Target-Free Camera LiDAR Extrinsic Calibration [4.378156825150505]
We propose a target-free extrinsic calibration algorithm that requires no ground truth training data.
We demonstrate our proposed improvement using the KITTI and KITTI-360 fisheye data sets.
arXiv Detail & Related papers (2023-11-03T13:30:31Z)
- Automated Automotive Radar Calibration With Intelligent Vehicles [73.15674960230625]
We present an approach for automated and geo-referenced calibration of automotive radar sensors.
Our method does not require external modifications of a vehicle and instead uses the location data obtained from automated vehicles.
Our evaluation on data from a real testing site shows that our method can correctly calibrate infrastructure sensors in an automated manner.
arXiv Detail & Related papers (2023-06-23T07:01:10Z)
- Automated Static Camera Calibration with Intelligent Vehicles [58.908194559319405]
We present a robust calibration method for automated geo-referenced camera calibration.
Our method requires a calibration vehicle equipped with a combined filtering/RTK receiver and an inertial measurement unit (IMU) for self-localization.
Our method does not require any human interaction with the information recorded by both the infrastructure and the vehicle.
arXiv Detail & Related papers (2023-04-21T08:50:52Z)
- TrajMatch: Towards Automatic Spatio-temporal Calibration for Roadside LiDARs through Trajectory Matching [12.980324010888664]
We propose TrajMatch -- the first system that can automatically calibrate for roadside LiDARs in both time and space.
Experiment results show that TrajMatch can achieve a spatial calibration error of less than 10cm and a temporal calibration error of less than 1.5ms.
arXiv Detail & Related papers (2023-02-04T12:27:01Z)
- Extrinsic Camera Calibration with Semantic Segmentation [60.330549990863624]
We present an extrinsic camera calibration approach that automates the parameter estimation by utilizing semantic segmentation information.
Our approach relies on a coarse initial measurement of the camera pose and builds on lidar sensors mounted on a vehicle.
We evaluate our method on simulated and real-world data to demonstrate low error measurements in the calibration results.
arXiv Detail & Related papers (2022-08-08T07:25:03Z)
- CRLF: Automatic Calibration and Refinement based on Line Feature for LiDAR and Camera in Road Scenes [16.201111055979453]
We propose a novel method to calibrate the extrinsic parameter for LiDAR and camera in road scenes.
Our method introduces line features from static straight-line-shaped objects such as road lanes and poles in both image and point cloud.
We conduct extensive experiments on KITTI and our in-house dataset; quantitative and qualitative results demonstrate the robustness and accuracy of our method.
arXiv Detail & Related papers (2021-03-08T06:02:44Z)
- Automatic Extrinsic Calibration Method for LiDAR and Camera Sensor Setups [68.8204255655161]
We present a method to calibrate the parameters of any pair of sensors involving LiDARs, monocular or stereo cameras.
The proposed approach can handle devices with very different resolutions and poses, as usually found in vehicle setups.
arXiv Detail & Related papers (2021-01-12T12:02:26Z)
- Accurate Alignment Inspection System for Low-resolution Automotive and Mobility LiDAR [125.41260574344933]
An accurate inspection system is proposed for estimating a LiDAR alignment error after sensor attachment on a mobility system such as a vehicle or robot.
The proposed method uses only a single target board at the fixed position to estimate the three orientations (roll, tilt, and yaw) and the horizontal position of the LiDAR attachment with sub-degree and millimeter level accuracy.
arXiv Detail & Related papers (2020-08-24T17:47:59Z)
- Learning Camera Miscalibration Detection [83.38916296044394]
This paper focuses on a data-driven approach to learn the detection of miscalibration in vision sensors, specifically RGB cameras.
Our contributions include a proposed miscalibration metric for RGB cameras and a novel semi-synthetic dataset generation pipeline based on this metric.
By training a deep convolutional neural network, we demonstrate the effectiveness of our pipeline to identify whether a recalibration of the camera's intrinsic parameters is required or not.
arXiv Detail & Related papers (2020-05-24T10:32:49Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.