CaLiV: LiDAR-to-Vehicle Calibration of Arbitrary Sensor Setups via Object Reconstruction
- URL: http://arxiv.org/abs/2504.01987v1
- Date: Mon, 31 Mar 2025 08:08:21 GMT
- Title: CaLiV: LiDAR-to-Vehicle Calibration of Arbitrary Sensor Setups via Object Reconstruction
- Authors: Ilir Tahiraj, Markus Edinger, Dominik Kulmer, Markus Lienkamp
- Abstract summary: In autonomous systems, sensor calibration is essential for safe and efficient navigation in dynamic environments. Many existing LiDAR calibration methods require overlapping fields of view, while others use external sensing devices or postulate a feature-rich environment. In this work, we propose a novel target-based technique for extrinsic Sensor-to-Sensor and Sensor-to-Vehicle calibration of multi-LiDAR systems called CaLiV. This algorithm works for non-overlapping FoVs, as well as arbitrary calibration targets, and does not require any external sensing devices.
- Score: 0.8437187555622164
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: In autonomous systems, sensor calibration is essential for safe and efficient navigation in dynamic environments. Accurate calibration is a prerequisite for reliable perception and planning tasks such as object detection and obstacle avoidance. Many existing LiDAR calibration methods require overlapping fields of view, while others use external sensing devices or postulate a feature-rich environment. In addition, Sensor-to-Vehicle calibration is not supported by the vast majority of calibration algorithms. In this work, we propose a novel target-based technique for extrinsic Sensor-to-Sensor and Sensor-to-Vehicle calibration of multi-LiDAR systems called CaLiV. This algorithm works for non-overlapping FoVs, as well as arbitrary calibration targets, and does not require any external sensing devices. First, we apply motion to produce FoV overlaps and utilize a simple unscented Kalman filter to obtain vehicle poses. Then, we use the Gaussian mixture model-based registration framework GMMCalib to align the point clouds in a common calibration frame. Finally, we reduce the task of recovering the sensor extrinsics to a minimization problem. We show that both translational and rotational Sensor-to-Sensor errors can be solved accurately by our method. In addition, all Sensor-to-Vehicle rotation angles can also be calibrated with high accuracy. We validate the simulation results in real-world experiments. The code is open source and available at https://github.com/TUMFTM/CaLiV.
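The final step of the pipeline described above, recovering the sensor extrinsics as a minimization problem, can be sketched roughly as follows. This is an illustrative reconstruction, not the authors' implementation: it assumes the reconstructed target cloud and a sensor's observation are already expressed in a common calibration frame and in point-to-point correspondence, and it fits a 6-DoF rigid transform (Euler angles plus translation) by nonlinear least squares.

```python
import numpy as np
from scipy.optimize import least_squares
from scipy.spatial.transform import Rotation

def recover_extrinsics(sensor_points, target_points):
    """Fit the 6-DoF rigid transform (xyz Euler angles + translation)
    that maps sensor_points onto target_points, assuming the two clouds
    are already in point-to-point correspondence."""
    def residuals(params):
        rot = Rotation.from_euler("xyz", params[:3])
        # Residual of each transformed sensor point against its target.
        return (rot.apply(sensor_points) + params[3:] - target_points).ravel()
    return least_squares(residuals, np.zeros(6)).x

# Synthetic check: build a cloud, transform it with known parameters,
# and verify that the minimization recovers them.
rng = np.random.default_rng(0)
cloud = rng.normal(size=(50, 3))
true_params = np.array([0.10, -0.05, 0.20, 0.5, -0.3, 1.0])
rot = Rotation.from_euler("xyz", true_params[:3])
target = rot.apply(cloud) + true_params[3:]
estimate = recover_extrinsics(cloud, target)
```

In the paper's setting the correspondence and the common frame come from the GMMCalib registration step; here both are given directly so the minimization itself can be tested in isolation.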
Related papers
- Cal or No Cal? -- Real-Time Miscalibration Detection of LiDAR and Camera Sensors [0.8437187555622164]
From a safety perspective, sensor calibration is a key enabler of autonomous driving.
Online calibration is subject to strict real-time and resource constraints.
We propose a miscalibration detection framework that shifts the focus from the direct regression of calibration parameters to a binary classification of the calibration state.
arXiv Detail & Related papers (2025-03-31T08:13:23Z) - UniCal: Unified Neural Sensor Calibration [32.7372115947273]
Self-driving vehicles (SDVs) require accurate calibration of LiDARs and cameras to reliably fuse sensor data for autonomy.
Traditional calibration methods leverage fiducials captured in a controlled and structured scene and compute correspondences to optimize over.
We propose UniCal, a unified framework for effortlessly calibrating SDVs equipped with multiple LiDARs and cameras.
arXiv Detail & Related papers (2024-09-27T17:56:04Z) - Kalib: Easy Hand-Eye Calibration with Reference Point Tracking [52.4190876409222]
Kalib is an automatic hand-eye calibration method that leverages the generalizability of visual foundation models to overcome challenges. During calibration, a kinematic reference point is tracked in the camera's 3D coordinate frame in the space behind the robot. Kalib's user-friendly design and minimal setup requirements make it a possible solution for continuous operation in unstructured environments.
arXiv Detail & Related papers (2024-08-20T06:03:40Z) - Learning to Make Keypoints Sub-Pixel Accurate [80.55676599677824]
This work addresses the challenge of sub-pixel accuracy in detecting 2D local features.
We propose a novel network that enhances any detector with sub-pixel precision by learning an offset vector for detected features.
arXiv Detail & Related papers (2024-07-16T12:39:56Z) - SOAC: Spatio-Temporal Overlap-Aware Multi-Sensor Calibration using Neural Radiance Fields [10.958143040692141]
In rapidly-evolving domains such as autonomous driving, the use of multiple sensors with different modalities is crucial to ensure operational precision and stability.
To correctly exploit the information provided by each sensor in a single common frame, these sensors must be accurately calibrated.
We leverage the ability of Neural Radiance Fields to represent different modalities in a common representation.
arXiv Detail & Related papers (2023-11-27T13:25:47Z) - CROON: Automatic Multi-LiDAR Calibration and Refinement Method in Road Scene [15.054452813705112]
CROON (automatiC multi-LiDAR CalibratiOn and Refinement method in rOad sceNe) is a two-stage method including rough and refinement calibration.
Results on real-world and simulated data sets demonstrate the reliability and accuracy of our method.
arXiv Detail & Related papers (2022-03-07T07:36:31Z) - Real-time detection of uncalibrated sensors using Neural Networks [62.997667081978825]
An online machine-learning-based uncalibration detector for temperature, humidity, and pressure sensors was developed.
The solution integrates an Artificial Neural Network as main component which learns from the behavior of the sensors under calibrated conditions.
The obtained results show that the proposed solution is able to detect uncalibrations for deviation values of 0.25 degrees, 1% RH and 1.5 Pa, respectively.
arXiv Detail & Related papers (2021-02-02T15:44:39Z) - Automatic Extrinsic Calibration Method for LiDAR and Camera Sensor Setups [68.8204255655161]
We present a method to calibrate the parameters of any pair of sensors involving LiDARs, monocular or stereo cameras.
The proposed approach can handle devices with very different resolutions and poses, as usually found in vehicle setups.
arXiv Detail & Related papers (2021-01-12T12:02:26Z) - Accurate Alignment Inspection System for Low-resolution Automotive and Mobility LiDAR [125.41260574344933]
An accurate inspection system is proposed for estimating a LiDAR alignment error after sensor attachment on a mobility system such as a vehicle or robot.
The proposed method uses only a single target board at a fixed position to estimate the three orientations (roll, tilt, and yaw) and the horizontal position of the LiDAR attachment with sub-degree and millimeter-level accuracy.
arXiv Detail & Related papers (2020-08-24T17:47:59Z) - Learning Camera Miscalibration Detection [83.38916296044394]
This paper focuses on a data-driven approach to learn the detection of miscalibration in vision sensors, specifically RGB cameras.
Our contributions include a proposed miscalibration metric for RGB cameras and a novel semi-synthetic dataset generation pipeline based on this metric.
By training a deep convolutional neural network, we demonstrate the effectiveness of our pipeline to identify whether a recalibration of the camera's intrinsic parameters is required or not.
arXiv Detail & Related papers (2020-05-24T10:32:49Z) - Deep Soft Procrustes for Markerless Volumetric Sensor Alignment [81.13055566952221]
In this work, we improve markerless data-driven correspondence estimation to achieve more robust multi-sensor spatial alignment.
We incorporate geometric constraints in an end-to-end manner into a typical segmentation based model and bridge the intermediate dense classification task with the targeted pose estimation one.
Our model is experimentally shown to achieve similar results with marker-based methods and outperform the markerless ones, while also being robust to the pose variations of the calibration structure.
arXiv Detail & Related papers (2020-03-23T10:51:32Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the content (including all information) and is not responsible for any consequences.