Cal or No Cal? -- Real-Time Miscalibration Detection of LiDAR and Camera Sensors
- URL: http://arxiv.org/abs/2504.01040v1
- Date: Mon, 31 Mar 2025 08:13:23 GMT
- Title: Cal or No Cal? -- Real-Time Miscalibration Detection of LiDAR and Camera Sensors
- Authors: Ilir Tahiraj, Jeremialie Swadiryus, Felix Fent, Markus Lienkamp
- Abstract summary: From a safety perspective, sensor calibration is a key enabler of autonomous driving. Online calibration is subject to strict real-time and resource constraints. We propose a miscalibration detection framework that shifts the focus from the direct regression of calibration parameters to a binary classification of the calibration state.
- Score: 0.8437187555622164
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: The goal of extrinsic calibration is the alignment of sensor data to ensure an accurate representation of the surroundings and enable sensor fusion applications. From a safety perspective, sensor calibration is a key enabler of autonomous driving. In the current state of the art, a trend from target-based offline calibration towards targetless online calibration can be observed. However, online calibration is subject to strict real-time and resource constraints which are not met by state-of-the-art methods. This is mainly due to the high number of parameters to estimate, the reliance on geometric features, or the dependence on specific vehicle maneuvers. To meet these requirements and ensure the vehicle's safety at any time, we propose a miscalibration detection framework that shifts the focus from the direct regression of calibration parameters to a binary classification of the calibration state, i.e., calibrated or miscalibrated. Therefore, we propose a contrastive learning approach that compares embedded features in a latent space to classify the calibration state of two different sensor modalities. Moreover, we provide a comprehensive analysis of the feature embeddings and challenging calibration errors that highlight the performance of our approach. As a result, our method outperforms the current state-of-the-art in terms of detection performance, inference time, and resource demand. The code is open source and available on https://github.com/TUMFTM/MiscalibrationDetection.
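The abstract only outlines the approach at a high level. Below is a minimal, hypothetical PyTorch sketch of such a pipeline, not the authors' implementation (the actual code is in the linked repository): two small encoders map a camera image and a LiDAR range image projected into the camera frame onto a shared latent space, and a binary head classifies the pair as calibrated or miscalibrated. The backbones, input shapes, latent dimension, and the way the two embeddings are compared are all illustrative assumptions.

```python
# Hypothetical sketch of miscalibration detection as binary classification.
# Architecture choices are illustrative assumptions, not the authors' design.
import torch
import torch.nn as nn


class MiscalibrationClassifier(nn.Module):
    def __init__(self, latent_dim: int = 128):
        super().__init__()
        # Toy encoder for the camera image (3 x H x W).
        self.cam_encoder = nn.Sequential(
            nn.Conv2d(3, 16, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(16, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(), nn.Linear(32, latent_dim),
        )
        # Toy encoder for the LiDAR range image projected into the camera frame (1 x H x W).
        self.lidar_encoder = nn.Sequential(
            nn.Conv2d(1, 16, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(16, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(), nn.Linear(32, latent_dim),
        )
        # Binary head: logits for {calibrated, miscalibrated}.
        self.head = nn.Linear(2 * latent_dim, 2)

    def forward(self, cam: torch.Tensor, lidar: torch.Tensor) -> torch.Tensor:
        # Embed both modalities and normalize so they live on the same scale.
        z_cam = nn.functional.normalize(self.cam_encoder(cam), dim=-1)
        z_lid = nn.functional.normalize(self.lidar_encoder(lidar), dim=-1)
        # Compare the embeddings (element-wise difference and product) and classify.
        pair = torch.cat([z_cam - z_lid, z_cam * z_lid], dim=-1)
        return self.head(pair)


if __name__ == "__main__":
    model = MiscalibrationClassifier()
    cam = torch.randn(2, 3, 128, 256)    # dummy camera images
    lidar = torch.randn(2, 1, 128, 256)  # dummy projected LiDAR range images
    logits = model(cam, lidar)
    print(logits.shape)  # torch.Size([2, 2])
```

In a contrastive training setup of the kind the abstract describes, the two encoders would additionally be trained so that embeddings of correctly aligned camera-LiDAR pairs are pulled together in the latent space while embeddings of artificially decalibrated pairs are pushed apart, which is what makes the subsequent binary decision cheap at inference time.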
Related papers
- CalibRefine: Deep Learning-Based Online Automatic Targetless LiDAR-Camera Calibration with Iterative and Attention-Driven Post-Refinement [5.069968819561576]
CalibRefine is a fully automatic, targetless, and online calibration framework.
We show that CalibRefine delivers high-precision calibration results with minimal human involvement.
Our findings highlight how robust object-level feature matching, together with iterative and self-supervised attention-based adjustments, enables consistent sensor fusion in complex, real-world conditions.
arXiv Detail & Related papers (2025-02-24T20:53:42Z) - UniCal: Unified Neural Sensor Calibration [32.7372115947273]
Self-driving vehicles (SDVs) require accurate calibration of LiDARs and cameras to fuse sensor data accurately for autonomy.
Traditional calibration methods leverage fiducials captured in a controlled and structured scene and compute correspondences to optimize over.
We propose UniCal, a unified framework for effortlessly calibrating SDVs equipped with multiple LiDARs and cameras.
arXiv Detail & Related papers (2024-09-27T17:56:04Z) - A re-calibration method for object detection with multi-modal alignment bias in autonomous driving [7.601405124830806]
Multi-modal object detection in autonomous driving has achieved great breakthroughs by fusing complementary information from different sensors.
In reality, calibration matrices are fixed when the vehicles leave the factory, but vibration, bumps, and data lags may cause calibration bias.
We conducted experiments on the state-of-the-art detection method EPNet++ and show that even a slight calibration bias can severely degrade detection performance.
arXiv Detail & Related papers (2024-05-27T05:46:37Z) - Towards Certification of Uncertainty Calibration under Adversarial Attacks [96.48317453951418]
We show that attacks can significantly harm calibration, and thus propose certified calibration as worst-case bounds on calibration under adversarial perturbations. We propose novel calibration attacks and demonstrate how they can improve model calibration through adversarial calibration training.
arXiv Detail & Related papers (2024-05-22T18:52:09Z) - CalibFormer: A Transformer-based Automatic LiDAR-Camera Calibration Network [11.602943913324653]
CalibFormer is an end-to-end network for automatic LiDAR-camera calibration.
We aggregate multiple layers of camera and LiDAR image features to achieve high-resolution representations.
Our method achieved a mean translation error of $0.8751\,\mathrm{cm}$ and a mean rotation error of $0.0562^{\circ}$ on the KITTI dataset.
arXiv Detail & Related papers (2023-11-26T08:59:30Z) - Calibration by Distribution Matching: Trainable Kernel Calibration Metrics [56.629245030893685]
We introduce kernel-based calibration metrics that unify and generalize popular forms of calibration for both classification and regression.
These metrics admit differentiable sample estimates, making it easy to incorporate a calibration objective into empirical risk minimization.
We provide intuitive mechanisms to tailor calibration metrics to a decision task, and enforce accurate loss estimation and no regret decisions.
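The summary above does not spell out the estimator. As a generic illustration only, and not necessarily the exact definition used in the paper, a kernel-based calibration penalty can be written as a maximum mean discrepancy (MMD) between observed prediction-label pairs and pairs whose labels are resampled from the model's own predictive distribution $f(x)$:

$$
\widehat{\mathrm{MMD}}^{2} = \frac{1}{n^{2}} \sum_{i=1}^{n} \sum_{j=1}^{n}
\Big[ k\big((f(x_i), y_i), (f(x_j), y_j)\big)
- 2\, k\big((f(x_i), y_i), (f(x_j), y_j')\big)
+ k\big((f(x_i), y_i'), (f(x_j), y_j')\big) \Big],
\qquad y_i' \sim f(x_i),
$$

where $k$ is a kernel on prediction-label pairs. Because this plug-in estimate is differentiable in the model parameters, a term $\lambda\,\widehat{\mathrm{MMD}}^{2}$ can simply be added to the empirical risk, which is what makes such metrics usable as training objectives.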
arXiv Detail & Related papers (2023-10-31T06:19:40Z) - TrajMatch: Towards Automatic Spatio-temporal Calibration for Roadside LiDARs through Trajectory Matching [12.980324010888664]
We propose TrajMatch -- the first system that can automatically calibrate for roadside LiDARs in both time and space.
Experiment results show that TrajMatch can achieve a spatial calibration error of less than 10 cm and a temporal calibration error of less than 1.5 ms.
arXiv Detail & Related papers (2023-02-04T12:27:01Z) - Localized Calibration: Metrics and Recalibration [133.07044916594361]
We propose a fine-grained calibration metric that spans the gap between fully global and fully individualized calibration.
We then introduce a localized recalibration method, LoRe, that improves the localized calibration error (LCE) more than existing recalibration methods.
arXiv Detail & Related papers (2021-02-22T07:22:12Z) - Automatic Extrinsic Calibration Method for LiDAR and Camera Sensor Setups [68.8204255655161]
We present a method to calibrate the parameters of any pair of sensors involving LiDARs, monocular or stereo cameras.
The proposed approach can handle devices with very different resolutions and poses, as usually found in vehicle setups.
arXiv Detail & Related papers (2021-01-12T12:02:26Z) - Unsupervised Calibration under Covariate Shift [92.02278658443166]
We introduce the problem of calibration under domain shift and propose an importance sampling based approach to address it.
We evaluate and discuss the efficacy of our method on both real-world datasets and synthetic datasets.
arXiv Detail & Related papers (2020-06-29T21:50:07Z) - Learning Camera Miscalibration Detection [83.38916296044394]
This paper focuses on a data-driven approach to learn the detection of miscalibration in vision sensors, specifically RGB cameras.
Our contributions include a proposed miscalibration metric for RGB cameras and a novel semi-synthetic dataset generation pipeline based on this metric.
By training a deep convolutional neural network, we demonstrate the effectiveness of our pipeline to identify whether a recalibration of the camera's intrinsic parameters is required or not.
arXiv Detail & Related papers (2020-05-24T10:32:49Z)