FlowCalib: LiDAR-to-Vehicle Miscalibration Detection using Scene Flows
- URL: http://arxiv.org/abs/2601.23107v1
- Date: Fri, 30 Jan 2026 15:53:16 GMT
- Title: FlowCalib: LiDAR-to-Vehicle Miscalibration Detection using Scene Flows
- Authors: Ilir Tahiraj, Peter Wittal, Markus Lienkamp
- Abstract summary: FlowCalib is the first framework that detects LiDAR-to-vehicle miscalibration using motion cues from the scene flow of static objects. Our approach leverages the systematic bias induced by rotational misalignment in the flow field generated from sequential 3D point clouds. Experiments on the nuScenes dataset demonstrate FlowCalib's ability to robustly detect miscalibration.
- Score: 5.2605916208792225
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Accurate sensor-to-vehicle calibration is essential for safe autonomous driving. Angular misalignments of LiDAR sensors can lead to safety-critical issues during autonomous operation. However, current methods primarily focus on correcting sensor-to-sensor errors without considering the miscalibration of individual sensors that cause these errors in the first place. We introduce FlowCalib, the first framework that detects LiDAR-to-vehicle miscalibration using motion cues from the scene flow of static objects. Our approach leverages the systematic bias induced by rotational misalignment in the flow field generated from sequential 3D point clouds, eliminating the need for additional sensors. The architecture integrates a neural scene flow prior for flow estimation and incorporates a dual-branch detection network that fuses learned global flow features with handcrafted geometric descriptors. These combined representations allow the system to perform two complementary binary classification tasks: a global binary decision indicating whether misalignment is present and separate, axis-specific binary decisions indicating whether each rotational axis is misaligned. Experiments on the nuScenes dataset demonstrate FlowCalib's ability to robustly detect miscalibration, establishing a benchmark for sensor-to-vehicle miscalibration detection.
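The core cue described in the abstract, namely that a rotational calibration error turns into a constant, position-independent residual flow on static points after ego-motion compensation, can be illustrated with a small synthetic sketch. The toy geometry, magnitudes, and function names below are illustrative assumptions, not code from the paper:

```python
import numpy as np

def rot_z(deg):
    """Rotation about the z-axis (yaw) by `deg` degrees."""
    a = np.deg2rad(deg)
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])

def to_sensor(world_pts, ego_pos, R_lv):
    """World points -> sensor frame, given the ego position and the
    (true) LiDAR-to-vehicle rotation R_lv.  Rows: s_i = R_lv^T (p_i - ego)."""
    return (world_pts - ego_pos) @ R_lv

def compensated_flow(scan0, scan1, ego_motion, R_assumed):
    """Residual flow of static points after ego-motion compensation
    using the *assumed* LiDAR-to-vehicle rotation."""
    veh0 = scan0 @ R_assumed.T                # sensor -> vehicle (assumed calibration)
    pred1 = (veh0 - ego_motion) @ R_assumed   # predicted static points at t1, in sensor frame
    return scan1 - pred1

rng = np.random.default_rng(0)
world = rng.uniform(-20.0, 20.0, size=(200, 3))  # static scene points
ego_motion = np.array([1.0, 0.0, 0.0])           # 1 m forward between frames

R_true = rot_z(2.0)    # actual mounting: 2 degree yaw offset
R_assumed = np.eye(3)  # calibration stored on the vehicle

scan0 = to_sensor(world, np.zeros(3), R_true)
scan1 = to_sensor(world, ego_motion, R_true)

flow = compensated_flow(scan0, scan1, ego_motion, R_assumed)
# The residual is (numerically) identical for every static point:
# a systematic bias, not random noise.
print(flow.mean(axis=0), flow.std(axis=0))
```

With the 2-degree yaw error the residual flow is the same for every point in the cloud (near-zero variance), which is what distinguishes a calibration-induced bias from ordinary flow noise; setting `R_assumed = R_true` drives the residual to zero.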
Related papers
- Fault detection and diagnosis for the engine electrical system of a space launcher based on a temporal convolutional autoencoder and calibrated classifiers [0.0]
This paper outlines a first step toward developing an onboard fault detection and diagnostic capability for the next generation of reusable space launchers. Unlike existing approaches in the literature, our solution is designed to meet a broader range of key requirements. The proposed solution is based on a temporal convolutional autoencoder to automatically extract low-dimensional features from raw sensor data.
arXiv Detail & Related papers (2025-07-17T11:50:29Z) - UniCalib: Targetless LiDAR-Camera Calibration via Probabilistic Flow on Unified Depth Representations [30.56092814783138]
DF-Calib is a LiDAR-camera calibration method that reformulates calibration as an intra-modality depth flow estimation problem. DF-Calib estimates a dense depth map from the camera image and completes the sparse LiDAR projected depth map. We introduce a reliability map to prioritize valid pixels and propose a perceptually weighted sparse flow loss to enhance depth flow estimation.
arXiv Detail & Related papers (2025-04-02T07:09:44Z) - CaLiV: LiDAR-to-Vehicle Calibration of Arbitrary Sensor Setups [0.8437187555622164]
In autonomous systems, sensor calibration is essential for safe and efficient navigation in dynamic environments. Many existing LiDAR calibration methods require overlapping fields of view, while others use external sensing devices or postulate a feature-rich environment. In this work, we propose a novel target-based technique for extrinsic Sensor-to-Sensor and Sensor-to-Vehicle calibration of multi-LiDAR systems called CaLiV. This algorithm works for non-overlapping fields of view and does not require any external sensing devices.
arXiv Detail & Related papers (2025-03-31T08:08:21Z) - A re-calibration method for object detection with multi-modal alignment bias in autonomous driving [6.672552664633057]
Multi-modal object detection in autonomous driving has achieved great breakthroughs thanks to the fusion of complementary information from different sensors. The calibration between sensors such as LiDAR and camera was generally assumed to be precise in previous work. In reality, calibration matrices are fixed when vehicles leave the factory, but mechanical vibration, road bumps, and data lags can cause calibration bias.
arXiv Detail & Related papers (2024-05-27T05:46:37Z) - Unsupervised Domain Adaptation for Self-Driving from Past Traversal Features [69.47588461101925]
We propose a method to adapt 3D object detectors to new driving environments.
Our approach enhances LiDAR-based detection models using spatial quantized historical features.
Experiments on real-world datasets demonstrate significant improvements.
arXiv Detail & Related papers (2023-09-21T15:00:31Z) - Sensor Fault Detection and Isolation in Autonomous Nonlinear Systems Using Neural Network-Based Observers [6.432798111887824]
The sensor fault detection and isolation (s-FDI) method applies to a general class of nonlinear systems.
A key aspect of this approach is the use of a neural network-based Kazantzis-Kravaris/Luenberger (KKL) observer.
arXiv Detail & Related papers (2023-04-18T09:05:07Z) - Visual-tactile sensing for Real-time liquid Volume Estimation in Grasping [58.50342759993186]
We propose a visuo-tactile model for real-time estimation of the liquid volume inside a deformable container.
We fuse two sensory modalities, i.e., the raw visual inputs from the RGB camera and the tactile cues from our specific tactile sensor.
The robotic system is well controlled and adjusted based on the estimation model in real time.
arXiv Detail & Related papers (2022-02-23T13:38:31Z) - The KFIoU Loss for Rotated Object Detection [115.334070064346]
In this paper, we argue that one effective alternative is to devise an approximate loss that can achieve trend-level alignment with the SkewIoU loss.
Specifically, we model the objects as Gaussian distributions and adopt a Kalman filter to inherently mimic the mechanism of SkewIoU.
The resulting new loss, called KFIoU, is easier to implement and works better than the exact SkewIoU.
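The Gaussian-modeling step behind KFIoU can be sketched as follows. This is a minimal toy of the Gaussian-product overlap idea only; the paper pairs it with a separate center-point loss and rescaling, and the function names and box values here are illustrative assumptions:

```python
import numpy as np

def box_to_gaussian(cx, cy, w, h, theta):
    """Model a rotated 2D box as a Gaussian: mean at the box center,
    covariance from the half-extents rotated by theta (radians)."""
    c, s = np.cos(theta), np.sin(theta)
    R = np.array([[c, -s], [s, c]])
    cov = R @ np.diag([(w / 2) ** 2, (h / 2) ** 2]) @ R.T
    return np.array([cx, cy]), cov

def kfiou(box1, box2):
    """IoU-like overlap of two rotated boxes via the Kalman-filter
    product of their Gaussians.  Identical boxes score 1/3, the
    known maximum that the KFIoU formulation rescales."""
    mu1, S1 = box_to_gaussian(*box1)
    mu2, S2 = box_to_gaussian(*box2)
    S_sum = S1 + S2
    # Product-of-Gaussians scale factor: penalizes center offset.
    d = mu1 - mu2
    alpha = np.exp(-0.5 * d @ np.linalg.solve(S_sum, d))
    # Kalman update covariance: S1 (S1+S2)^-1 S2.
    S_int = S1 - S1 @ np.linalg.solve(S_sum, S1)
    v1 = np.sqrt(np.linalg.det(S1))
    v2 = np.sqrt(np.linalg.det(S2))
    v_int = alpha * np.sqrt(np.linalg.det(S_int))
    return v_int / (v1 + v2 - v_int)
```

Because every step is a differentiable matrix operation (no polygon clipping), this surrogate is straightforward to use as a training loss, which is the practical advantage the blurb refers to.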
arXiv Detail & Related papers (2022-01-29T10:54:57Z) - Bayesian Autoencoders for Drift Detection in Industrial Environments [69.93875748095574]
Autoencoders are unsupervised models that have been used for detecting anomalies in multi-sensor environments.
Anomalies can come either from real changes in the environment (real drift) or from faulty sensory devices (virtual drift).
arXiv Detail & Related papers (2021-07-28T10:19:58Z) - Automatic Extrinsic Calibration Method for LiDAR and Camera Sensor Setups [68.8204255655161]
We present a method to calibrate the parameters of any pair of sensors involving LiDARs, monocular or stereo cameras.
The proposed approach can handle devices with very different resolutions and poses, as usually found in vehicle setups.
arXiv Detail & Related papers (2021-01-12T12:02:26Z) - Learning Camera Miscalibration Detection [83.38916296044394]
This paper focuses on a data-driven approach to learn the detection of miscalibration in vision sensors, specifically RGB cameras.
Our contributions include a proposed miscalibration metric for RGB cameras and a novel semi-synthetic dataset generation pipeline based on this metric.
By training a deep convolutional neural network, we demonstrate the effectiveness of our pipeline in identifying whether a recalibration of the camera's intrinsic parameters is required.
arXiv Detail & Related papers (2020-05-24T10:32:49Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.