TrajMatch: Towards Automatic Spatio-temporal Calibration for Roadside
LiDARs through Trajectory Matching
- URL: http://arxiv.org/abs/2302.02157v1
- Date: Sat, 4 Feb 2023 12:27:01 GMT
- Title: TrajMatch: Towards Automatic Spatio-temporal Calibration for Roadside
LiDARs through Trajectory Matching
- Authors: Haojie Ren, Sha Zhang, Sugang Li, Yao Li, Xinchen Li, Jianmin Ji, Yu
Zhang, Yanyong Zhang
- Abstract summary: We propose TrajMatch -- the first system that can automatically calibrate for roadside LiDARs in both time and space.
Experimental results show that TrajMatch can achieve a spatial calibration error of less than 10 cm and a temporal calibration error of less than 1.5 ms.
- Score: 12.980324010888664
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Recently, it has become popular to deploy sensors such as LiDARs on the
roadside to monitor the passing traffic and assist autonomous vehicle
perception. Unlike autonomous vehicle systems, roadside sensors are usually
affiliated with different subsystems and lack synchronization both in time and
space. Calibration is a key technology that allows the central server to fuse
the data generated by infrastructure deployed at different locations, which can
improve the sensing range and detection robustness. Unfortunately, existing
calibration algorithms often assume that the LiDARs are significantly
overlapped or that the temporal calibration is already achieved. Since these
assumptions do not always hold in the real world, the calibration results from
the existing algorithms are often unsatisfactory and require human
involvement, which incurs high labor costs. In this paper, we propose TrajMatch
-- the first system that can automatically calibrate roadside LiDARs in
both time and space. The main idea is to automatically calibrate the sensors
based on the results of the detection/tracking task instead of extracting
special features. Furthermore, we propose a mechanism for evaluating
calibration parameters that is consistent with our algorithm, and we
demonstrate the effectiveness of this scheme experimentally; it can also be
used to guide parameter iterations across multiple calibrations. Finally, to
evaluate the performance of TrajMatch, we collect two datasets: a simulated
dataset, LiDARnet-sim 1.0, and a real-world dataset. Experimental results show that
TrajMatch can achieve a spatial calibration error of less than 10 cm and a
temporal calibration error of less than 1.5 ms.
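The listing does not include reference code, so the following is only a minimal sketch of the trajectory-matching idea under simplifying assumptions, not the authors' TrajMatch algorithm: the clock offset between two roadside LiDARs is estimated by cross-correlating the speed profiles of a commonly tracked object (speed is invariant to the unknown rigid transform), and the extrinsics are then recovered with a Kabsch fit on the time-aligned trajectory points. Function names such as estimate_time_offset and fit_rigid_transform are ours.
```python
# Minimal sketch of trajectory-based spatio-temporal calibration between two
# roadside LiDARs (a simplification, not the authors' TrajMatch algorithm).
# Inputs: the same tracked object's trajectory from each LiDAR, as timestamps
# t_* (seconds) and positions p_* (N x 3 arrays) in each sensor's own frame.
import numpy as np


def estimate_time_offset(t_a, p_a, t_b, p_b, dt=0.01, max_lag=2.0):
    """Clock offset delta such that an event at time t on LiDAR A's clock
    appears at time t + delta on LiDAR B's clock.  Speed profiles are compared
    because they are invariant to the unknown rigid transform."""
    def speed(t, p):
        return np.linalg.norm(np.gradient(p, t, axis=0), axis=1)
    s_a, s_b = speed(t_a, p_a), speed(t_b, p_b)
    best_delta, best_score = 0.0, -np.inf
    for delta in np.arange(-max_lag, max_lag + dt, dt):
        s_b_on_a = np.interp(t_a + delta, t_b, s_b, left=np.nan, right=np.nan)
        mask = ~np.isnan(s_b_on_a)
        if mask.sum() < 10:            # require enough temporal overlap
            continue
        score = np.corrcoef(s_a[mask], s_b_on_a[mask])[0, 1]
        if score > best_score:
            best_delta, best_score = delta, score
    return best_delta


def fit_rigid_transform(src, dst):
    """Least-squares rigid transform (R, t) with dst ~ R @ src + t (Kabsch)."""
    mu_s, mu_d = src.mean(axis=0), dst.mean(axis=0)
    H = (src - mu_s).T @ (dst - mu_d)
    U, _, Vt = np.linalg.svd(H)
    D = np.eye(src.shape[1])
    D[-1, -1] = np.sign(np.linalg.det(Vt.T @ U.T))   # guard against reflections
    R = Vt.T @ D @ U.T
    return R, mu_d - R @ mu_s


if __name__ == "__main__":
    # Synthetic demo: one vehicle on a curved path, seen by LiDAR B with a
    # 0.12 s clock offset and an unknown rigid transform relative to LiDAR A.
    t_a = np.linspace(0.0, 10.0, 101)
    p_a = np.stack([5.0 * t_a, 20.0 * np.sin(0.3 * t_a), np.zeros_like(t_a)], axis=1)
    ang = np.deg2rad(30.0)
    R_true = np.array([[np.cos(ang), -np.sin(ang), 0.0],
                       [np.sin(ang),  np.cos(ang), 0.0],
                       [0.0,          0.0,         1.0]])
    t_true, offset_true = np.array([12.0, -3.0, 0.5]), 0.12
    t_b, p_b = t_a + offset_true, p_a @ R_true.T + t_true

    delta = estimate_time_offset(t_a, p_a, t_b, p_b)
    p_b_on_a = np.stack([np.interp(t_a + delta, t_b, p_b[:, i]) for i in range(3)], axis=1)
    R_est, t_est = fit_rigid_transform(p_a, p_b_on_a)
    rot_err = np.degrees(np.arccos(np.clip((np.trace(R_true.T @ R_est) - 1) / 2, -1, 1)))
    print(f"time offset: {delta:+.3f} s (true {offset_true:+.3f} s)")
    print(f"rotation error: {rot_err:.3f} deg, "
          f"translation error: {np.linalg.norm(t_est - t_true):.3f} m")
```
In practice one would aggregate many trajectories, reject poorly matched pairs, and alternate the temporal and spatial estimates, which is where the paper's parameter-evaluation mechanism would presumably guide the iterations.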
Related papers
- UniCal: Unified Neural Sensor Calibration [32.7372115947273]
Self-driving vehicles (SDVs) require accurate calibration of LiDARs and cameras to fuse sensor data for autonomy.
Traditional calibration methods leverage fiducials captured in a controlled and structured scene and compute correspondences to optimize over.
We propose UniCal, a unified framework for effortlessly calibrating SDVs equipped with multiple LiDARs and cameras.
arXiv Detail & Related papers (2024-09-27T17:56:04Z)
- Kalib: Markerless Hand-Eye Calibration with Keypoint Tracking [52.4190876409222]
Hand-eye calibration involves estimating the transformation between the camera and the robot.
Recent advancements in deep learning offer markerless techniques, but they present challenges.
We propose Kalib, an automatic and universal markerless hand-eye calibration pipeline.
arXiv Detail & Related papers (2024-08-20T06:03:40Z)
- A re-calibration method for object detection with multi-modal alignment bias in autonomous driving [7.601405124830806]
Multi-modal object detection in autonomous driving has achieved great breakthroughs by fusing complementary information from different sensors.
In reality, calibration matrices are fixed when the vehicles leave the factory, but vibration, bumps, and data lags may cause calibration bias.
We conducted experiments on the SOTA detection method EPNet++ and showed that even slight calibration bias can seriously degrade performance.
arXiv Detail & Related papers (2024-05-27T05:46:37Z)
- Unsupervised Domain Adaptation for Self-Driving from Past Traversal Features [69.47588461101925]
We propose a method to adapt 3D object detectors to new driving environments.
Our approach enhances LiDAR-based detection models using spatially quantized historical features.
Experiments on real-world datasets demonstrate significant improvements.
arXiv Detail & Related papers (2023-09-21T15:00:31Z)
- Automated Automotive Radar Calibration With Intelligent Vehicles [73.15674960230625]
We present an approach for automated and geo-referenced calibration of automotive radar sensors.
Our method does not require external modifications of a vehicle and instead uses the location data obtained from automated vehicles.
Our evaluation on data from a real testing site shows that our method can correctly calibrate infrastructure sensors in an automated manner.
arXiv Detail & Related papers (2023-06-23T07:01:10Z)
- Continuous Online Extrinsic Calibration of Fisheye Camera and LiDAR [7.906477322731106]
An accurate extrinsic calibration is needed to fuse the camera and LiDAR data into the common spatial reference frame required by high-level perception functions.
There is a need for continuous online extrinsic calibration algorithms which can automatically update the value of the camera-LiDAR calibration during the life of the vehicle using only sensor data.
We propose using mutual information between the camera image's depth estimate, provided by commonly available monocular depth estimation networks, and the LiDAR pointcloud's geometric distance as an optimization metric for extrinsic calibration (a rough sketch of this idea appears after this list).
arXiv Detail & Related papers (2023-06-22T23:16:31Z)
- Automated Static Camera Calibration with Intelligent Vehicles [58.908194559319405]
We present a robust method for automated geo-referenced camera calibration.
Our method requires a calibration vehicle equipped with a combined filtering/RTK receiver and an inertial measurement unit (IMU) for self-localization.
Our method does not require any human interaction with the information recorded by both the infrastructure and the vehicle.
arXiv Detail & Related papers (2023-04-21T08:50:52Z)
- CROON: Automatic Multi-LiDAR Calibration and Refinement Method in Road Scene [15.054452813705112]
CROON (automatiC multi-LiDAR CalibratiOn and Refinement method in rOad sceNe) is a two-stage method including rough and refinement calibration.
Results on real-world and simulated data sets demonstrate the reliability and accuracy of our method.
arXiv Detail & Related papers (2022-03-07T07:36:31Z)
- Automatic Extrinsic Calibration Method for LiDAR and Camera Sensor Setups [68.8204255655161]
We present a method to calibrate the parameters of any pair of sensors involving LiDARs, monocular or stereo cameras.
The proposed approach can handle devices with very different resolutions and poses, as usually found in vehicle setups.
arXiv Detail & Related papers (2021-01-12T12:02:26Z)
- Learning Camera Miscalibration Detection [83.38916296044394]
This paper focuses on a data-driven approach to learn the detection of miscalibration in vision sensors, specifically RGB cameras.
Our contributions include a proposed miscalibration metric for RGB cameras and a novel semi-synthetic dataset generation pipeline based on this metric.
By training a deep convolutional neural network, we demonstrate the effectiveness of our pipeline to identify whether a recalibration of the camera's intrinsic parameters is required or not.
arXiv Detail & Related papers (2020-05-24T10:32:49Z)
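As a rough companion to the "Continuous Online Extrinsic Calibration of Fisheye Camera and LiDAR" entry above, the sketch below (our own simplification, not that paper's implementation) scores a candidate camera-LiDAR extrinsic by the mutual information between the LiDAR points' camera-frame depth and a monocular depth map sampled at the projected pixels. A plain pinhole model stands in for the fisheye model, and mono_depth is assumed to come from any off-the-shelf monocular depth network.
```python
# Rough sketch of mutual-information-based camera-LiDAR extrinsic scoring
# (a simplification; a pinhole model stands in for the paper's fisheye model).
import numpy as np


def mutual_information(x, y, bins=32):
    """Histogram-based mutual information (in nats) between two 1-D samples."""
    joint, _, _ = np.histogram2d(x, y, bins=bins)
    pxy = joint / joint.sum()
    px = pxy.sum(axis=1, keepdims=True)
    py = pxy.sum(axis=0, keepdims=True)
    nz = pxy > 0
    return float(np.sum(pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])))


def calibration_score(points_lidar, R, t, K, mono_depth):
    """Score candidate extrinsics (R, t): project LiDAR points into the image
    and compare their camera-frame depth with the monocular depth estimate at
    the projected pixels.  Better extrinsics give a more consistent relation
    between the two depth sources, hence higher mutual information."""
    p_cam = points_lidar @ R.T + t                 # LiDAR frame -> camera frame
    p_cam = p_cam[p_cam[:, 2] > 0.5]               # keep points in front of camera
    uv = p_cam @ K.T
    uv = uv[:, :2] / uv[:, 2:3]                    # pinhole projection to pixels
    h, w = mono_depth.shape
    u = np.round(uv[:, 0]).astype(int)
    v = np.round(uv[:, 1]).astype(int)
    ok = (u >= 0) & (u < w) & (v >= 0) & (v < h)
    if ok.sum() < 100:                             # too little overlap to score
        return -np.inf
    return mutual_information(p_cam[ok, 2], mono_depth[v[ok], u[ok]])
```
A continuous online calibrator in the spirit of that paper could then perturb (R, t) around the current estimate on each batch of frames and keep the candidate with the highest score.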