CRLF: Automatic Calibration and Refinement based on Line Feature for
LiDAR and Camera in Road Scenes
- URL: http://arxiv.org/abs/2103.04558v1
- Date: Mon, 8 Mar 2021 06:02:44 GMT
- Title: CRLF: Automatic Calibration and Refinement based on Line Feature for
LiDAR and Camera in Road Scenes
- Authors: Tao Ma, Zhizheng Liu, Guohang Yan, Yikang Li
- Abstract summary: We propose a novel method to calibrate the extrinsic parameter for LiDAR and camera in road scenes.
Our method introduces line features from static straight-line-shaped objects such as road lanes and poles in both image and point cloud.
We conduct extensive experiments on KITTI and our in-house dataset; quantitative and qualitative results demonstrate the robustness and accuracy of our method.
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: For autonomous vehicles, an accurate calibration for LiDAR and camera is a
prerequisite for multi-sensor perception systems. However, existing calibration
techniques require either a complicated setting with various calibration
targets, or an initial calibration provided beforehand, which greatly impedes
their applicability in large-scale autonomous vehicle deployment. To tackle
these issues, we propose a novel method to calibrate the extrinsic parameter
for LiDAR and camera in road scenes. Our method introduces line features from
static straight-line-shaped objects such as road lanes and poles in both image
and point cloud and formulates the initial calibration of extrinsic parameters
as a perspective-3-lines (P3L) problem. Subsequently, a cost function defined
under the semantic constraints of the line features is designed to perform
refinement on the solved coarse calibration. The whole procedure is fully
automatic and user-friendly without the need to adjust environment settings or
provide an initial calibration. We conduct extensive experiments on KITTI and
our in-house dataset; quantitative and qualitative results demonstrate the
robustness and accuracy of our method.
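The refinement stage described in the abstract minimizes a cost defined under semantic constraints of line features. As an illustration only (the function names, pinhole projection model, and mask-based scoring below are assumptions, not the paper's actual formulation), such a line-feature cost could be sketched as:

```python
import numpy as np

def project_points(points_lidar, R, t, K):
    """Project 3D LiDAR points into the image using extrinsics (R, t) and intrinsics K."""
    pts_cam = points_lidar @ R.T + t           # LiDAR frame -> camera frame
    pts_cam = pts_cam[pts_cam[:, 2] > 0]       # keep points in front of the camera
    uv = pts_cam @ K.T
    return uv[:, :2] / uv[:, 2:3]              # perspective division -> pixel coords

def line_feature_cost(points_lidar, line_mask, R, t, K):
    """Fraction of projected LiDAR line points that miss the image's line-feature mask.

    Lower is better: a well-calibrated extrinsic maps LiDAR points on lanes
    and poles onto pixels labeled as line features (line_mask == 1).
    """
    uv = np.round(project_points(points_lidar, R, t, K)).astype(int)
    h, w = line_mask.shape
    inside = (uv[:, 0] >= 0) & (uv[:, 0] < w) & (uv[:, 1] >= 0) & (uv[:, 1] < h)
    uv = uv[inside]
    if len(uv) == 0:
        return 1.0                             # nothing projects into view: worst cost
    hits = line_mask[uv[:, 1], uv[:, 0]]       # mask lookup at (row=v, col=u)
    return 1.0 - float(hits.mean())
```

In practice the coarse P3L solution would seed a local search (e.g. a generic optimizer over the 6-DoF extrinsics) that minimizes a cost of this general shape.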
Related papers
- Kalib: Markerless Hand-Eye Calibration with Keypoint Tracking [52.4190876409222]
Hand-eye calibration involves estimating the transformation between the camera and the robot.
Recent advancements in deep learning offer markerless techniques, but they present challenges.
We propose Kalib, an automatic and universal markerless hand-eye calibration pipeline.
arXiv Detail & Related papers (2024-08-20T06:03:40Z) - YOCO: You Only Calibrate Once for Accurate Extrinsic Parameter in LiDAR-Camera Systems [0.5999777817331317]
In a multi-sensor fusion system composed of cameras and LiDAR, precise extrinsic calibration contributes to the system's long-term stability and accurate perception of the environment.
This paper proposes a novel fully automatic extrinsic calibration method for LiDAR-camera systems that circumvents the need for corresponding point registration.
arXiv Detail & Related papers (2024-07-25T13:44:49Z) - Galibr: Targetless LiDAR-Camera Extrinsic Calibration Method via Ground Plane Initialization [13.409482818102878]
Galibr is a fully automatic LiDAR-camera extrinsic calibration tool designed for ground vehicle platforms in any natural setting.
The method utilizes the ground planes and edge information from both LiDAR and camera inputs, streamlining the calibration process.
Our approach significantly enhances calibration performance, primarily attributed to our novel initial pose estimation method.
arXiv Detail & Related papers (2024-06-14T08:25:10Z) - EdgeCalib: Multi-Frame Weighted Edge Features for Automatic Targetless
LiDAR-Camera Calibration [15.057994140880373]
We introduce an edge-based approach for automatic online calibration of LiDAR and cameras in real-world scenarios.
The edge features, which are prevalent in various environments, are aligned in both images and point clouds to determine the extrinsic parameters.
The results show state-of-the-art rotation accuracy of 0.086° and translation accuracy of 0.977 cm, outperforming existing edge-based calibration methods in both precision and robustness.
arXiv Detail & Related papers (2023-10-25T13:27:56Z) - Continuous Online Extrinsic Calibration of Fisheye Camera and LiDAR [7.906477322731106]
An accurate extrinsic calibration is required to fuse the camera and LiDAR data into the common spatial reference frame used by high-level perception functions.
There is a need for continuous online extrinsic calibration algorithms which can automatically update the value of the camera-LiDAR calibration during the life of the vehicle using only sensor data.
We propose using the mutual information between the camera image's depth estimate, provided by commonly available monocular depth estimation networks, and the LiDAR point cloud's geometric distance as an optimization metric for extrinsic calibration.
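The mutual-information metric mentioned above can be estimated from paired samples with a joint histogram. A minimal sketch (the histogram estimator and function name are illustrative assumptions, not this paper's implementation):

```python
import numpy as np

def mutual_information(depth_vals, lidar_range_vals, bins=32):
    """Histogram estimate of mutual information between per-pixel monocular
    depth estimates and LiDAR ranges projected to the same pixels.

    Higher MI indicates better statistical alignment, so an extrinsic
    calibration can be refined by maximizing this score.
    """
    joint, _, _ = np.histogram2d(depth_vals, lidar_range_vals, bins=bins)
    pxy = joint / joint.sum()                  # joint probability table
    px = pxy.sum(axis=1, keepdims=True)        # marginal over depth bins
    py = pxy.sum(axis=0, keepdims=True)        # marginal over range bins
    nz = pxy > 0                               # avoid log(0)
    return float((pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])).sum())
```

For continuous online calibration, the candidate extrinsics that maximize this score over recent frames would be taken as the updated camera-LiDAR transform.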
arXiv Detail & Related papers (2023-06-22T23:16:31Z) - Automated Static Camera Calibration with Intelligent Vehicles [58.908194559319405]
We present a robust method for automated geo-referenced camera calibration.
Our method requires a calibration vehicle equipped with a combined GNSS/RTK receiver and an inertial measurement unit (IMU) for self-localization.
Our method does not require any human interaction with the information recorded by both the infrastructure and the vehicle.
arXiv Detail & Related papers (2023-04-21T08:50:52Z) - CROON: Automatic Multi-LiDAR Calibration and Refinement Method in Road
Scene [15.054452813705112]
CROON (automatiC multi-LiDAR CalibratiOn and Refinement method in rOad sceNe) is a two-stage method including rough and refinement calibration.
Results on real-world and simulated data sets demonstrate the reliability and accuracy of our method.
arXiv Detail & Related papers (2022-03-07T07:36:31Z) - How to Calibrate Your Event Camera [58.80418612800161]
We propose a generic event camera calibration framework using image reconstruction.
We show that neural-network-based image reconstruction is well suited for the task of intrinsic and extrinsic calibration of event cameras.
arXiv Detail & Related papers (2021-05-26T07:06:58Z) - Automatic Extrinsic Calibration Method for LiDAR and Camera Sensor
Setups [68.8204255655161]
We present a method to calibrate the parameters of any pair of sensors involving LiDARs, monocular or stereo cameras.
The proposed approach can handle devices with very different resolutions and poses, as usually found in vehicle setups.
arXiv Detail & Related papers (2021-01-12T12:02:26Z) - Accurate Alignment Inspection System for Low-resolution Automotive and
Mobility LiDAR [125.41260574344933]
An accurate inspection system is proposed for estimating a LiDAR alignment error after sensor attachment on a mobility system such as a vehicle or robot.
The proposed method uses only a single target board at the fixed position to estimate the three orientations (roll, tilt, and yaw) and the horizontal position of the LiDAR attachment with sub-degree and millimeter level accuracy.
arXiv Detail & Related papers (2020-08-24T17:47:59Z) - Automatic LiDAR Extrinsic Calibration System using Photodetector and
Planar Board for Large-scale Applications [110.32028864986918]
This study proposes a new concept of a target board with embedded photodetector arrays, named the PD-target system, to find the precise position of the corresponding laser beams on the target surface.
The experimental evaluation of the proposed system on low-resolution LiDAR showed that the LiDAR offset pose can be estimated within 0.1 degree and 3 mm levels of precision.
arXiv Detail & Related papers (2020-08-24T16:28:40Z)
This list is automatically generated from the titles and abstracts of the papers in this site.