Automatic LiDAR Extrinsic Calibration System using Photodetector and
Planar Board for Large-scale Applications
- URL: http://arxiv.org/abs/2008.10542v1
- Date: Mon, 24 Aug 2020 16:28:40 GMT
- Authors: Ji-Hwan You, Seon Taek Oh, Jae-Eun Park, Azim Eskandarian, and
Young-Keun Kim
- Abstract summary: This study proposes a new concept of a target board with embedded photodetector arrays, named the PD-target system, to find the precise positions of the corresponding laser beams on the target surface.
The experimental evaluation of the proposed system on low-resolution LiDAR showed that the LiDAR offset pose can be estimated within 0.1 degree and 3 mm levels of precision.
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: This paper presents a novel automatic calibration system to estimate the
extrinsic parameters of LiDAR mounted on a mobile platform for sensor
misalignment inspection in the large-scale production of highly automated
vehicles. To obtain subdegree and subcentimeter accuracy levels of extrinsic
calibration, this study proposes a new concept of a target board with embedded
photodetector arrays, named the PD-target system, to find the precise positions
of the corresponding laser beams on the target surface. Furthermore, the
proposed system requires only the simple design of the target board at the
fixed pose in a close range to be readily applicable in the automobile
manufacturing environment. The experimental evaluation of the proposed system
on low-resolution LiDAR showed that the LiDAR offset pose can be estimated
within 0.1 degree and 3 mm levels of precision. The high accuracy and
simplicity of the proposed calibration system make it practical for large-scale
applications for the reliability and safety of autonomous systems.
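The paper's full pipeline is not reproduced here, but its core estimation step — recovering a small rotation and translation offset from known target-point positions and their measured beam hits — can be sketched with the standard Kabsch/SVD rigid alignment. This is an illustrative sketch under that assumption, not the authors' exact algorithm; the detector layout and offset values below are hypothetical, chosen at the paper's reported precision scale (0.1 degree, 3 mm):

```python
import numpy as np

def estimate_rigid_offset(ref_pts, meas_pts):
    """Kabsch/SVD fit: rotation R and translation t minimizing ||R @ ref + t - meas||."""
    ref_c = ref_pts - ref_pts.mean(axis=0)
    meas_c = meas_pts - meas_pts.mean(axis=0)
    H = ref_c.T @ meas_c                      # 3x3 cross-covariance
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))    # guard against a reflection solution
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = meas_pts.mean(axis=0) - R @ ref_pts.mean(axis=0)
    return R, t

# Synthetic check: a 0.1-degree yaw offset and a 3 mm shift (noiseless data)
yaw = np.deg2rad(0.1)
R_true = np.array([[np.cos(yaw), -np.sin(yaw), 0.0],
                   [np.sin(yaw),  np.cos(yaw), 0.0],
                   [0.0,          0.0,         1.0]])
t_true = np.array([0.003, 0.0, 0.0])          # 3 mm along x, in meters
ref = np.random.default_rng(0).uniform(-0.5, 0.5, (16, 3))  # hypothetical detector positions
meas = ref @ R_true.T + t_true
R_est, t_est = estimate_rigid_offset(ref, meas)
```

On noiseless correspondences the offset is recovered exactly; in practice the photodetector readings bound how well `meas` localizes each beam, which is the accuracy bottleneck the PD-target design targets.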
Related papers
- YOCO: You Only Calibrate Once for Accurate Extrinsic Parameter in LiDAR-Camera Systems [0.5999777817331317]
In a multi-sensor fusion system composed of cameras and LiDAR, precise extrinsic calibration contributes to the system's long-term stability and accurate perception of the environment.
This paper proposes a novel fully automatic extrinsic calibration method for LiDAR-camera systems that circumvents the need for corresponding point registration.
arXiv Detail & Related papers (2024-07-25T13:44:49Z)
- Galibr: Targetless LiDAR-Camera Extrinsic Calibration Method via Ground Plane Initialization [13.409482818102878]
Galibr is a fully automatic LiDAR-camera extrinsic calibration tool designed for ground vehicle platforms in any natural setting.
The method utilizes the ground planes and edge information from both LiDAR and camera inputs, streamlining the calibration process.
Our approach significantly enhances calibration performance, primarily attributed to our novel initial pose estimation method.
arXiv Detail & Related papers (2024-06-14T08:25:10Z)
- Automated Automotive Radar Calibration With Intelligent Vehicles [73.15674960230625]
We present an approach for automated and geo-referenced calibration of automotive radar sensors.
Our method does not require external modifications of a vehicle and instead uses the location data obtained from automated vehicles.
Our evaluation on data from a real testing site shows that our method can correctly calibrate infrastructure sensors in an automated manner.
arXiv Detail & Related papers (2023-06-23T07:01:10Z)
- Support Vector Machine for Determining Euler Angles in an Inertial Navigation System [55.41644538483948]
The paper discusses the improvement of the accuracy of an inertial navigation system created on the basis of MEMS sensors using machine learning (ML) methods.
The proposed ML-based algorithm has demonstrated its ability to classify correctly in the presence of noise typical of MEMS sensors.
arXiv Detail & Related papers (2022-12-07T10:01:11Z)
- Visual-tactile sensing for Real-time liquid Volume Estimation in Grasping [58.50342759993186]
We propose a visuo-tactile model for real-time estimation of the liquid volume inside a deformable container.
We fuse two sensory modalities, i.e., the raw visual inputs from the RGB camera and the tactile cues from our specific tactile sensor.
The robotic system is well controlled and adjusted based on the estimation model in real time.
arXiv Detail & Related papers (2022-02-23T13:38:31Z)
- CRLF: Automatic Calibration and Refinement based on Line Feature for LiDAR and Camera in Road Scenes [16.201111055979453]
We propose a novel method to calibrate the extrinsic parameters for LiDAR and camera in road scenes.
Our method introduces line features from static straight-line-shaped objects such as road lanes and poles in both image and point cloud.
We conduct extensive experiments on KITTI and our in-house dataset; quantitative and qualitative results demonstrate the robustness and accuracy of our method.
arXiv Detail & Related papers (2021-03-08T06:02:44Z)
- Automatic Extrinsic Calibration Method for LiDAR and Camera Sensor Setups [68.8204255655161]
We present a method to calibrate the parameters of any pair of sensors involving LiDARs, monocular or stereo cameras.
The proposed approach can handle devices with very different resolutions and poses, as usually found in vehicle setups.
arXiv Detail & Related papers (2021-01-12T12:02:26Z)
- ACSC: Automatic Calibration for Non-repetitive Scanning Solid-State LiDAR and Camera Systems [11.787271829250805]
Solid-State LiDAR (SSL) enables low-cost and efficient acquisition of 3D point clouds from the environment.
We propose a fully automatic calibration method for the non-repetitive scanning SSL and camera systems.
We evaluate the proposed method on different types of LiDAR and camera sensor combinations in real conditions.
arXiv Detail & Related papers (2020-11-17T09:11:28Z)
- Accurate Alignment Inspection System for Low-resolution Automotive and Mobility LiDAR [125.41260574344933]
An accurate inspection system is proposed for estimating a LiDAR alignment error after sensor attachment on a mobility system such as a vehicle or robot.
The proposed method uses only a single target board at the fixed position to estimate the three orientations (roll, tilt, and yaw) and the horizontal position of the LiDAR attachment with sub-degree and millimeter level accuracy.
arXiv Detail & Related papers (2020-08-24T17:47:59Z)
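The alignment-inspection entry above reports three orientation angles (roll, tilt, and yaw) of the LiDAR attachment. A common way to read such angles off an estimated 3x3 rotation matrix is the ZYX Euler decomposition — an assumption here, since the paper may use a different parameterization:

```python
import numpy as np

def rotation_to_rpy(R):
    """Roll/pitch/yaw (ZYX convention) in degrees from a 3x3 rotation matrix."""
    pitch = np.arcsin(-R[2, 0])
    roll = np.arctan2(R[2, 1], R[2, 2])
    yaw = np.arctan2(R[1, 0], R[0, 0])
    return np.degrees([roll, pitch, yaw])

# Round-trip check with a small hypothetical misalignment (degrees)
r, p, y = np.radians([0.05, -0.08, 0.1])
Rz = np.array([[np.cos(y), -np.sin(y), 0], [np.sin(y), np.cos(y), 0], [0, 0, 1]])
Ry = np.array([[np.cos(p), 0, np.sin(p)], [0, 1, 0], [-np.sin(p), 0, np.cos(p)]])
Rx = np.array([[1, 0, 0], [0, np.cos(r), -np.sin(r)], [0, np.sin(r), np.cos(r)]])
rpy = rotation_to_rpy(Rz @ Ry @ Rx)
```

The `arcsin` form is well conditioned for the small misalignments an inspection system measures; near a pitch of 90 degrees the decomposition degenerates, which does not arise in this application.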
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the information it contains and is not responsible for any consequences of its use.