Online Camera-to-ground Calibration for Autonomous Driving
- URL: http://arxiv.org/abs/2303.17137v2
- Date: Fri, 31 Mar 2023 05:44:48 GMT
- Title: Online Camera-to-ground Calibration for Autonomous Driving
- Authors: Binbin Li, Xinyu Du, Yao Hu, Hao Yu, Wende Zhang
- Abstract summary: We propose an online monocular camera-to-ground calibration solution that does not utilize any specific targets while driving.
We provide metrics to quantify calibration performance and stopping criteria to report/broadcast satisfactory calibration results.
- Score: 26.357898919134833
- License: http://creativecommons.org/licenses/by-nc-nd/4.0/
- Abstract: Online camera-to-ground calibration generates a non-rigid
transformation between the camera and the road surface in real time.
Existing solutions rely on static calibration and therefore suffer from
environmental variations such as tire pressure changes, vehicle load
changes, and road surface diversity. Other online solutions exploit road
elements or photometric consistency between overlapping views across
images, and thus require continuous detection of specific targets on the
road or the assistance of multiple cameras. In our work, we propose an
online monocular camera-to-ground calibration solution that does not
utilize any specific targets while driving. We take a coarse-to-fine
approach to ground feature extraction aided by wheel odometry, and
estimate the camera-to-ground calibration parameters through
sliding-window-based factor graph optimization. Because the
camera-to-ground transformation is non-rigid while driving, we provide
metrics to quantify calibration performance and stopping criteria to
report/broadcast satisfactory calibration results. Extensive experiments
using real-world data demonstrate that our algorithm is effective and
outperforms state-of-the-art techniques.
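To make the geometry concrete, here is a minimal sketch (ours, not the authors' code; all names are illustrative): fit a ground plane to 3-D ground features expressed in the camera frame, convert the plane into camera height, roll, and pitch, and smooth the per-frame estimates over a sliding window. The paper's actual estimator replaces the naive averaging below with sliding-window factor graph optimization fed by wheel odometry.

```python
from collections import deque
import numpy as np

def fit_ground_plane(points_cam):
    """Least-squares plane through 3-D ground features in the camera frame."""
    centroid = points_cam.mean(axis=0)
    # The right-singular vector with the smallest singular value is the normal.
    _, _, vt = np.linalg.svd(points_cam - centroid)
    normal = vt[-1]
    if normal[1] < 0:              # point the normal toward the ground (+y down)
        normal = -normal
    d = float(normal @ centroid)   # signed camera-origin-to-plane distance
    return normal, d

def plane_to_calibration(normal, d):
    """Convert the plane into camera height and roll/pitch w.r.t. the ground."""
    height = abs(d)
    roll = np.arctan2(normal[0], normal[1])
    pitch = np.arctan2(normal[2], normal[1])
    return height, roll, pitch

class SlidingWindowEstimator:
    """Small stand-in for the paper's sliding-window factor-graph optimization:
    per-frame estimates are simply averaged over the last `window` frames."""
    def __init__(self, window=20):
        self.buf = deque(maxlen=window)

    def update(self, points_cam):
        self.buf.append(plane_to_calibration(*fit_ground_plane(points_cam)))
        return np.asarray(self.buf).mean(axis=0)   # (height, roll, pitch)
```

The averaging is only a placeholder; a factor graph would additionally weigh feature quality and odometry uncertainty across the window.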
Related papers
- Automated Automotive Radar Calibration With Intelligent Vehicles [73.15674960230625]
We present an approach for automated and geo-referenced calibration of automotive radar sensors.
Our method does not require any external modification of the vehicle and instead uses location data obtained from automated vehicles.
Our evaluation on data from a real testing site shows that our method can correctly calibrate infrastructure sensors in an automated manner.
arXiv Detail & Related papers (2023-06-23T07:01:10Z)
- Continuous Online Extrinsic Calibration of Fisheye Camera and LiDAR [7.906477322731106]
An accurate extrinsic calibration is needed to fuse camera and LiDAR data into the common spatial reference frame required by high-level perception functions.
There is a need for continuous online extrinsic calibration algorithms which can automatically update the value of the camera-LiDAR calibration during the life of the vehicle using only sensor data.
We propose using the mutual information between the camera image's depth estimate, provided by commonly available monocular depth estimation networks, and the LiDAR point cloud's geometric distance as an optimization metric for extrinsic calibration.
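A hedged sketch of that objective: project LiDAR points into the image with a candidate extrinsic, sample the monocular depth map at the projected pixels, and score the candidate by the mutual information between the two depth signals. The histogram-based MI estimator and all names are illustrative assumptions, not the paper's exact implementation.

```python
import numpy as np

def mutual_information(a, b, bins=32):
    """Histogram estimate of the mutual information between two 1-D signals."""
    joint, _, _ = np.histogram2d(a, b, bins=bins)
    pxy = joint / joint.sum()
    px, py = pxy.sum(axis=1), pxy.sum(axis=0)
    nz = pxy > 0
    return float((pxy[nz] * np.log(pxy[nz] / np.outer(px, py)[nz])).sum())

def calibration_score(T_cam_lidar, K, points_lidar, depth_map):
    """Higher is better: MI between LiDAR range and sampled monocular depth.
    T_cam_lidar is a 4x4 homogeneous extrinsic guess, K the 3x3 intrinsics."""
    pts_h = np.hstack([points_lidar, np.ones((len(points_lidar), 1))])
    pts_cam = (T_cam_lidar @ pts_h.T).T[:, :3]
    front = pts_cam[:, 2] > 0.1                 # keep points ahead of the camera
    uv = (K @ pts_cam[front].T).T
    uv = (uv[:, :2] / uv[:, 2:3]).astype(int)
    h, w = depth_map.shape
    ok = (uv[:, 0] >= 0) & (uv[:, 0] < w) & (uv[:, 1] >= 0) & (uv[:, 1] < h)
    lidar_range = np.linalg.norm(pts_cam[front][ok], axis=1)
    mono_depth = depth_map[uv[ok, 1], uv[ok, 0]]
    return mutual_information(lidar_range, mono_depth)
```

One then searches or optimizes over the six extrinsic parameters for the pose that maximizes this score.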
arXiv Detail & Related papers (2023-06-22T23:16:31Z)
- E-Calib: A Fast, Robust and Accurate Calibration Toolbox for Event Cameras [34.71767308204867]
We present E-Calib, a novel, fast, robust, and accurate calibration toolbox for event cameras.
The proposed method is tested in a variety of rigorous experiments for different event camera models.
arXiv Detail & Related papers (2023-06-15T12:16:38Z)
- EasyHeC: Accurate and Automatic Hand-eye Calibration via Differentiable Rendering and Space Exploration [49.90228618894857]
We introduce a new approach to hand-eye calibration called EasyHeC, which is markerless, white-box, and delivers superior accuracy and robustness.
We propose to use two key technologies: differentiable rendering-based camera pose optimization and consistency-based joint space exploration.
Our evaluation demonstrates superior performance on synthetic and real-world datasets.
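The toy example below (assumed, not EasyHeC's code) shows the core mechanism behind differentiable-rendering pose optimization: keep the image-formation step differentiable in the camera pose and descend on a pixel-space loss. Here the "renderer" is reduced to a bare pinhole projection of known 3-D points.

```python
import torch

def project(points_3d, rvec, tvec, f=500.0, c=320.0):
    """Differentiable pinhole projection with a Rodrigues-style rotation."""
    theta = torch.sqrt((rvec ** 2).sum() + 1e-12)   # safe gradient at rvec == 0
    k = rvec / theta
    zero = torch.zeros_like(k[0])
    skew = torch.stack([
        torch.stack([zero, -k[2], k[1]]),
        torch.stack([k[2], zero, -k[0]]),
        torch.stack([-k[1], k[0], zero]),
    ])
    R = torch.eye(3) + torch.sin(theta) * skew \
        + (1 - torch.cos(theta)) * (skew @ skew)
    p = points_3d @ R.T + tvec
    return f * p[:, :2] / p[:, 2:3] + c             # pixel coordinates

# A known pose produces "observed" pixels; gradient descent recovers it.
pts = torch.randn(50, 3) + torch.tensor([0.0, 0.0, 5.0])
obs = project(pts, torch.tensor([0.1, -0.2, 0.05]),
              torch.tensor([0.3, -0.1, 0.2]))

rvec = torch.zeros(3, requires_grad=True)
tvec = torch.zeros(3, requires_grad=True)
opt = torch.optim.Adam([rvec, tvec], lr=1e-2)
for _ in range(2000):                               # loss should approach zero
    opt.zero_grad()
    loss = ((project(pts, rvec, tvec) - obs) ** 2).mean()
    loss.backward()
    opt.step()
```

EasyHeC itself renders full robot masks rather than points; the point is that any differentiable renderer slots into the same optimization loop.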
arXiv Detail & Related papers (2023-05-02T03:49:54Z)
- Automated Static Camera Calibration with Intelligent Vehicles [58.908194559319405]
We present a robust calibration method for automated geo-referenced camera calibration.
Our method requires a calibration vehicle equipped with a combined GNSS/RTK receiver and an inertial measurement unit (IMU) for self-localization.
Our method requires no human interaction and relies only on the information recorded by both the infrastructure and the vehicle.
arXiv Detail & Related papers (2023-04-21T08:50:52Z)
- Extrinsic Camera Calibration with Semantic Segmentation [60.330549990863624]
We present an extrinsic camera calibration approach that automates the parameter estimation by utilizing semantic segmentation information.
Our approach relies on a coarse initial measurement of the camera pose and builds on lidar sensors mounted on a vehicle.
We evaluate our method on simulated and real-world data, demonstrating low errors in the calibration results.
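One way to picture a segmentation-based objective, as a sketch under assumed names: project semantically labeled LiDAR points into the camera image and count how many land on pixels of the same class; the extrinsic that maximizes this agreement is taken as the calibration.

```python
import numpy as np

def semantic_agreement(T_cam_lidar, K, pts_lidar, pt_labels, seg_image):
    """Fraction of labeled LiDAR points whose projected pixel carries the
    same semantic class in the segmentation image (higher is better)."""
    pts_h = np.hstack([pts_lidar, np.ones((len(pts_lidar), 1))])
    pts_cam = (T_cam_lidar @ pts_h.T).T[:, :3]
    front = pts_cam[:, 2] > 0.1                 # keep points ahead of the camera
    uv = (K @ pts_cam[front].T).T
    uv = (uv[:, :2] / uv[:, 2:3]).astype(int)
    h, w = seg_image.shape
    ok = (uv[:, 0] >= 0) & (uv[:, 0] < w) & (uv[:, 1] >= 0) & (uv[:, 1] < h)
    hits = seg_image[uv[ok, 1], uv[ok, 0]] == pt_labels[front][ok]
    return float(hits.mean()) if hits.size else 0.0
```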
arXiv Detail & Related papers (2022-08-08T07:25:03Z)
- Self-Supervised Camera Self-Calibration from Video [34.35533943247917]
We propose a learning algorithm to regress per-sequence calibration parameters using an efficient family of general camera models.
Our procedure achieves self-calibration results with sub-pixel reprojection error, outperforming other learning-based methods.
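The unified camera model (UCM) is one such efficient general model family; the sketch below shows its projection function with a common (fx, fy, cx, cy, alpha) parameterization. Treat the choice of UCM and the names as assumptions for illustration, not as this paper's exact configuration.

```python
import numpy as np

def ucm_project(points_3d, fx, fy, cx, cy, alpha):
    """Unified camera model projection: a single alpha parameter spans
    pinhole (alpha = 0) through strongly distorted fisheye lenses."""
    x, y, z = points_3d.T
    d = np.sqrt(x**2 + y**2 + z**2)          # Euclidean distance per point
    denom = alpha * d + (1 - alpha) * z      # UCM's generalized depth
    return np.stack([fx * x / denom + cx, fy * y / denom + cy], axis=1)
```

Because the whole family is smooth in its five parameters, a network can regress them per sequence and be supervised end-to-end through this projection.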
arXiv Detail & Related papers (2021-12-06T19:42:05Z)
- How to Calibrate Your Event Camera [58.80418612800161]
We propose a generic event camera calibration framework using image reconstruction.
We show that neural-network-based image reconstruction is well suited for the task of intrinsic and extrinsic calibration of event cameras.
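A pipeline sketch of that idea: convert event windows into conventional frames with a reconstruction network, then reuse standard target-based calibration on the reconstructed frames. Here `reconstruct_frames` stands in for a neural reconstruction model (e.g., an E2VID-style network) and is assumed, as is the checkerboard target.

```python
import numpy as np
import cv2

def calibrate_from_events(event_windows, reconstruct_frames, board=(6, 9)):
    """Intrinsic calibration from events via reconstructed grayscale frames."""
    obj = np.zeros((board[0] * board[1], 3), np.float32)
    obj[:, :2] = np.mgrid[0:board[0], 0:board[1]].T.reshape(-1, 2)
    obj_pts, img_pts, size = [], [], None
    for frame in reconstruct_frames(event_windows):   # uint8 grayscale frames
        found, corners = cv2.findChessboardCorners(frame, board)
        if found:
            obj_pts.append(obj)
            img_pts.append(corners)
            size = frame.shape[::-1]
    # Standard target-based calibration, applied to the reconstructed frames.
    rms, K, dist, _, _ = cv2.calibrateCamera(obj_pts, img_pts, size, None, None)
    return rms, K, dist
```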
arXiv Detail & Related papers (2021-05-26T07:06:58Z)
- CRLF: Automatic Calibration and Refinement based on Line Feature for LiDAR and Camera in Road Scenes [16.201111055979453]
We propose a novel method to calibrate the extrinsic parameter for LiDAR and camera in road scenes.
Our method introduces line features from static straight-line-shaped objects such as road lanes and poles in both image and point cloud.
We conduct extensive experiments on KITTI and our in-house dataset; quantitative and qualitative results demonstrate the robustness and accuracy of our method.
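A sketch of what such a line-feature cost can look like (assumed names, not CRLF's implementation): project the LiDAR points belonging to straight-line objects into the image and measure their distance to the detected 2-D line mask via a distance transform; a good extrinsic drives this cost toward zero.

```python
import numpy as np
from scipy.ndimage import distance_transform_edt

def line_alignment_cost(T_cam_lidar, K, line_pts_lidar, line_mask):
    """Mean pixel distance between projected LiDAR line points and the
    nearest detected line pixel; line_mask is a boolean lane/pole image."""
    dist = distance_transform_edt(~line_mask)   # distance to nearest line pixel
    pts_h = np.hstack([line_pts_lidar, np.ones((len(line_pts_lidar), 1))])
    pts_cam = (T_cam_lidar @ pts_h.T).T[:, :3]
    front = pts_cam[:, 2] > 0.1
    uv = (K @ pts_cam[front].T).T
    uv = (uv[:, :2] / uv[:, 2:3]).astype(int)
    h, w = line_mask.shape
    ok = (uv[:, 0] >= 0) & (uv[:, 0] < w) & (uv[:, 1] >= 0) & (uv[:, 1] < h)
    return float(dist[uv[ok, 1], uv[ok, 0]].mean()) if ok.any() else np.inf
```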
arXiv Detail & Related papers (2021-03-08T06:02:44Z)
- Automatic Extrinsic Calibration Method for LiDAR and Camera Sensor Setups [68.8204255655161]
We present a method to calibrate the parameters of any pair of sensors involving LiDARs, monocular or stereo cameras.
The proposed approach can handle devices with very different resolutions and poses, as usually found in vehicle setups.
arXiv Detail & Related papers (2021-01-12T12:02:26Z)
- Superaccurate Camera Calibration via Inverse Rendering [0.19336815376402716]
We propose a new method for camera calibration using the principle of inverse rendering.
Instead of relying solely on detected feature points, we use an estimate of the internal parameters and the pose of the calibration object to implicitly render a non-photorealistic equivalent of the optical features.
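A minimal sketch of the inverse-rendering loop under assumed names: synthesize a non-photorealistic view of a planar checkerboard from the current intrinsics and pose via the plane-to-image homography, then score the parameters by photometric difference against the captured image rather than by detected corners alone.

```python
import numpy as np

def render_checkerboard(K, R, t, img_shape, square=0.03, n=8):
    """Non-photorealistic render: backproject every pixel onto the board
    plane (z = 0 in the board frame) and shade it by checkerboard parity."""
    h, w = img_shape
    H = K @ np.column_stack([R[:, 0], R[:, 1], t])   # plane-to-image homography
    Hinv = np.linalg.inv(H)
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    pix = np.stack([u, v, np.ones_like(u)], axis=-1).reshape(-1, 3).T
    bp = Hinv @ pix                                  # backprojected board coords
    xy = (bp[:2] / bp[2]).T.reshape(h, w, 2) / square
    inside = (xy >= 0).all(-1) & (xy < n).all(-1)    # within the n x n board
    shade = (np.floor(xy[..., 0]) + np.floor(xy[..., 1])) % 2
    return np.where(inside, shade, 0.5)

def photometric_loss(rendered, captured):
    """Score to minimize over the intrinsics and pose parameters."""
    return float(((rendered - captured) ** 2).mean())
```

Optimizing this loss uses the whole pattern, not just corner points, which is where the extra accuracy can come from.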
arXiv Detail & Related papers (2020-03-20T10:26:16Z)