Robust LiDAR-Camera Calibration with 2D Gaussian Splatting
- URL: http://arxiv.org/abs/2504.00525v1
- Date: Tue, 01 Apr 2025 08:19:26 GMT
- Title: Robust LiDAR-Camera Calibration with 2D Gaussian Splatting
- Authors: Shuyi Zhou, Shuxiang Xie, Ryoichi Ishikawa, Takeshi Oishi
- Abstract summary: A critical and initial step in integrating LiDAR and camera data is the calibration of the LiDAR-camera system. Most existing calibration methods rely on auxiliary target objects, which often involve complex manual operations. We propose a calibration method that estimates LiDAR-camera extrinsic parameters using geometric constraints.
- Score: 0.3281128493853064
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: LiDAR-camera systems have become increasingly popular in robotics recently. A critical and initial step in integrating the LiDAR and camera data is the calibration of the LiDAR-camera system. Most existing calibration methods rely on auxiliary target objects, which often involve complex manual operations, whereas targetless methods have yet to achieve practical effectiveness. Recognizing that 2D Gaussian Splatting (2DGS) can reconstruct geometric information from camera image sequences, we propose a calibration method that estimates LiDAR-camera extrinsic parameters using geometric constraints. The proposed method begins by reconstructing colorless 2DGS using LiDAR point clouds. Subsequently, we update the colors of the Gaussian splats by minimizing the photometric loss. The extrinsic parameters are optimized during this process. Additionally, we address the limitations of the photometric loss by incorporating the reprojection and triangulation losses, thereby enhancing the calibration robustness and accuracy.
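At its core, the described pipeline refines the extrinsic parameters by descending on a photometric objective between rendered and observed images. The following toy sketch is a hypothetical 1-D analogue of that idea, not the paper's 2DGS implementation: the "extrinsic" is reduced to a single shift `t`, and the "renderer" is a sine signal.

```python
import math

def render(t, xs):
    """Rendered intensity at positions xs under candidate shift t."""
    return [math.sin(x - t) for x in xs]

def photometric_loss(t, xs, observed):
    """Sum of squared intensity differences (the photometric loss)."""
    return sum((r - o) ** 2 for r, o in zip(render(t, xs), observed))

def calibrate(xs, observed, t0=0.0, lr=0.001, steps=300, eps=1e-4):
    """Refine t by finite-difference gradient descent on the loss."""
    t = t0
    for _ in range(steps):
        g = (photometric_loss(t + eps, xs, observed)
             - photometric_loss(t - eps, xs, observed)) / (2 * eps)
        t -= lr * g
    return t

xs = [i * 0.1 for i in range(100)]
true_t = 0.3                    # ground-truth "extrinsic"
observed = render(true_t, xs)   # what the camera sees
t_hat = calibrate(xs, observed)
```

In the actual method the residual is computed over rendered Gaussian splats and the optimization updates a full 6-DoF transform, with reprojection and triangulation terms added to stabilize the photometric one.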
Related papers
- Targetless LiDAR-Camera Calibration with Anchored 3D Gaussians [21.057702337896995]
We present a targetless LiDAR-camera calibration method that jointly optimizes sensor poses and scene geometry from arbitrary scenes.
We validate our method through extensive experiments on two real-world autonomous driving datasets.
arXiv Detail & Related papers (2025-04-06T20:00:01Z) - DF-Calib: Targetless LiDAR-Camera Calibration via Depth Flow [30.56092814783138]
DF-Calib is a LiDAR-camera calibration method that reformulates calibration as an intra-modality depth flow estimation problem.
DF-Calib estimates a dense depth map from the camera image and completes the sparse LiDAR projected depth map.
We introduce a reliability map to prioritize valid pixels and propose a perceptually weighted sparse flow loss to enhance depth flow estimation.
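A perceptually weighted sparse loss of the kind this summary describes can be sketched as follows; the function name, per-pixel weighting scheme, and L1 residual are assumptions for illustration, not DF-Calib's exact formulation.

```python
def weighted_sparse_loss(pred, target, reliability):
    """Average residual, weighted by a per-pixel reliability map so that
    invalid or noisy LiDAR projections contribute less to the loss."""
    num = sum(r * abs(p - t) for p, t, r in zip(pred, target, reliability))
    den = sum(reliability) or 1.0
    return num / den

pred = [1.0, 2.0, 3.0, 4.0]
target = [1.1, 2.0, 2.0, 4.0]
rel = [1.0, 1.0, 0.0, 1.0]   # third pixel marked unreliable -> ignored
loss = weighted_sparse_loss(pred, target, rel)
```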
arXiv Detail & Related papers (2025-04-02T07:09:44Z) - CalibRefine: Deep Learning-Based Online Automatic Targetless LiDAR-Camera Calibration with Iterative and Attention-Driven Post-Refinement [5.069968819561576]
CalibRefine is a fully automatic, targetless, and online calibration framework.
We show that CalibRefine delivers high-precision calibration results with minimal human involvement.
Our findings highlight how robust object-level feature matching, together with iterative and self-supervised attention-based adjustments, enables consistent sensor fusion in complex, real-world conditions.
arXiv Detail & Related papers (2025-02-24T20:53:42Z) - What Really Matters for Learning-based LiDAR-Camera Calibration [50.2608502974106]
This paper revisits the development of learning-based LiDAR-Camera calibration.
We identify the critical limitations of regression-based methods with the widely used data generation pipeline.
We also investigate how the input data format and preprocessing operations impact network performance.
arXiv Detail & Related papers (2025-01-28T14:12:32Z) - LiDAR-GS: Real-time LiDAR Re-Simulation using Gaussian Splatting [50.808933338389686]
We present LiDAR-GS, a real-time, high-fidelity re-simulation of LiDAR scans in public urban road scenes. The method achieves state-of-the-art results in both rendering frame rate and quality on publicly available large scene datasets.
arXiv Detail & Related papers (2024-10-07T15:07:56Z) - YOCO: You Only Calibrate Once for Accurate Extrinsic Parameter in LiDAR-Camera Systems [0.5999777817331317]
In a multi-sensor fusion system composed of cameras and LiDAR, precise extrinsic calibration contributes to the system's long-term stability and accurate perception of the environment.
This paper proposes a novel fully automatic extrinsic calibration method for LiDAR-camera systems that circumvents the need for corresponding point registration.
arXiv Detail & Related papers (2024-07-25T13:44:49Z) - P2O-Calib: Camera-LiDAR Calibration Using Point-Pair Spatial Occlusion Relationship [1.6921147361216515]
We propose a novel target-less calibration approach based on 2D-3D edge point extraction using the occlusion relationship in 3D space.
Our method achieves low error and high robustness that can contribute to the practical applications relying on high-quality Camera-LiDAR calibration.
arXiv Detail & Related papers (2023-11-04T14:32:55Z) - RGB-based Category-level Object Pose Estimation via Decoupled Metric Scale Recovery [72.13154206106259]
We propose a novel pipeline that decouples the 6D pose and size estimation to mitigate the influence of imperfect scales on rigid transformations.
Specifically, we leverage a pre-trained monocular estimator to extract local geometric information.
A separate branch is designed to directly recover the metric scale of the object based on category-level statistics.
arXiv Detail & Related papers (2023-09-19T02:20:26Z) - Continuous Online Extrinsic Calibration of Fisheye Camera and LiDAR [7.906477322731106]
An accurate extrinsic calibration is required to fuse the camera and LiDAR data into a common spatial reference frame required by high-level perception functions.
There is a need for continuous online extrinsic calibration algorithms which can automatically update the value of the camera-LiDAR calibration during the life of the vehicle using only sensor data.
We propose using mutual information between the camera image's depth estimate, provided by commonly available monocular depth estimation networks, and the LiDAR point cloud's geometric distance as an optimization metric for extrinsic calibration.
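The mutual-information objective described above can be sketched with a simple histogram estimator; the binning scheme and toy data below are assumptions for illustration. When the extrinsics are correct, projected LiDAR ranges and monocular depths carry the same signal and their MI is high; a misaligned projection scrambles the pairing and MI drops.

```python
import math
import random
from collections import Counter

def mutual_information(a, b, bins=8):
    """MI (in nats) of two equal-length sequences, via a joint histogram."""
    def binned(v):
        lo, hi = min(v), max(v)
        w = (hi - lo) / bins or 1.0
        return [min(int((x - lo) / w), bins - 1) for x in v]
    xa, xb = binned(a), binned(b)
    n = len(a)
    pa, pb, pab = Counter(xa), Counter(xb), Counter(zip(xa, xb))
    return sum((c / n) * math.log((c / n) / ((pa[i] / n) * (pb[j] / n)))
               for (i, j), c in pab.items())

depth = [1.0 + 0.05 * i for i in range(64)]   # monocular depth estimates
aligned = [2.0 * d for d in depth]            # well-aligned LiDAR ranges
misaligned = aligned[:]
random.seed(0)
random.shuffle(misaligned)                    # bad extrinsics scramble pairs

mi_good = mutual_information(depth, aligned)
mi_bad = mutual_information(depth, misaligned)
```

An extrinsic calibrator built on this metric would search over candidate transforms for the one that maximizes `mi_good`-style dependence.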
arXiv Detail & Related papers (2023-06-22T23:16:31Z) - Improving Extrinsics between RADAR and LIDAR using Learning [18.211513930388417]
This paper presents a novel solution for 3D RADAR-LIDAR calibration in autonomous systems.
The method employs simple targets to generate data, including correspondence registration and a one-step optimization algorithm.
The proposed approach is implemented in a deep learning framework such as PyTorch and is optimized through gradient descent.
arXiv Detail & Related papers (2023-05-17T22:04:29Z) - Detecting Rotated Objects as Gaussian Distributions and Its 3-D Generalization [81.29406957201458]
Existing detection methods commonly use a parameterized bounding box (BBox) to model and detect (horizontal) objects.
We argue that such a mechanism has fundamental limitations in building an effective regression loss for rotation detection.
We propose to model the rotated objects as Gaussian distributions.
We extend our approach from 2-D to 3-D with a tailored algorithm design to handle the heading estimation.
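The "rotated box as Gaussian" modeling idea can be sketched as below; the exact parameterization is an assumption based on the abstract (mean at the box center, covariance built from the rotation and the squared half-extents), not necessarily the paper's definition.

```python
import math

def box_to_gaussian(cx, cy, w, h, theta):
    """Map a rotated box (cx, cy, w, h, theta) to a 2-D Gaussian:
    mean = center, covariance = R diag((w/2)^2, (h/2)^2) R^T."""
    c, s = math.cos(theta), math.sin(theta)
    a, b = (w / 2) ** 2, (h / 2) ** 2
    cov = [[c * c * a + s * s * b, c * s * (a - b)],
           [c * s * (a - b), s * s * a + c * c * b]]
    return (cx, cy), cov

# A 4x2 box rotated 90 degrees: the covariance axes swap accordingly.
mu, cov = box_to_gaussian(0.0, 0.0, 4.0, 2.0, math.pi / 2)
```

A distance between two such Gaussians (rather than between raw box parameters) then serves as a rotation-aware regression loss.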
arXiv Detail & Related papers (2022-09-22T07:50:48Z) - TEScalib: Targetless Extrinsic Self-Calibration of LiDAR and Stereo Camera for Automated Driving Vehicles with Uncertainty Analysis [4.616329048951671]
TEScalib is a novel extrinsic self-calibration approach for LiDAR and stereo cameras.
It uses the geometric and photometric information of surrounding environments, without any calibration targets, for automated driving vehicles.
Our approach, evaluated on the KITTI dataset, achieves very promising results.
arXiv Detail & Related papers (2022-02-28T15:04:00Z) - How to Calibrate Your Event Camera [58.80418612800161]
We propose a generic event camera calibration framework using image reconstruction.
We show that neural-network-based image reconstruction is well suited for the task of intrinsic and extrinsic calibration of event cameras.
arXiv Detail & Related papers (2021-05-26T07:06:58Z) - Automatic Extrinsic Calibration Method for LiDAR and Camera Sensor Setups [68.8204255655161]
We present a method to calibrate the parameters of any pair of sensors involving LiDARs, monocular or stereo cameras.
The proposed approach can handle devices with very different resolutions and poses, as usually found in vehicle setups.
arXiv Detail & Related papers (2021-01-12T12:02:26Z) - SelfVoxeLO: Self-supervised LiDAR Odometry with Voxel-based Deep Neural Networks [81.64530401885476]
We propose a self-supervised LiDAR odometry method, dubbed SelfVoxeLO, to tackle these two difficulties.
Specifically, we propose a 3D convolution network to process the raw LiDAR data directly, which extracts features that better encode the 3D geometric patterns.
We evaluate our method's performance on two large-scale datasets, i.e., KITTI and Apollo-SouthBay.
arXiv Detail & Related papers (2020-10-19T09:23:39Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.