Targetless LiDAR-Camera Calibration with Anchored 3D Gaussians
- URL: http://arxiv.org/abs/2504.04597v1
- Date: Sun, 06 Apr 2025 20:00:01 GMT
- Title: Targetless LiDAR-Camera Calibration with Anchored 3D Gaussians
- Authors: Haebeom Jung, Namtae Kim, Jungwoo Kim, Jaesik Park
- Abstract summary: We present a targetless LiDAR-camera calibration method that jointly optimizes sensor poses and scene geometry from arbitrary scenes. We validate our method through extensive experiments on two real-world autonomous driving datasets.
- Score: 21.057702337896995
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: We present a targetless LiDAR-camera calibration method that jointly optimizes sensor poses and scene geometry from arbitrary scenes, without relying on traditional calibration targets such as checkerboards or spherical reflectors. Our approach leverages a 3D Gaussian-based scene representation. We first freeze reliable LiDAR points as anchors, then jointly optimize the poses and auxiliary Gaussian parameters in a fully differentiable manner using a photometric loss. This joint optimization significantly reduces sensor misalignment, resulting in higher rendering quality and consistently improved PSNR compared to the carefully calibrated poses provided in popular datasets. We validate our method through extensive experiments on two real-world autonomous driving datasets, KITTI-360 and Waymo, each featuring distinct sensor configurations. Additionally, we demonstrate the robustness of our approach using a custom LiDAR-camera setup, confirming strong performance across diverse hardware configurations.
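The joint optimization the abstract describes can be sketched in miniature: freeze anchor points, render them differentiably, and descend a photometric loss with respect to the camera extrinsics. The toy below is a heavily simplified stand-in, not the paper's implementation: isotropic 2D splats instead of full 3D Gaussians, translation-only extrinsics, numeric gradients in place of autodiff, and an arbitrary synthetic scene, focal length, and image size.

```python
import numpy as np

# Toy sketch of photometric extrinsic refinement with frozen anchors.
# All constants below (focal length, splat size, scene bounds) are
# illustrative assumptions, not values from the paper.

rng = np.random.default_rng(0)
H = W = 32
f = 40.0        # assumed pinhole focal length in pixels
sigma = 1.5     # screen-space Gaussian footprint

# Anchors: fixed 3D positions and per-point intensities (kept frozen,
# standing in for reliable LiDAR returns).
pts = rng.uniform([-1.0, -1.0, 3.0], [1.0, 1.0, 6.0], size=(150, 3))
cols = rng.uniform(0.2, 1.0, size=150)

yy, xx = np.mgrid[0:H, 0:W].astype(float)

def render(t):
    """Splat each anchor as an isotropic 2D Gaussian after translating
    the camera by t (rotation omitted for brevity)."""
    p = pts - t
    u = f * p[:, 0] / p[:, 2] + W / 2
    v = f * p[:, 1] / p[:, 2] + H / 2
    g = np.exp(-((xx - u[:, None, None]) ** 2 +
                 (yy - v[:, None, None]) ** 2) / (2 * sigma ** 2))
    return (cols[:, None, None] * g).sum(axis=0)

target = render(np.zeros(3))        # image at the "true" extrinsics
t0 = np.array([0.08, -0.05, 0.0])   # misaligned initial guess

def photometric_loss(t):
    return np.mean((render(t) - target) ** 2)

# Plain gradient descent with central-difference gradients.
t, lr, eps = t0.copy(), 0.02, 1e-4
for _ in range(150):
    grad = np.array([(photometric_loss(t + eps * e) -
                      photometric_loss(t - eps * e)) / (2 * eps)
                     for e in np.eye(3)])
    t -= lr * grad

print(photometric_loss(t) < photometric_loss(t0))
```

Because the splats are smooth in the pose parameters, the photometric loss is differentiable end to end; in the paper this same property is what lets the sensor poses and auxiliary Gaussian parameters be refined jointly by backpropagation.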
Related papers
- Coca-Splat: Collaborative Optimization for Camera Parameters and 3D Gaussians [26.3996055215988]
Coca-Splat is a novel approach to address the challenges of sparse-view, pose-free scene reconstruction and novel view synthesis (NVS). Inspired by the deformable DEtection TRansformer, we design separate queries for 3D Gaussians and camera parameters. We update them layer by layer through deformable Transformer layers, enabling joint optimization in a single network.
arXiv Detail & Related papers (2025-04-01T10:48:46Z)
- Robust LiDAR-Camera Calibration with 2D Gaussian Splatting [0.3281128493853064]
A critical and initial step in integrating LiDAR and camera data is the calibration of the LiDAR-camera system. Most existing calibration methods rely on auxiliary target objects, which often involve complex manual operations. We propose a calibration method that estimates LiDAR-camera extrinsic parameters using geometric constraints.
arXiv Detail & Related papers (2025-04-01T08:19:26Z)
- D3DR: Lighting-Aware Object Insertion in Gaussian Splatting [48.80431740983095]
We propose a method, dubbed D3DR, for inserting a 3DGS-parametrized object into 3DGS scenes.
We leverage advances in diffusion models, which, trained on real-world data, implicitly understand correct scene lighting.
We demonstrate the method's effectiveness by comparing it to existing approaches.
arXiv Detail & Related papers (2025-03-09T19:48:00Z)
- SC-OmniGS: Self-Calibrating Omnidirectional Gaussian Splatting [29.489453234466982]
SC-OmniGS is a novel self-calibrating system for fast and accurate radiance field reconstruction using 360-degree images. We introduce a differentiable omnidirectional camera model to rectify the distortion of real-world data for performance enhancement.
arXiv Detail & Related papers (2025-02-07T08:06:30Z)
- PF3plat: Pose-Free Feed-Forward 3D Gaussian Splatting [54.7468067660037]
Our framework capitalizes on the speed, scalability, and high-quality 3D reconstruction and view synthesis capabilities of 3DGS.
PF3plat sets a new state-of-the-art across all benchmarks, supported by comprehensive ablation studies validating our design choices.
arXiv Detail & Related papers (2024-10-29T15:28:15Z)
- P2O-Calib: Camera-LiDAR Calibration Using Point-Pair Spatial Occlusion Relationship [1.6921147361216515]
We propose a novel target-less calibration approach based on the 2D-3D edge point extraction using the occlusion relationship in 3D space.
Our method achieves low error and high robustness, which can benefit practical applications that rely on high-quality Camera-LiDAR calibration.
arXiv Detail & Related papers (2023-11-04T14:32:55Z)
- Towards Nonlinear-Motion-Aware and Occlusion-Robust Rolling Shutter Correction [54.00007868515432]
Existing methods face challenges in estimating the accurate correction field due to the uniform velocity assumption.
We propose a geometry-based Quadratic Rolling Shutter (QRS) motion solver, which precisely estimates the high-order correction field of individual pixels.
Our method surpasses the state of the art by +4.98, +0.77, and +4.33 dB PSNR on the Carla-RS, Fastec-RS, and BS-RSC datasets, respectively.
arXiv Detail & Related papers (2023-03-31T15:09:18Z)
- Photometric LiDAR and RGB-D Bundle Adjustment [3.3948742816399697]
This paper presents a novel Bundle Adjustment (BA) photometric strategy that accounts for both RGB-D and LiDAR in the same way.
In addition, we demonstrate the benefit of jointly using RGB-D and LiDAR within our unified method.
arXiv Detail & Related papers (2023-03-29T17:35:23Z)
- Benchmarking the Robustness of LiDAR-Camera Fusion for 3D Object Detection [58.81316192862618]
Two critical sensors for 3D perception in autonomous driving are the camera and the LiDAR.
Fusing these two modalities can significantly boost the performance of 3D perception models.
We benchmark the state-of-the-art fusion methods for the first time.
arXiv Detail & Related papers (2022-05-30T09:35:37Z)
- How to Calibrate Your Event Camera [58.80418612800161]
We propose a generic event camera calibration framework using image reconstruction.
We show that neural-network-based image reconstruction is well suited for the task of intrinsic and extrinsic calibration of event cameras.
arXiv Detail & Related papers (2021-05-26T07:06:58Z)
- Automatic Extrinsic Calibration Method for LiDAR and Camera Sensor Setups [68.8204255655161]
We present a method to calibrate the parameters of any pair of sensors involving LiDARs, monocular or stereo cameras.
The proposed approach can handle devices with very different resolutions and poses, as usually found in vehicle setups.
arXiv Detail & Related papers (2021-01-12T12:02:26Z)
- Multi-View Photometric Stereo: A Robust Solution and Benchmark Dataset for Spatially Varying Isotropic Materials [65.95928593628128]
We present a method to capture both 3D shape and spatially varying reflectance with a multi-view photometric stereo technique.
Our algorithm is suitable for perspective cameras and nearby point light sources.
arXiv Detail & Related papers (2020-01-18T12:26:22Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences arising from its use.