Flexible Camera Calibration using a Collimator System
- URL: http://arxiv.org/abs/2512.16113v1
- Date: Thu, 18 Dec 2025 03:06:50 GMT
- Title: Flexible Camera Calibration using a Collimator System
- Authors: Shunkun Liang, Banglei Guan, Zhenbao Yu, Dongcai Tan, Pengju Sun, Zibin Liu, Qifeng Yu, Yang Shang
- Abstract summary: Camera calibration is a crucial step in photogrammetry and 3D vision applications. This paper introduces a novel camera calibration method using a designed collimator system.
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Camera calibration is a crucial step in photogrammetry and 3D vision applications. This paper introduces a novel camera calibration method using a designed collimator system. Our collimator system provides a reliable and controllable calibration environment for the camera. Exploiting the unique optical geometry property of our collimator system, we introduce an angle invariance constraint and further prove that the relative motion between the calibration target and camera conforms to a spherical motion model. This constraint reduces the original 6DOF relative motion between target and camera to a 3DOF pure rotation motion. Using the spherical motion constraint, a closed-form linear solver for multiple images and a minimal solver for two images are proposed for camera calibration. Furthermore, we propose a single-collimator-image calibration algorithm based on the angle invariance constraint. This algorithm eliminates the requirement for camera motion, providing a novel solution for flexible and fast calibration. The performance of our method is evaluated in both synthetic and real-world experiments, which verify the feasibility of calibration using the collimator system and demonstrate that our method is superior to existing baseline methods. Demo code is available at https://github.com/LiangSK98/CollimatorCalibration
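The spherical motion model from the abstract can be sanity-checked numerically: if the target-camera relative motion is a rotation about a fixed center c, the translation is fully determined by the rotation as t = (I - R)c, so the 6DOF relative pose collapses to 3 rotational DOF. The sketch below is an illustrative toy check under that assumption, not the paper's solver; all numeric values are made up.

```python
import numpy as np

def random_rotation(rng):
    """Random 3x3 rotation matrix via QR decomposition."""
    q, _ = np.linalg.qr(rng.normal(size=(3, 3)))
    return q * np.sign(np.linalg.det(q))  # force det = +1

rng = np.random.default_rng(0)
c = np.array([0.1, -0.2, 1.5])    # hypothetical rotation center
pts = rng.normal(size=(6, 3))     # hypothetical target points

for _ in range(5):
    R = random_rotation(rng)
    t = (np.eye(3) - R) @ c       # translation implied by spherical motion
    moved = pts @ R.T + t         # apply x' = R x + t to every point
    # Rotation about c preserves each point's distance to the center:
    d0 = np.linalg.norm(pts - c, axis=1)
    d1 = np.linalg.norm(moved - c, axis=1)
    assert np.allclose(d0, d1)

print("spherical motion check passed")
```

Because t is a function of R and c alone, estimating the relative pose under this model requires solving only for the rotation, which is what makes the closed-form and minimal solvers tractable.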
Related papers
- Generic Camera Calibration using Blurry Images
Generic camera calibration can yield more accurate results than parametric camera calibration. We draw on geometric constraints and a local parametric illumination model to simultaneously estimate feature locations and spatially varying point spread functions.
arXiv Detail & Related papers (2026-03-05T13:29:05Z)
- Collimator-assisted high-precision calibration method for event cameras
Event cameras are a new type of brain-inspired visual sensor with advantages such as high dynamic range and high temporal resolution. We propose an event camera calibration method utilizing a collimator with flickering star-based patterns.
arXiv Detail & Related papers (2025-12-18T02:16:22Z)
- Dynamic View Synthesis from Small Camera Motion Videos
We present a novel view synthesis method for dynamic 3D scenes based on distribution-based depth regularization. We also introduce constraints that enforce the volume density of spatial points before the object boundary along the ray to be near zero, ensuring that our model learns the correct geometry of the scene. We conduct extensive experiments to demonstrate the effectiveness of our approach in representing scenes with small camera motion input, and our results compare favorably to state-of-the-art methods.
arXiv Detail & Related papers (2025-06-29T09:17:55Z)
- Camera Calibration using a Collimator System
This paper introduces a novel camera calibration method using a collimator system.
Based on the optical geometry of the collimator system, we prove that the relative motion between the target and camera conforms to the spherical motion model.
A closed-form solver for multiple views and a minimal solver for two views are proposed for camera calibration.
arXiv Detail & Related papers (2024-09-30T07:40:41Z)
- YOCO: You Only Calibrate Once for Accurate Extrinsic Parameter in LiDAR-Camera Systems
In a multi-sensor fusion system composed of cameras and LiDAR, precise extrinsic calibration contributes to the system's long-term stability and accurate perception of the environment.
This paper proposes a novel fully automatic extrinsic calibration method for LiDAR-camera systems that circumvents the need for corresponding point registration.
arXiv Detail & Related papers (2024-07-25T13:44:49Z)
- EasyHeC: Accurate and Automatic Hand-eye Calibration via Differentiable Rendering and Space Exploration
We introduce a new approach to hand-eye calibration called EasyHeC, which is markerless, white-box, and delivers superior accuracy and robustness.
We propose to use two key technologies: differentiable rendering-based camera pose optimization and consistency-based joint space exploration.
Our evaluation demonstrates superior performance in synthetic and real-world datasets.
arXiv Detail & Related papers (2023-05-02T03:49:54Z)
- Towards Nonlinear-Motion-Aware and Occlusion-Robust Rolling Shutter Correction
Existing methods face challenges in estimating the accurate correction field due to the uniform velocity assumption.
We propose a geometry-based Quadratic Rolling Shutter (QRS) motion solver, which precisely estimates the high-order correction field of individual pixels.
Our method surpasses the state of the art by +4.98, +0.77, and +4.33 dB PSNR on the Carla-RS, Fastec-RS, and BS-RSC datasets, respectively.
arXiv Detail & Related papers (2023-03-31T15:09:18Z)
- Online Marker-free Extrinsic Camera Calibration using Person Keypoint Detections
We propose a marker-free online method for the extrinsic calibration of multiple smart edge sensors.
Our method assumes the intrinsic camera parameters to be known and requires priming with a rough initial estimate of the camera poses.
We show that the calibration with our method achieves lower reprojection errors compared to a reference calibration generated by an offline method.
arXiv Detail & Related papers (2022-09-15T15:54:21Z)
- Self-Calibrating Neural Radiance Fields
We jointly learn the geometry of the scene and the accurate camera parameters without any calibration objects.
Our camera model consists of a pinhole model, a fourth order radial distortion, and a generic noise model that can learn arbitrary non-linear camera distortions.
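The camera model described above (a pinhole projection followed by a fourth-order radial distortion) can be sketched as follows. The focal length, principal point, and distortion coefficients are made-up illustrative values, and the paper's learned noise model is omitted.

```python
import numpy as np

def project(X, f=800.0, cx=320.0, cy=240.0, k1=-0.1, k2=0.01):
    """Project a 3D point with a pinhole model plus fourth-order
    radial distortion: x_d = x * (1 + k1*r^2 + k2*r^4).
    All parameter values are hypothetical."""
    x, y = X[0] / X[2], X[1] / X[2]       # normalized pinhole projection
    r2 = x * x + y * y                    # squared radial distance
    d = 1.0 + k1 * r2 + k2 * r2 * r2      # radial distortion factor
    return np.array([f * x * d + cx, f * y * d + cy])

u = project(np.array([0.1, -0.05, 2.0]))
print(u)
```

With k1 = k2 = 0 this reduces to the plain pinhole model, e.g. `project(np.array([0.1, -0.05, 2.0]), k1=0.0, k2=0.0)` gives pixel coordinates (360.0, 220.0).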
arXiv Detail & Related papers (2021-08-31T13:34:28Z)
- Dynamic Event Camera Calibration
We present the first dynamic event camera calibration algorithm.
It calibrates directly from events captured during relative motion between camera and calibration pattern.
As demonstrated through our results, the obtained calibration method is highly convenient and reliably calibrates from data sequences spanning less than 10 seconds.
arXiv Detail & Related papers (2021-07-14T14:52:58Z)
- How to Calibrate Your Event Camera
We propose a generic event camera calibration framework using image reconstruction.
We show that neural-network-based image reconstruction is well suited for the task of intrinsic and extrinsic calibration of event cameras.
arXiv Detail & Related papers (2021-05-26T07:06:58Z)
- Calibrated and Partially Calibrated Semi-Generalized Homographies
We propose the first minimal solutions for estimating the semi-generalized homography given a perspective and a generalized camera.
The proposed solvers are stable and efficient as demonstrated by a number of synthetic and real-world experiments.
arXiv Detail & Related papers (2021-03-11T08:56:24Z)
- Infrastructure-based Multi-Camera Calibration using Radial Projections
Pattern-based calibration techniques can be used to calibrate the intrinsics of the cameras individually.
Infrastructure-based calibration techniques are able to estimate the extrinsics using 3D maps pre-built via SLAM or Structure-from-Motion.
We propose to fully calibrate a multi-camera system from scratch using an infrastructure-based approach.
arXiv Detail & Related papers (2020-07-30T09:21:04Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.