The Influence of Autofocus Lenses in the Camera Calibration Process
- URL: http://arxiv.org/abs/2402.04686v1
- Date: Wed, 7 Feb 2024 09:20:01 GMT
- Title: The Influence of Autofocus Lenses in the Camera Calibration Process
- Authors: Carlos Ricolfe-Viala, Alicia Esparza
- Abstract summary: The camera calibration process consists of fitting a set of data to a pin-hole model.
The pin-hole model does not represent camera behavior accurately once focus is taken into account.
A pin-hole model with a distance-dependent focal length is proposed to improve the calibration process substantially.
- Score: 0.0
- License: http://creativecommons.org/publicdomain/zero/1.0/
- Abstract: Camera calibration is a crucial step in robotics and computer vision.
Accurate camera parameters are necessary to achieve robust applications.
Nowadays, the camera calibration process consists of fitting a set of data to a
pin-hole model, assuming that if the reprojection error is close to zero the
camera parameters are correct. Since all camera parameters are unknown, the
computed results are taken as the true ones. However, the pin-hole model does not
represent camera behavior accurately once focus is taken into account. Real
cameras change the focal length slightly to keep objects sharp in the image, and
this behavior skews the calibration result if a single pin-hole model with a
constant focal length is computed. In this paper, a deep analysis of the camera
calibration process is carried out to detect and strengthen its weaknesses. The
camera is mounted on a robot arm so that the extrinsic camera parameters are
known with accuracy and the computed results can be compared with the true ones.
Based on the bias that exists between the computed results and the true ones, a
modification of the widely accepted camera calibration method using images of a
planar template is presented. A pin-hole model with a distance-dependent focal
length is proposed to improve the calibration process substantially.
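The abstract does not specify the functional form of the distance-dependent focal length. A minimal numpy sketch of the underlying idea, assuming a hypothetical dependence f(d) = f_inf + k/d chosen purely for illustration (the paper derives the actual relation from calibration data), might look like this:

```python
import numpy as np

def project(points_cam, focus_distance, f_inf=1200.0, k=900.0, cx=960.0, cy=540.0):
    # Hypothetical distance-dependent focal length f(d) = f_inf + k / d,
    # used only to illustrate the idea; the paper fits the actual relation
    # to calibration data taken at known focus distances.
    f = f_inf + k / focus_distance
    x, y, z = points_cam[:, 0], points_cam[:, 1], points_cam[:, 2]
    # Standard pin-hole projection, but with the focus-dependent focal length.
    u = f * x / z + cx
    v = f * y / z + cy
    return np.stack([u, v], axis=1)

# The same 3D point (camera frame, metres) lands on different pixels when the
# autofocus lens refocuses, which a constant-focal-length model cannot explain.
point = np.array([[0.10, 0.05, 1.5]])
print(project(point, focus_distance=0.5))  # near focus
print(project(point, focus_distance=5.0))  # far focus
```

In a calibration pipeline of the kind described above, parameters such as the hypothetical f_inf and k would be estimated jointly with the remaining intrinsics from images of the planar template taken at several known focus distances.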
Related papers
- Single-image camera calibration with model-free distortion correction [0.0]
This paper proposes a method for estimating the complete set of calibration parameters from a single image of a planar speckle pattern covering the entire sensor.
The correspondence between image points and physical points on the calibration target is obtained using Digital Image Correlation.
At the end of the procedure, a dense and uniform model-free distortion map is obtained over the entire image.
arXiv Detail & Related papers (2024-03-02T16:51:35Z) - E-Calib: A Fast, Robust and Accurate Calibration Toolbox for Event Cameras [34.71767308204867]
We present E-Calib, a novel, fast, robust, and accurate calibration toolbox for event cameras.
The proposed method is tested in a variety of rigorous experiments for different event camera models.
arXiv Detail & Related papers (2023-06-15T12:16:38Z) - Deep Learning for Camera Calibration and Beyond: A Survey [100.75060862015945]
Camera calibration involves estimating camera parameters to infer geometric features from captured sequences.
Recent efforts show that learning-based solutions have the potential to replace the repetitive manual work of calibration.
arXiv Detail & Related papers (2023-03-19T04:00:05Z) - An Adaptive Method for Camera Attribution under Complex Radial Distortion Corrections [77.34726150561087]
In-camera or out-camera software/firmware alters the supporting grid of the image, which hampers PRNU-based camera attribution.
Existing solutions try to invert or estimate the correction using radial transformations parameterized with a few variables in order to limit the computational load.
We propose an adaptive algorithm that, by dividing the image into concentric annuli, is able to deal with sophisticated corrections like those applied out-camera by third-party software such as Adobe Lightroom, Photoshop, GIMP and PT-Lens.
arXiv Detail & Related papers (2023-02-28T08:44:00Z) - Extrinsic Camera Calibration with Semantic Segmentation [60.330549990863624]
We present an extrinsic camera calibration approach that automates the parameter estimation by utilizing semantic segmentation information.
Our approach relies on a coarse initial measurement of the camera pose and builds on lidar sensors mounted on a vehicle.
We evaluate our method on simulated and real-world data to demonstrate low error measurements in the calibration results.
arXiv Detail & Related papers (2022-08-08T07:25:03Z) - Self-Supervised Camera Self-Calibration from Video [34.35533943247917]
We propose a learning algorithm to regress per-sequence calibration parameters using an efficient family of general camera models.
Our procedure achieves self-calibration results with sub-pixel reprojection error, outperforming other learning-based methods.
arXiv Detail & Related papers (2021-12-06T19:42:05Z) - BabelCalib: A Universal Approach to Calibrating Central Cameras [24.662262051346087]
Existing calibration methods occasionally fail for large field-of-view cameras.
We propose a solver to calibrate the parameters in terms of a back-projection model and then regress the parameters for a target forward model.
These steps are incorporated in a robust estimation framework to cope with outlying detections.
arXiv Detail & Related papers (2021-09-20T17:21:57Z) - Self-Calibrating Neural Radiance Fields [68.64327335620708]
We jointly learn the geometry of the scene and the accurate camera parameters without any calibration objects.
Our camera model consists of a pinhole model, a fourth-order radial distortion, and a generic noise model that can learn arbitrary non-linear camera distortions (a generic sketch of a pinhole-plus-radial-distortion model appears after this list).
arXiv Detail & Related papers (2021-08-31T13:34:28Z) - How to Calibrate Your Event Camera [58.80418612800161]
We propose a generic event camera calibration framework using image reconstruction.
We show that neural-network-based image reconstruction is well suited for the task of intrinsic and extrinsic calibration of event cameras.
arXiv Detail & Related papers (2021-05-26T07:06:58Z) - Infrastructure-based Multi-Camera Calibration using Radial Projections [117.22654577367246]
Pattern-based calibration techniques can be used to calibrate the intrinsics of the cameras individually.
Infrastructure-based calibration techniques are able to estimate the extrinsics using 3D maps pre-built via SLAM or Structure-from-Motion.
We propose to fully calibrate a multi-camera system from scratch using an infrastructure-based approach.
arXiv Detail & Related papers (2020-07-30T09:21:04Z) - Superaccurate Camera Calibration via Inverse Rendering [0.19336815376402716]
We propose a new method for camera calibration using the principle of inverse rendering.
Instead of relying solely on detected feature points, we use an estimate of the internal parameters and the pose of the calibration object to implicitly render a non-photorealistic equivalent of the optical features.
arXiv Detail & Related papers (2020-03-20T10:26:16Z)
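Several entries above share two building blocks: a central pin-hole camera model with radial distortion (the fourth-order model named in the Self-Calibrating Neural Radiance Fields entry) and the reprojection error used as the figure of merit (e.g. the sub-pixel reprojection error reported for self-supervised self-calibration). The following numpy sketch is generic and not taken from any of the listed papers:

```python
import numpy as np

def project_radial(points_cam, f, cx, cy, k1, k2):
    # Pin-hole projection followed by fourth-order radial distortion
    # r_d = r * (1 + k1*r^2 + k2*r^4); a generic sketch, not the exact
    # model of any paper listed above.
    x = points_cam[:, 0] / points_cam[:, 2]
    y = points_cam[:, 1] / points_cam[:, 2]
    r2 = x**2 + y**2
    scale = 1.0 + k1 * r2 + k2 * r2**2
    return np.stack([f * x * scale + cx, f * y * scale + cy], axis=1)

def reprojection_rmse(observed_px, points_cam, f, cx, cy, k1, k2):
    # Root-mean-square reprojection error, the accuracy measure quoted by
    # most calibration papers in this list (e.g. "sub-pixel reprojection error").
    predicted = project_radial(points_cam, f, cx, cy, k1, k2)
    return float(np.sqrt(np.mean(np.sum((observed_px - predicted) ** 2, axis=1))))

# Example: three points on a calibration target, observed at their ideal projections,
# so the error is zero by construction.
pts = np.array([[0.1, 0.0, 1.0], [0.0, 0.1, 1.0], [0.05, 0.05, 2.0]])
obs = project_radial(pts, f=1000.0, cx=640.0, cy=360.0, k1=-0.1, k2=0.01)
print(reprojection_rmse(obs, pts, 1000.0, 640.0, 360.0, -0.1, 0.01))
```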
This list is automatically generated from the titles and abstracts of the papers on this site.