Hand-Eye Calibration
- URL: http://arxiv.org/abs/2311.12655v2
- Date: Wed, 22 Nov 2023 09:00:02 GMT
- Title: Hand-Eye Calibration
- Authors: Radu Horaud and Fadi Dornaika
- Abstract summary: We show that there are two possible formulations of the hand-eye calibration problem.
We develop a common mathematical framework to solve for the hand-eye calibration problem.
We present two methods: (i) a rotation-then-translation method and (ii) a non-linear solver for rotation and translation.
- Score: 23.28928112775409
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Whenever a sensor is mounted on a robot hand it is important to know the
relationship between the sensor and the hand. The problem of determining this
relationship is referred to as hand-eye calibration, which is important in at
least two types of tasks: (i) mapping sensor-centered measurements into the robot workspace and (ii) allowing the robot to precisely move the sensor. In the past, some solutions were proposed in the particular case of a camera. With almost no
exception, all existing solutions attempt to solve the homogeneous matrix
equation AX=XB. First we show that there are two possible formulations of the
hand-eye calibration problem. One formulation is the classical one that we just
mentioned. A second formulation takes the form of the following homogeneous
matrix equation: MY=M'YB. The advantage of the latter is that the extrinsic and
intrinsic camera parameters need not be made explicit. Indeed, this formulation
directly uses the 3 by 4 perspective matrices (M and M') associated with two
positions of the camera. Moreover, this formulation together with the classical
one cover a wider range of camera-based sensors to be calibrated with respect
to the robot hand. Second, we develop a common mathematical framework to solve
for the hand-eye calibration problem using either of the two formulations. We present two methods: (i) a rotation-then-translation method and (ii) a non-linear solver for rotation and translation. Third, we perform a stability analysis
both for our two methods and for the classical linear method of Tsai and Lenz
(1989). In the light of this comparison, the non-linear optimization method,
which solves for rotation and translation simultaneously, seems to be the most
robust one with respect to noise and to measurement errors.
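For concreteness, the classical AX=XB formulation splits into a rotation equation, R_A R_X = R_X R_B, and a translation equation, (R_A - I) t_X = R_X t_B - t_A. The sketch below illustrates the generic rotation-then-translation strategy in this classical setting using NumPy and SciPy; it is an illustrative approximation under these assumptions, not a reproduction of the paper's own solvers nor of the MY=M'YB formulation.

```python
# Illustrative sketch of a classical two-step AX = XB solver: rotation first,
# then translation. A_list/B_list hold 4x4 relative motions of the robot hand
# and of the camera (eye) between calibration stations.
import numpy as np
from scipy.spatial.transform import Rotation

def solve_ax_xb(A_list, B_list):
    # Step 1: rotation. From R_A R_X = R_X R_B, the rotation vectors of each
    # pair satisfy a_i = R_X b_i; solve the orthogonal Procrustes problem.
    M = np.zeros((3, 3))
    for A, B in zip(A_list, B_list):
        a = Rotation.from_matrix(A[:3, :3]).as_rotvec()
        b = Rotation.from_matrix(B[:3, :3]).as_rotvec()
        M += np.outer(a, b)
    U, _, Vt = np.linalg.svd(M)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(U @ Vt))])
    R_X = U @ D @ Vt
    # Step 2: translation. Each pair gives (R_A - I) t_X = R_X t_B - t_A;
    # stack the constraints and solve by linear least squares.
    C, d = [], []
    for A, B in zip(A_list, B_list):
        C.append(A[:3, :3] - np.eye(3))
        d.append(R_X @ B[:3, 3] - A[:3, 3])
    t_X, *_ = np.linalg.lstsq(np.vstack(C), np.hstack(d), rcond=None)
    X = np.eye(4)
    X[:3, :3], X[:3, 3] = R_X, t_X
    return X
```

A non-linear variant in the same spirit would instead refine a 6-DoF parameterization of X, for instance with scipy.optimize.least_squares over the stacked rotation and translation residuals, which matches the abstract's observation that joint estimation is more robust to noise.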
Related papers
- Camera Calibration using a Collimator System [5.138012450471437]
This paper introduces a novel camera calibration method using a collimator system.
Based on the optical geometry of the collimator system, we prove that the relative motion between the target and camera conforms to the spherical motion model.
A closed-form solver for multiple views and a minimal solver for two views are proposed for camera calibration.
arXiv Detail & Related papers (2024-09-30T07:40:41Z)
- Robot Hand-Eye Calibration using Structure-from-Motion [9.64487611393378]
We propose a new flexible method for hand-eye calibration.
We show that the solution can be obtained in linear form.
We conduct a large number of experiments which validate the quality of the method by comparing it with existing ones.
arXiv Detail & Related papers (2023-11-20T14:41:44Z)
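One standard way to obtain such a linear form (a common construction, not necessarily the one used in the structure-from-motion paper above) is to vectorize the rotation constraint R_A R_X = R_X R_B with Kronecker products, giving (I ⊗ R_A - R_B^T ⊗ I) vec(R_X) = 0, and to read R_X off the null space of the stacked system:

```python
# Illustrative sketch only: a Kronecker-product linear formulation of the
# rotation part of AX = XB (a standard construction, not necessarily the
# one used in the paper above).
import numpy as np

def hand_eye_rotation_linear(RA_list, RB_list):
    """Estimate R_X from pairs of relative rotations (R_A, R_B)."""
    I3 = np.eye(3)
    # Each pair contributes (I (x) R_A - R_B^T (x) I) vec(R_X) = 0,
    # with vec() stacking columns.
    K = np.vstack([np.kron(I3, RA) - np.kron(RB.T, I3)
                   for RA, RB in zip(RA_list, RB_list)])
    # vec(R_X) spans the (approximate) null space of K.
    _, _, Vt = np.linalg.svd(K)
    R = Vt[-1].reshape(3, 3, order="F")   # undo the column-stacking
    if np.linalg.det(R) < 0:              # fix the sign ambiguity
        R = -R
    # Project the scaled estimate onto SO(3).
    U, _, Wt = np.linalg.svd(R)
    return U @ np.diag([1.0, 1.0, np.linalg.det(U @ Wt)]) @ Wt
```

The translation can then be recovered with the same linear least-squares step shown in the earlier sketch.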
- Vanishing Point Estimation in Uncalibrated Images with Prior Gravity Direction [82.72686460985297]
We tackle the problem of estimating a Manhattan frame.
We derive two new 2-line solvers, one of which does not suffer from singularities affecting existing solvers.
We also design a new non-minimal method, running on an arbitrary number of lines, to boost the performance in local optimization.
arXiv Detail & Related papers (2023-08-21T13:03:25Z)
- EasyHeC: Accurate and Automatic Hand-eye Calibration via Differentiable Rendering and Space Exploration [49.90228618894857]
We introduce a new approach to hand-eye calibration called EasyHeC, which is markerless, white-box, and delivers superior accuracy and robustness.
We propose to use two key technologies: differentiable rendering-based camera pose optimization and consistency-based joint space exploration.
Our evaluation demonstrates superior performance in synthetic and real-world datasets.
arXiv Detail & Related papers (2023-05-02T03:49:54Z)
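As rough intuition for differentiable-rendering-based camera pose optimization (a toy stand-in under stated assumptions, not EasyHeC's actual pipeline): parameterize the camera pose, project a model of the robot differentiably, compare the result with image observations, and back-propagate the error into the pose parameters. The PyTorch sketch below replaces full image rendering with differentiable re-projection of hypothetical 3D robot keypoints; all names and shapes are assumptions.

```python
# Toy illustration only: gradient-based camera-pose refinement in the spirit of
# differentiable-rendering approaches, with image rendering replaced by
# differentiable re-projection of hypothetical 3D robot keypoints.
import torch

def rotvec_to_matrix(r):
    """Differentiable Rodrigues formula: axis-angle (3,) -> rotation matrix (3, 3)."""
    theta = torch.linalg.norm(r) + 1e-12
    k = r / theta
    zero = torch.zeros((), dtype=r.dtype)
    K = torch.stack([
        torch.stack([zero, -k[2], k[1]]),
        torch.stack([k[2], zero, -k[0]]),
        torch.stack([-k[1], k[0], zero]),
    ])
    I = torch.eye(3, dtype=r.dtype)
    return I + torch.sin(theta) * K + (1 - torch.cos(theta)) * (K @ K)

def refine_pose(points_3d, points_2d, K_intr, rvec, tvec, steps=500, lr=1e-2):
    """Minimize re-projection error over a 6-DoF pose (rvec, tvec)."""
    rvec = rvec.clone().requires_grad_(True)
    tvec = tvec.clone().requires_grad_(True)
    opt = torch.optim.Adam([rvec, tvec], lr=lr)
    for _ in range(steps):
        opt.zero_grad()
        R = rotvec_to_matrix(rvec)
        cam = points_3d @ R.T + tvec              # world -> camera frame
        proj = cam @ K_intr.T                     # apply intrinsics
        uv = proj[:, :2] / proj[:, 2:3]           # perspective division
        loss = torch.mean((uv - points_2d) ** 2)  # re-projection error
        loss.backward()
        opt.step()
    return rvec.detach(), tvec.detach()
```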
- Partially calibrated semi-generalized pose from hybrid point correspondences [68.22708881161049]
We study all possible camera configurations within the generalized camera system.
To derive practical solvers, we test different parameterizations as well as different solving strategies.
We show that in the presence of noise in the 3D points these solvers provide better estimates than the corresponding absolute pose solvers.
arXiv Detail & Related papers (2022-09-29T19:46:59Z)
- Online Marker-free Extrinsic Camera Calibration using Person Keypoint Detections [25.393382192511716]
We propose a marker-free online method for the extrinsic calibration of multiple smart edge sensors.
Our method assumes the intrinsic camera parameters to be known and requires priming with a rough initial estimate of the camera poses.
We show that the calibration with our method achieves lower reprojection errors compared to a reference calibration generated by an offline method.
arXiv Detail & Related papers (2022-09-15T15:54:21Z)
- Calibrated and Partially Calibrated Semi-Generalized Homographies [65.29477277713205]
We propose the first minimal solutions for estimating the semi-generalized homography given a perspective and a generalized camera.
The proposed solvers are stable and efficient as demonstrated by a number of synthetic and real-world experiments.
arXiv Detail & Related papers (2021-03-11T08:56:24Z)
- Automatic Extrinsic Calibration Method for LiDAR and Camera Sensor Setups [68.8204255655161]
We present a method to calibrate the parameters of any pair of sensors involving LiDARs, monocular or stereo cameras.
The proposed approach can handle devices with very different resolutions and poses, as usually found in vehicle setups.
arXiv Detail & Related papers (2021-01-12T12:02:26Z) - Infrastructure-based Multi-Camera Calibration using Radial Projections [117.22654577367246]
Pattern-based calibration techniques can be used to calibrate the intrinsics of the cameras individually.
Infrastructure-based calibration techniques are able to estimate the extrinsics using 3D maps pre-built via SLAM or Structure-from-Motion.
We propose to fully calibrate a multi-camera system from scratch using an infrastructure-based approach.
arXiv Detail & Related papers (2020-07-30T09:21:04Z) - Continuous hand-eye calibration using 3D points [0.0]
We show that a simple closed-form solution, which shifts the focus to the equation of translation only, solves for the necessary hand-eye transformation.
We show that it is superior in accuracy and robustness compared to traditional approaches.
Second, we decrease the dependency on the calibration object to a single 3D-point by using a similar formulation based on the equation of translation.
arXiv Detail & Related papers (2020-04-27T07:13:33Z)
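To illustrate how a translation-only, 3D-point-based formulation can look (an assumption-laden sketch, not necessarily this paper's derivation): if the hand-eye rotation R_X is already known, observing the same unknown world point in the camera at several robot poses constrains the hand-eye translation t_X linearly, since R_Ai (R_X p_i + t_X) + t_Ai must describe the same world point for every pose i.

```python
# Hedged sketch: with the hand-eye rotation R_X already known, observations of a
# fixed (unknown) world point in the camera at several robot poses constrain the
# hand-eye translation t_X linearly. Conventions (A = base->hand pose,
# X = hand->camera transform) are assumptions for this illustration.
import numpy as np

def hand_eye_translation_from_point(A_list, p_cam_list, R_X):
    """A_list: 4x4 base->hand poses; p_cam_list: the same world point observed in
    the camera frame at each pose (3-vectors); R_X: known hand-eye rotation."""
    C, d = [], []
    for (Ai, pi), (Aj, pj) in zip(zip(A_list, p_cam_list),
                                  zip(A_list[1:], p_cam_list[1:])):
        RAi, tAi = Ai[:3, :3], Ai[:3, 3]
        RAj, tAj = Aj[:3, :3], Aj[:3, 3]
        # R_Ai (R_X p_i + t_X) + t_Ai = R_Aj (R_X p_j + t_X) + t_Aj
        # => (R_Ai - R_Aj) t_X = R_Aj R_X p_j - R_Ai R_X p_i + t_Aj - t_Ai
        C.append(RAi - RAj)
        d.append(RAj @ R_X @ pj - RAi @ R_X @ pi + tAj - tAi)
    t_X, *_ = np.linalg.lstsq(np.vstack(C), np.hstack(d), rcond=None)
    return t_X
```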
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.