eWand: A calibration framework for wide baseline frame-based and event-based camera systems
- URL: http://arxiv.org/abs/2309.12685v2
- Date: Wed, 3 Apr 2024 12:47:15 GMT
- Title: eWand: A calibration framework for wide baseline frame-based and event-based camera systems
- Authors: Thomas Gossard, Andreas Ziegler, Levin Kolmar, Jonas Tebbe, Andreas Zell
- Abstract summary: We propose eWand, a new method that uses blinking LEDs inside opaque spheres instead of a printed or displayed pattern.
Our method provides a faster, easier-to-use extrinsic calibration approach that maintains high accuracy for both event- and frame-based cameras.
- Score: 11.735290341808064
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Accurate calibration is crucial for using multiple cameras to triangulate the position of objects precisely. However, it is also a time-consuming process that needs to be repeated for every displacement of the cameras. The standard approach is to use a printed pattern with known geometry to estimate the intrinsic and extrinsic parameters of the cameras. The same idea can be applied to event-based cameras, though it requires additional steps. With frame reconstruction from events, a printed pattern can be detected; alternatively, a blinking pattern can be displayed on a screen and detected directly from the events. Such calibration methods can provide accurate intrinsic calibration for both frame- and event-based cameras. However, 2D patterns have several limitations for multi-camera extrinsic calibration when the cameras have very different viewpoints and a wide baseline: the pattern can only be detected from one direction and must be large enough to compensate for its distance to the cameras. This makes extrinsic calibration time-consuming and cumbersome. To overcome these limitations, we propose eWand, a new method that uses blinking LEDs inside opaque spheres instead of a printed or displayed pattern. Our method provides a faster, easier-to-use extrinsic calibration approach that maintains high accuracy for both event- and frame-based cameras.
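The extrinsic step described in the abstract (recovering relative camera poses from tracked blinking-LED markers on a wand) can be illustrated with a small sketch. The snippet below is not the authors' eWand implementation: it assumes the LED centers have already been detected and matched across two views, that the intrinsic matrices K0 and K1 are known from a prior pattern-based intrinsic calibration, and that the metric distance between two wand LEDs (led_spacing_m) is available to fix the scale of the translation. All function and variable names are illustrative placeholders.
```python
# Minimal sketch (assumptions noted above): two-camera extrinsic calibration
# from matched blinking-LED centers. NOT the authors' eWand pipeline.
import numpy as np
import cv2


def relative_pose_from_led_tracks(pts0, pts1, K0, K1, led_spacing_m):
    """Estimate (R, t) such that camera 1's projection matrix is [R | t]
    relative to camera 0.

    pts0, pts1    : (N, 2) float arrays of matched LED image points, assumed
                    ordered so that rows 2i and 2i+1 are the two end LEDs of
                    the same wand observation (hypothetical convention, used
                    only to recover metric scale).
    K0, K1        : (3, 3) intrinsic matrices from a prior intrinsic
                    calibration (e.g. a printed or blinking pattern).
    led_spacing_m : known metric distance between the two end LEDs.
    """
    # Work in normalized image coordinates so the two cameras may have
    # different intrinsics.
    n0 = cv2.undistortPoints(pts0.reshape(-1, 1, 2).astype(np.float64), K0, None)
    n1 = cv2.undistortPoints(pts1.reshape(-1, 1, 2).astype(np.float64), K1, None)

    # Essential matrix with RANSAC, then decompose into a rotation and a
    # unit-norm translation direction.
    E, _ = cv2.findEssentialMat(n0, n1, np.eye(3), method=cv2.RANSAC,
                                prob=0.999, threshold=1e-3)
    _, R, t, _ = cv2.recoverPose(E, n0, n1, np.eye(3))

    # Triangulate the LED centers (projection matrices in normalized coords).
    P0 = np.hstack([np.eye(3), np.zeros((3, 1))])
    P1 = np.hstack([R, t])
    Xh = cv2.triangulatePoints(P0, P1, n0.reshape(-1, 2).T, n1.reshape(-1, 2).T)
    X = (Xh[:3] / Xh[3]).T                       # (N, 3) points, up to scale

    # Fix the metric scale from the known LED spacing on the wand.
    est = np.linalg.norm(X[0::2] - X[1::2], axis=1).mean()
    return R, t * (led_spacing_m / est)
```
A full multi-camera system would extend this pairwise estimate to all camera pairs and refine the poses jointly, for example with a bundle adjustment over the triangulated LED positions.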
Related papers
- Neural Real-Time Recalibration for Infrared Multi-Camera Systems [2.249916681499244]
There are no learning-free or neural techniques for real-time recalibration of infrared multi-camera systems.
We propose a neural network-based method capable of dynamic real-time calibration.
arXiv Detail & Related papers (2024-10-18T14:37:37Z)
- Deep Learning for Camera Calibration and Beyond: A Survey [100.75060862015945]
Camera calibration involves estimating camera parameters to infer geometric features from captured sequences.
Recent efforts show that learning-based solutions have the potential to replace the repetitive work of manual calibration.
arXiv Detail & Related papers (2023-03-19T04:00:05Z)
- Online Marker-free Extrinsic Camera Calibration using Person Keypoint Detections [25.393382192511716]
We propose a marker-free online method for the extrinsic calibration of multiple smart edge sensors.
Our method assumes the intrinsic camera parameters to be known and requires priming with a rough initial estimate of the camera poses.
We show that the calibration with our method achieves lower reprojection errors compared to a reference calibration generated by an offline method.
arXiv Detail & Related papers (2022-09-15T15:54:21Z)
- Self-Calibrating Neural Radiance Fields [68.64327335620708]
We jointly learn the geometry of the scene and the accurate camera parameters without any calibration objects.
Our camera model consists of a pinhole model, a fourth order radial distortion, and a generic noise model that can learn arbitrary non-linear camera distortions.
arXiv Detail & Related papers (2021-08-31T13:34:28Z)
- MetaPose: Fast 3D Pose from Multiple Views without 3D Supervision [72.5863451123577]
We show how to train a neural model that can perform accurate 3D pose and camera estimation.
Our method outperforms both classical bundle adjustment and weakly-supervised monocular 3D baselines.
arXiv Detail & Related papers (2021-08-10T18:39:56Z)
- Dynamic Event Camera Calibration [27.852239869987947]
We present the first dynamic event camera calibration algorithm.
It calibrates directly from events captured during relative motion between camera and calibration pattern.
As demonstrated through our results, the obtained calibration method is highly convenient and reliably calibrates from data sequences spanning less than 10 seconds.
arXiv Detail & Related papers (2021-07-14T14:52:58Z)
- How to Calibrate Your Event Camera [58.80418612800161]
We propose a generic event camera calibration framework using image reconstruction.
We show that neural-network-based image reconstruction is well suited for the task of intrinsic and extrinsic calibration of event cameras.
arXiv Detail & Related papers (2021-05-26T07:06:58Z)
- Infrastructure-based Multi-Camera Calibration using Radial Projections [117.22654577367246]
Pattern-based calibration techniques can be used to calibrate the intrinsics of the cameras individually.
Infrastructure-based calibration techniques are able to estimate the extrinsics using 3D maps pre-built via SLAM or Structure-from-Motion.
We propose to fully calibrate a multi-camera system from scratch using an infrastructure-based approach.
arXiv Detail & Related papers (2020-07-30T09:21:04Z)
- Lightweight Multi-View 3D Pose Estimation through Camera-Disentangled Representation [57.11299763566534]
We present a solution to recover 3D pose from multi-view images captured with spatially calibrated cameras.
We exploit 3D geometry to fuse input images into a unified latent representation of pose, which is disentangled from camera view-points.
Our architecture then conditions the learned representation on camera projection operators to produce accurate per-view 2D detections.
arXiv Detail & Related papers (2020-04-05T12:52:29Z)
- Multi-camera calibration with pattern rigs, including for non-overlapping cameras: CALICO [1.8874301050354767]
This paper describes CALICO, a method for multi-camera calibration suitable for challenging contexts.
CALICO is a pattern-based approach, where the multi-camera calibration problem is formulated using rigidity constraints between patterns and cameras.
arXiv Detail & Related papers (2019-03-15T21:35:13Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences arising from its use.