E-Calib: A Fast, Robust and Accurate Calibration Toolbox for Event
Cameras
- URL: http://arxiv.org/abs/2306.09078v1
- Date: Thu, 15 Jun 2023 12:16:38 GMT
- Title: E-Calib: A Fast, Robust and Accurate Calibration Toolbox for Event
Cameras
- Authors: Mohammed Salah, Abdulla Ayyad, Muhammad Humais, Daniel Gehrig,
Abdelqader Abusafieh, Lakmal Seneviratne, Davide Scaramuzza, and Yahya Zweiri
- Abstract summary: We present E-Calib, a novel, fast, robust, and accurate calibration toolbox for event cameras.
The proposed method is tested in a variety of rigorous experiments for different event camera models.
- Score: 34.71767308204867
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Event cameras triggered a paradigm shift in the computer vision community
delineated by their asynchronous nature, low latency, and high dynamic range.
Calibration of event cameras is essential to account for the sensor's
intrinsic parameters and for 3D perception. However, conventional image-based
calibration techniques are not applicable due to the asynchronous, binary
output of the sensor. The current standard for calibrating event cameras relies
on either blinking patterns or event-based image reconstruction algorithms.
These approaches are difficult to deploy in factory settings and are affected
by noise and artifacts that degrade calibration performance. To overcome these
limitations, we present E-Calib, a novel, fast, robust, and accurate
calibration toolbox for event cameras that utilizes the asymmetric circle grid, chosen for
its robustness to out-of-focus scenes. The proposed method is tested in a
variety of rigorous experiments for different event camera models, on circle
grids with different geometric properties, and under challenging illumination
conditions. The results show that our approach outperforms the state-of-the-art
in detection success rate, reprojection error, and estimation accuracy of
extrinsic parameters.
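The asymmetric circle grid used as the calibration target is a planar pattern whose odd rows are staggered by half a period, which keeps the circle centers distinguishable even when the scene is out of focus. As a minimal sketch of how such a target's 3D object points are typically generated for a calibration routine (the function name and the 4x11 geometry are illustrative assumptions, not taken from the paper):

```python
import numpy as np

def asymmetric_grid_points(cols, rows, spacing):
    """Planar 3D target points for an asymmetric circle grid.

    Odd rows are offset by one spacing unit along x, producing the
    staggered layout. The point ordering matches common detectors
    such as OpenCV's findCirclesGrid with CALIB_CB_ASYMMETRIC_GRID.
    """
    pts = [((2 * c + r % 2) * spacing, r * spacing, 0.0)
           for r in range(rows) for c in range(cols)]
    return np.array(pts, dtype=np.float32)

# A 4x11 grid with 20 mm spacing (dimensions are illustrative).
grid = asymmetric_grid_points(4, 11, 0.02)
print(grid.shape)  # (44, 3)
```

Paired with the detected circle centers in the event stream, these object points are what a standard intrinsic calibration (e.g. Zhang's method) consumes to estimate the camera matrix and distortion coefficients.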
Related papers
- EF-Calib: Spatiotemporal Calibration of Event- and Frame-Based Cameras Using Continuous-Time Trajectories [10.338905475270746]
Event camera offers promising prospects for fusion with frame-based cameras.
We present EF-Calib, a spatiotemporal calibration framework for systems that incorporate both event- and frame-based cameras.
arXiv Detail & Related papers (2024-05-27T15:40:24Z)
- EasyHeC: Accurate and Automatic Hand-eye Calibration via Differentiable Rendering and Space Exploration [49.90228618894857]
We introduce a new approach to hand-eye calibration called EasyHeC, which is markerless, white-box, and delivers superior accuracy and robustness.
We propose to use two key technologies: differentiable rendering-based camera pose optimization and consistency-based joint space exploration.
Our evaluation demonstrates superior performance in synthetic and real-world datasets.
arXiv Detail & Related papers (2023-05-02T03:49:54Z)
- Deep Learning for Camera Calibration and Beyond: A Survey [100.75060862015945]
Camera calibration involves estimating camera parameters to infer geometric features from captured sequences.
Recent efforts show that learning-based solutions have the potential to replace repetitive manual calibration work.
arXiv Detail & Related papers (2023-03-19T04:00:05Z)
- A Deep Perceptual Measure for Lens and Camera Calibration [35.03926427249506]
In place of the traditional multi-image calibration process, we propose to infer the camera calibration parameters directly from a single image.
We train this network using automatically generated samples from a large-scale panorama dataset.
We conduct a large-scale human perception study where we ask participants to judge the realism of 3D objects composited with correct and biased camera calibration parameters.
arXiv Detail & Related papers (2022-08-25T18:40:45Z)
- Lasers to Events: Automatic Extrinsic Calibration of Lidars and Event Cameras [67.84498757689776]
This paper presents the first direct calibration method between event cameras and lidars.
It removes dependencies on frame-based camera intermediaries and/or highly-accurate hand measurements.
arXiv Detail & Related papers (2022-07-03T11:05:45Z)
- ESL: Event-based Structured Light [62.77144631509817]
Event cameras are bio-inspired sensors providing significant advantages over standard cameras.
We propose a novel structured-light system using an event camera to tackle the problem of accurate and high-speed depth sensing.
arXiv Detail & Related papers (2021-11-30T15:47:39Z)
- Dynamic Event Camera Calibration [27.852239869987947]
We present the first dynamic event camera calibration algorithm.
It calibrates directly from events captured during relative motion between camera and calibration pattern.
As demonstrated by our results, the method is highly convenient and reliably calibrates from data sequences spanning less than 10 seconds.
arXiv Detail & Related papers (2021-07-14T14:52:58Z)
- How to Calibrate Your Event Camera [58.80418612800161]
We propose a generic event camera calibration framework using image reconstruction.
We show that neural-network-based image reconstruction is well suited for the task of intrinsic and extrinsic calibration of event cameras.
arXiv Detail & Related papers (2021-05-26T07:06:58Z)
- Learning Camera Miscalibration Detection [83.38916296044394]
This paper focuses on a data-driven approach to learn the detection of miscalibration in vision sensors, specifically RGB cameras.
Our contributions include a proposed miscalibration metric for RGB cameras and a novel semi-synthetic dataset generation pipeline based on this metric.
By training a deep convolutional neural network, we demonstrate the effectiveness of our pipeline to identify whether a recalibration of the camera's intrinsic parameters is required or not.
arXiv Detail & Related papers (2020-05-24T10:32:49Z)
- Superaccurate Camera Calibration via Inverse Rendering [0.19336815376402716]
We propose a new method for camera calibration using the principle of inverse rendering.
Instead of relying solely on detected feature points, we use an estimate of the internal parameters and the pose of the calibration object to implicitly render a non-photorealistic equivalent of the optical features.
arXiv Detail & Related papers (2020-03-20T10:26:16Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information it presents and is not responsible for any consequences of its use.