EF-Calib: Spatiotemporal Calibration of Event- and Frame-Based Cameras Using Continuous-Time Trajectories
- URL: http://arxiv.org/abs/2405.17278v2
- Date: Wed, 25 Sep 2024 03:59:55 GMT
- Title: EF-Calib: Spatiotemporal Calibration of Event- and Frame-Based Cameras Using Continuous-Time Trajectories
- Authors: Shaoan Wang, Zhanhua Xin, Yaoqing Hu, Dongyue Li, Mingzhu Zhu, Junzhi Yu
- Abstract summary: Event cameras offer promising prospects for fusion with frame-based cameras.
We present EF-Calib, a spatiotemporal calibration framework for stereo vision systems that incorporate both event- and frame-based cameras.
- Score: 10.338905475270746
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: The event camera, a bio-inspired asynchronously triggered camera, offers promising prospects for fusion with frame-based cameras owing to its low latency and high dynamic range. However, calibrating stereo vision systems that incorporate both event and frame-based cameras remains a significant challenge. In this letter, we present EF-Calib, a spatiotemporal calibration framework for event- and frame-based cameras using continuous-time trajectories. A novel calibration pattern applicable to both camera types, together with a corresponding event recognition algorithm, is proposed. Leveraging the asynchronous nature of events, a differentiable piece-wise B-spline that represents the camera pose continuously is introduced, enabling joint calibration of intrinsic parameters, extrinsic parameters, and time offset, with analytical Jacobians provided. Various experiments are carried out to evaluate the calibration performance of EF-Calib, covering intrinsic parameters, extrinsic parameters, and time offset. Experimental results show that EF-Calib achieves the most accurate intrinsic parameters among current state-of-the-art methods, extrinsic parameter accuracy close to that of frame-based calibration, and accurate time offset estimation. EF-Calib provides a convenient and accurate toolbox for calibrating systems that fuse events and frames. The code of this paper will also be open-sourced at: https://github.com/wsakobe/EF-Calib.
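The continuous-time trajectory idea in the abstract can be illustrated with a minimal sketch. The snippet below is an illustration only, not the authors' implementation: the function name `eval_spline`, the control-point layout, and the way the time offset is applied are all assumptions. It evaluates a uniform cubic B-spline at an arbitrary timestamp, which is what allows asynchronous event timestamps and frame timestamps (shifted by the estimated time offset) to be queried against the same continuous pose curve:

```python
import numpy as np

# Uniform cubic B-spline blending matrix, for the power basis [1, u, u^2, u^3].
M = (1.0 / 6.0) * np.array([
    [ 1.0,  4.0,  1.0, 0.0],
    [-3.0,  0.0,  3.0, 0.0],
    [ 3.0, -6.0,  3.0, 0.0],
    [-1.0,  3.0, -3.0, 1.0],
])

def eval_spline(ctrl, knot_dt, t0, t):
    """Evaluate a uniform cubic B-spline trajectory at time t.

    ctrl:    (N, D) control points, spaced knot_dt seconds apart from t0.
    Returns the interpolated D-dimensional value. For calibration, t would
    be an event or frame timestamp plus the current time-offset estimate.
    """
    s = (t - t0) / knot_dt
    i = int(np.floor(s))                  # index of the active spline segment
    i = min(max(i, 0), len(ctrl) - 4)     # clamp to valid 4-point windows
    u = s - i                             # normalized time within the segment
    weights = np.array([1.0, u, u**2, u**3]) @ M   # four blending weights
    return weights @ ctrl[i:i + 4]        # blend the local control points
```

Because the evaluated value is an analytic polynomial in `u`, its Jacobians with respect to the control points and the time offset are available in closed form, which is what makes the joint optimization over intrinsics, extrinsics, and time offset tractable. (A full pose spline would interpolate on SE(3), e.g. with cumulative B-splines, rather than on a vector space as sketched here.)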
Related papers
- E-Calib: A Fast, Robust and Accurate Calibration Toolbox for Event Cameras [18.54225086007182]
We present E-Calib, a novel, fast, robust, and accurate calibration toolbox for event cameras.
The proposed method is tested in a variety of rigorous experiments for different event camera models.
arXiv Detail & Related papers (2023-06-15T12:16:38Z) - GPU-accelerated SIFT-aided source identification of stabilized videos [63.084540168532065]
We exploit the parallelization capabilities of Graphics Processing Units (GPUs) in the framework of stabilised frames inversion.
We propose to exploit SIFT features to estimate the camera momentum and to identify less stabilized temporal segments.
Experiments confirm the effectiveness of the proposed approach in reducing the required computational time and improving the source identification accuracy.
arXiv Detail & Related papers (2022-07-29T07:01:31Z) - SST-Calib: Simultaneous Spatial-Temporal Parameter Calibration between LIDAR and Camera [26.59231069298659]
A segmentation-based framework is proposed to jointly estimate the geometrical and temporal parameters in the calibration of a camera-LIDAR suite.
The proposed algorithm is tested on the KITTI dataset, and the result shows an accurate real-time calibration of both geometric and temporal parameters.
arXiv Detail & Related papers (2022-07-08T06:21:52Z) - Learning Adaptive Warping for Real-World Rolling Shutter Correction [52.45689075940234]
This paper proposes the first real-world rolling shutter (RS) correction dataset, BS-RSC, and a corresponding model to correct the RS frames in a distorted video.
Mobile devices in the consumer market with CMOS-based sensors for video capture often result in rolling shutter effects when relative movements occur during the video acquisition process.
arXiv Detail & Related papers (2022-04-29T05:13:50Z) - Bringing Rolling Shutter Images Alive with Dual Reversed Distortion [75.78003680510193]
Rolling shutter (RS) distortion can be interpreted as the result of picking a row of pixels from instant global shutter (GS) frames over time.
We develop a novel end-to-end model, IFED, to generate dual optical flow sequence through iterative learning of the velocity field during the RS time.
arXiv Detail & Related papers (2022-03-12T14:57:49Z) - Asynchronous Optimisation for Event-based Visual Odometry [53.59879499700895]
Event cameras open up new possibilities for robotic perception due to their low latency and high dynamic range.
We focus on event-based visual odometry (VO)
We propose an asynchronous structure-from-motion optimisation back-end.
arXiv Detail & Related papers (2022-03-02T11:28:47Z) - Continuous-Time Spatiotemporal Calibration of a Rolling Shutter Camera-IMU System [8.201100713224003]
The rolling shutter (RS) mechanism is widely used by consumer-grade cameras, which are essential parts of smartphones and autonomous vehicles.
This work takes the camera-IMU system as an example and looks into the RS effect on its temporal calibration.
arXiv Detail & Related papers (2021-08-16T16:09:22Z) - Dynamic Event Camera Calibration [27.852239869987947]
We present the first dynamic event camera calibration algorithm.
It calibrates directly from events captured during relative motion between camera and calibration pattern.
As demonstrated through our results, the obtained calibration method is highly convenient and reliably calibrates from data sequences spanning less than 10 seconds.
arXiv Detail & Related papers (2021-07-14T14:52:58Z) - How to Calibrate Your Event Camera [58.80418612800161]
We propose a generic event camera calibration framework using image reconstruction.
We show that neural-network-based image reconstruction is well suited for the task of intrinsic and extrinsic calibration of event cameras.
arXiv Detail & Related papers (2021-05-26T07:06:58Z) - Self-Calibration Supported Robust Projective Structure-from-Motion [80.15392629310507]
We propose a unified Structure-from-Motion (SfM) method, in which the matching process is supported by self-calibration constraints.
We show experimental results demonstrating robust multiview matching and accurate camera calibration by exploiting these constraints.
arXiv Detail & Related papers (2020-07-04T08:47:10Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.