Continuous-Time Spatiotemporal Calibration of a Rolling Shutter
Camera---IMU System
- URL: http://arxiv.org/abs/2108.07200v1
- Date: Mon, 16 Aug 2021 16:09:22 GMT
- Title: Continuous-Time Spatiotemporal Calibration of a Rolling Shutter
Camera---IMU System
- Authors: Jianzhu Huai, Yuan Zhuang, Qicheng Yuan, Yukai Lin
- Abstract summary: The rolling shutter (RS) mechanism is widely used by consumer-grade cameras, which are essential parts in smartphones and autonomous vehicles.
This work takes the camera-IMU system as an example and looks into the RS effect on its spatiotemporal calibration.
- Score: 8.201100713224003
- License: http://creativecommons.org/licenses/by-nc-sa/4.0/
- Abstract: The rolling shutter (RS) mechanism is widely used by consumer-grade cameras,
which are essential parts in smartphones and autonomous vehicles. The RS effect
leads to image distortion upon relative motion between a camera and the scene.
This effect needs to be considered in video stabilization, structure from
motion, and vision-aided odometry, for which recent studies have improved
earlier global shutter (GS) methods by accounting for the RS effect. However,
it is still unclear how the RS affects spatiotemporal calibration of the camera
in a sensor assembly, which is crucial to good performance in aforementioned
applications.
This work takes the camera-IMU system as an example and looks into the RS
effect on its spatiotemporal calibration. To this end, we develop a calibration
method for a RS-camera-IMU system with continuous-time B-splines by using a
calibration target. Unlike in calibrating GS cameras, every observation of a
landmark on the target has a unique camera pose fitted by continuous-time
B-splines. With simulated data generated from four sets of public calibration
data, we show that RS can noticeably affect the extrinsic parameters, causing
errors of about 1$^\circ$ in orientation and 2 cm in translation with an RS
setting as in common smartphone cameras. With real data collected by two
industrial camera-IMU systems, we find that considering the RS effect gives
more accurate and consistent spatiotemporal calibration. Moreover, our method
also accurately calibrates the inter-line delay of the RS. The code for
simulation and calibration is publicly available.
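The key difference from global-shutter calibration is that each image row has its own capture time, so every landmark observation must be paired with a pose queried from the continuous-time trajectory at that row's timestamp. A minimal sketch of this idea (the paper fits cubic B-splines; here linear interpolation between trajectory knots is a stand-in, and the line-delay value is a hypothetical smartphone-like number, not taken from the paper):

```python
import numpy as np

def row_capture_time(frame_time, row, line_delay):
    """Capture time of an image row under a rolling shutter.

    Unlike a global shutter, each row is exposed at its own instant:
    t_row = t_frame + row * line_delay."""
    return frame_time + row * line_delay

def pose_at_time(t, knot_times, knot_positions):
    """Toy continuous-time trajectory: linearly interpolate the camera
    translation between knots (a stand-in for the cubic B-spline fit)."""
    return np.array([np.interp(t, knot_times, knot_positions[:, k])
                     for k in range(knot_positions.shape[1])])

# Hypothetical setup: ~30 us inter-line delay, camera translating in x.
line_delay = 30e-6
knot_times = np.array([0.0, 0.02, 0.04])
knot_positions = np.array([[0.00, 0, 0],
                           [0.01, 0, 0],
                           [0.02, 0, 0]])

t0 = row_capture_time(0.0, 0, line_delay)    # first row of the frame
t1 = row_capture_time(0.0, 480, line_delay)  # row 480, ~14.4 ms later
p0 = pose_at_time(t0, knot_times, knot_positions)
p1 = pose_at_time(t1, knot_times, knot_positions)
print(p1 - p0)  # nonzero: observations in the same frame get distinct poses
```

Under motion, the pose difference across rows of a single frame is what a GS model silently ignores and what this calibration method accounts for.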
Related papers
- EF-Calib: Spatiotemporal Calibration of Event- and Frame-Based Cameras Using Continuous-Time Trajectories [10.338905475270746]
Event cameras offer promising prospects for fusion with frame-based cameras.
We present EF-Calib, a framework for spatiotemporal calibration of stereo vision systems that incorporate both event- and frame-based cameras.
arXiv Detail & Related papers (2024-05-27T15:40:24Z) - E-Calib: A Fast, Robust and Accurate Calibration Toolbox for Event Cameras [18.54225086007182]
We present E-Calib, a novel, fast, robust, and accurate calibration toolbox for event cameras.
The proposed method is tested in a variety of rigorous experiments for different event camera models.
arXiv Detail & Related papers (2023-06-15T12:16:38Z) - Rolling Shutter Inversion: Bring Rolling Shutter Images to High
Framerate Global Shutter Video [111.08121952640766]
This paper presents a novel deep-learning based solution to the RS temporal super-resolution problem.
By leveraging the multi-view geometry relationship of the RS imaging process, our framework successfully achieves high framerate GS generation.
Our method can produce high-quality GS image sequences with rich details, outperforming the state-of-the-art methods.
arXiv Detail & Related papers (2022-10-06T16:47:12Z) - Learning Adaptive Warping for Real-World Rolling Shutter Correction [52.45689075940234]
This paper proposes the first real-world rolling shutter (RS) correction dataset, BS-RSC, and a corresponding model to correct the RS frames in a distorted video.
Mobile devices in the consumer market with CMOS-based sensors for video capture often result in rolling shutter effects when relative movements occur during the video acquisition process.
arXiv Detail & Related papers (2022-04-29T05:13:50Z) - Bringing Rolling Shutter Images Alive with Dual Reversed Distortion [75.78003680510193]
Rolling shutter (RS) distortion can be interpreted as the result of picking a row of pixels from instant global shutter (GS) frames over time.
We develop a novel end-to-end model, IFED, to generate dual optical flow sequence through iterative learning of the velocity field during the RS time.
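The row-picking interpretation above can be made concrete: an RS image can be synthesized from a stack of instantaneous GS frames by copying row r from the frame closest in time to t0 + r * line_delay. A short illustrative sketch (frame count, timings, and the moving-edge scene are all made up for demonstration):

```python
import numpy as np

def simulate_rolling_shutter(gs_frames, frame_times, t0, line_delay):
    """Synthesize one RS image from a stack of global-shutter frames:
    row r is copied from the GS frame nearest in time to t0 + r * line_delay."""
    n_frames, height, width = gs_frames.shape
    rs = np.empty((height, width), dtype=gs_frames.dtype)
    for r in range(height):
        t = t0 + r * line_delay
        idx = int(np.argmin(np.abs(frame_times - t)))  # nearest GS frame
        rs[r] = gs_frames[idx, r]
    return rs

# Hypothetical scene: a bright vertical edge moving one column per GS frame.
height, width = 8, 8
frame_times = np.array([0.0, 0.001, 0.002, 0.003])
gs_frames = np.zeros((4, height, width), dtype=np.uint8)
for i in range(4):
    gs_frames[i, :, i] = 255  # edge at column i in frame i

rs = simulate_rolling_shutter(gs_frames, frame_times, t0=0.0, line_delay=0.0004)
print(np.argmax(rs, axis=1))  # edge skews across rows: [0 0 1 1 2 2 2 3]
```

The vertical edge comes out slanted in the synthesized image, which is exactly the RS distortion that inversion methods try to undo.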
arXiv Detail & Related papers (2022-03-12T14:57:49Z) - How to Calibrate Your Event Camera [58.80418612800161]
We propose a generic event camera calibration framework using image reconstruction.
We show that neural-network-based image reconstruction is well suited for the task of intrinsic and extrinsic calibration of event cameras.
arXiv Detail & Related papers (2021-05-26T07:06:58Z) - CoMo: A novel co-moving 3D camera system [0.0]
CoMo is a co-moving camera system of two synchronized high speed cameras coupled with rotational stages.
We address the calibration of the external parameters measuring the position of the cameras and their three angles of yaw, pitch and roll in the system "home" configuration.
We evaluate the robustness and accuracy of the system by comparing reconstructed and measured 3D distances in what we call 3D tests, which show a relative error of the order of 1%.
arXiv Detail & Related papers (2021-01-26T13:29:13Z) - Infrastructure-based Multi-Camera Calibration using Radial Projections [117.22654577367246]
Pattern-based calibration techniques can be used to calibrate the intrinsics of the cameras individually.
Infrastructure-based calibration techniques are able to estimate the extrinsics using 3D maps pre-built via SLAM or Structure-from-Motion.
We propose to fully calibrate a multi-camera system from scratch using an infrastructure-based approach.
arXiv Detail & Related papers (2020-07-30T09:21:04Z) - Learning Camera Miscalibration Detection [83.38916296044394]
This paper focuses on a data-driven approach to learn the detection of miscalibration in vision sensors, specifically RGB cameras.
Our contributions include a proposed miscalibration metric for RGB cameras and a novel semi-synthetic dataset generation pipeline based on this metric.
By training a deep convolutional neural network, we demonstrate the effectiveness of our pipeline to identify whether a recalibration of the camera's intrinsic parameters is required or not.
arXiv Detail & Related papers (2020-05-24T10:32:49Z) - Superaccurate Camera Calibration via Inverse Rendering [0.19336815376402716]
We propose a new method for camera calibration using the principle of inverse rendering.
Instead of relying solely on detected feature points, we use an estimate of the internal parameters and the pose of the calibration object to implicitly render a non-photorealistic equivalent of the optical features.
arXiv Detail & Related papers (2020-03-20T10:26:16Z)
This list is automatically generated from the titles and abstracts of the papers in this site.