OSPC: Online Sequential Photometric Calibration
- URL: http://arxiv.org/abs/2305.17673v2
- Date: Thu, 6 Jul 2023 12:08:24 GMT
- Title: OSPC: Online Sequential Photometric Calibration
- Authors: Jawad Haidar, Douaa Khalil, Daniel Asmar
- Abstract summary: Photometric calibration is essential to many computer vision applications.
We propose a novel method that solves for photometric parameters using a sequential estimation approach.
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Photometric calibration is essential to many computer vision applications.
One of its key benefits is enhancing the performance of Visual SLAM, especially
when it depends on a direct method for tracking, such as the standard KLT
algorithm. Another advantage could be in retrieving the sensor irradiance
values from measured intensities, as a pre-processing step for some vision
algorithms, such as shape-from-shading. Current photometric calibration systems
rely on a joint optimization problem and encounter an ambiguity in the
estimates, which can only be resolved using ground truth information. We
propose a novel method that solves for photometric parameters using a
sequential estimation approach. Our proposed method achieves high accuracy in
estimating all parameters; furthermore, the formulations are linear and convex,
which makes the solution fast and suitable for online applications. Experiments
on a Visual Odometry system validate the proposed method and demonstrate its
advantages.
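For intuition on what a sequential, linear photometric-calibration step can look like, below is a minimal sketch: estimating per-frame exposure from the intensities of tracked scene points, assuming the camera response has already been linearized and vignetting is negligible. This is an illustrative simplification, not the OSPC formulation from the paper; the function name and interface are hypothetical.

```python
import numpy as np

def estimate_log_exposure_ratio(intensities_prev, intensities_curr, eps=1e-6):
    """Estimate log(e_curr / e_prev), the log ratio of exposures between two
    consecutive frames, from intensities of the same tracked scene points.

    Hypothetical helper for illustration only. Assumes a linearized response
    and negligible vignetting, so a static point obeys I_k = e_k * B with
    constant irradiance B. Taking logs gives a linear model; its least-squares
    solution is the mean of per-point log differences, and a median is used
    here for robustness to tracking outliers.
    """
    I_prev = np.asarray(intensities_prev, dtype=float)
    I_curr = np.asarray(intensities_curr, dtype=float)
    valid = (I_prev > eps) & (I_curr > eps)
    return float(np.median(np.log(I_curr[valid]) - np.log(I_prev[valid])))


# Example: accumulate exposures sequentially over a short synthetic sequence.
rng = np.random.default_rng(0)
irradiance = rng.uniform(0.1, 1.0, size=200)   # per-point scene irradiance
true_exposures = [1.0, 1.3, 0.9, 1.1]          # per-frame exposure factors
frames = [e * irradiance for e in true_exposures]

log_e = [0.0]  # first-frame exposure fixed to 1 just to anchor the scale in this toy example
for prev, curr in zip(frames[:-1], frames[1:]):
    log_e.append(log_e[-1] + estimate_log_exposure_ratio(prev, curr))

print(np.exp(log_e))  # ~ [1.0, 1.3, 0.9, 1.1] up to the fixed first-frame scale
```

Each step above is a one-dimensional linear (and hence convex) problem solved in closed form, which is the kind of property that makes a sequential scheme fast enough for online use; the actual method in the paper estimates the full set of photometric parameters, not exposure alone.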
Related papers
- Generalizable Non-Line-of-Sight Imaging with Learnable Physical Priors (2024-09-21)
  Non-line-of-sight (NLOS) imaging has attracted increasing attention due to its potential applications.
  Existing NLOS reconstruction approaches are constrained by their reliance on empirical physical priors.
  We introduce a novel learning-based solution comprising two key designs: Learnable Path Compensation (LPC) and Adaptive Phasor Field (APF).
- YOCO: You Only Calibrate Once for Accurate Extrinsic Parameter in LiDAR-Camera Systems (2024-07-25)
  In a multi-sensor fusion system composed of cameras and LiDAR, precise extrinsic calibration contributes to the system's long-term stability and accurate perception of the environment.
  This paper proposes a novel fully automatic extrinsic calibration method for LiDAR-camera systems that circumvents the need for corresponding point registration.
- SPARE: Symmetrized Point-to-Plane Distance for Robust Non-Rigid Registration (2024-05-30)
  We propose SPARE, a novel formulation that utilizes a symmetrized point-to-plane distance for robust non-rigid registration.
  The proposed method greatly improves the accuracy of non-rigid registration problems and maintains relatively high solution efficiency.
- Joint Spatial-Temporal Calibration for Camera and Global Pose Sensor (2024-03-01)
  In robotics, motion capture systems have been widely used to measure the accuracy of localization algorithms.
  These functionalities require accurate and reliable spatial-temporal calibration parameters between the camera and the global pose sensor.
  In this study, we provide two novel solutions to estimate these calibration parameters.
- Fixation-based Self-calibration for Eye Tracking in VR Headsets (2023-11-01)
  The proposed method is based on the assumption that the user's viewpoint can move freely.
  Fixations are first detected from the time-series data of uncalibrated gaze directions.
  The calibration parameters are optimized by minimizing the sum of dispersion metrics of the PoRs.
- An Adaptive Framework for Learning Unsupervised Depth Completion (2021-06-06)
  We present a method to infer a dense depth map from a color image and associated sparse depth measurements.
  We show that regularization and co-visibility are related via the fitness of the model to data and can be unified into a single framework.
- Leveraging Spatial and Photometric Context for Calibrated Non-Lambertian Photometric Stereo (2021-03-22)
  We introduce an efficient fully-convolutional architecture that can leverage both spatial and photometric context simultaneously.
  Using separable 4D convolutions and 2D heat-maps reduces the model size and makes it more efficient.
- Automatic Extrinsic Calibration Method for LiDAR and Camera Sensor Setups (2021-01-12)
  We present a method to calibrate the parameters of any pair of sensors involving LiDARs, monocular or stereo cameras.
  The proposed approach can handle devices with very different resolutions and poses, as usually found in vehicle setups.
- Pushing the Envelope of Rotation Averaging for Visual SLAM (2020-11-02)
  We propose a novel optimization backbone for visual SLAM systems.
  We leverage rotation averaging to improve the accuracy, efficiency, and robustness of conventional monocular SLAM systems.
  Our approach can be up to 10x faster than the state of the art on public benchmarks, with comparable accuracy.
- SOIC: Semantic Online Initialization and Calibration for LiDAR and Camera (2020-03-09)
  This paper presents a novel semantic-based online calibration approach, SOIC, for LiDAR and camera sensors.
  We evaluate the proposed method with either ground-truth or predicted semantics on the KITTI dataset.
- A Two-step Calibration Method for Unfocused Light Field Camera Based on Projection Model Analysis (2020-01-11)
  The proposed method is able to reuse traditional camera calibration methods for the direction parameter set.
  The accuracy and robustness of the proposed method outperform its counterparts under various benchmark criteria.