MBA-VO: Motion Blur Aware Visual Odometry
- URL: http://arxiv.org/abs/2103.13684v1
- Date: Thu, 25 Mar 2021 09:02:56 GMT
- Title: MBA-VO: Motion Blur Aware Visual Odometry
- Authors: Peidong Liu, Xingxing Zuo, Viktor Larsson and Marc Pollefeys
- Abstract summary: Motion blur is one of the major challenges remaining for visual odometry methods.
In low-light conditions where longer exposure times are necessary, motion blur can appear even for relatively slow camera motions.
We present a novel hybrid visual odometry pipeline with a direct approach that explicitly models and estimates the camera's local trajectory within the exposure time.
- Score: 99.56896875807635
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Motion blur is one of the major challenges remaining for visual odometry
methods. In low-light conditions where longer exposure times are necessary,
motion blur can appear even for relatively slow camera motions. In this paper
we present a novel hybrid visual odometry pipeline with a direct approach that
explicitly models and estimates the camera's local trajectory within the
exposure time. This allows us to actively compensate for any motion blur that
occurs due to the camera motion. In addition, we also contribute a novel
benchmarking dataset for motion blur aware visual odometry. In experiments we
show that by directly modeling the image formation process, we are able to
improve the robustness of visual odometry while maintaining accuracy comparable
to that achieved on images without motion blur.
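As a rough sketch of the image formation model that such a blur-aware direct approach builds on (the exact parameterization used in the paper may differ), a blurred observation B can be written as the temporal average of virtual sharp images rendered along the camera's intra-exposure trajectory T(t):

\[
  B(\mathbf{x}) \;=\; \frac{1}{\tau}\int_{0}^{\tau} I\big(\mathbf{x};\,\mathbf{T}(t)\big)\,\mathrm{d}t
  \;\approx\; \frac{1}{n}\sum_{i=1}^{n} I\big(\mathbf{x};\,\mathbf{T}(t_i)\big),
\]

where \(\tau\) is the exposure time, \(\mathbf{T}(t)\) is the camera pose at time \(t\) within the exposure (for instance, interpolated on SE(3) between the poses at the start and end of the exposure), and \(I(\mathbf{x};\mathbf{T})\) is the virtual sharp image rendered at pose \(\mathbf{T}\). A blur-aware direct objective can then minimize the photometric error between this synthesized blurred image and the observed one with respect to the trajectory parameters, rather than assuming a single pose per frame.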
Related papers
- Virtual Inverse Perspective Mapping for Simultaneous Pose and Motion
Estimation [5.199765487172328]
We propose an automatic method for pose and motion estimation against a ground surface for a monocular camera mounted on a ground-moving robot.
The framework adopts a semi-dense approach that benefits from both a feature-based method and an image-registration-based method.
arXiv Detail & Related papers (2023-03-09T11:45:00Z)
- Event-aided Direct Sparse Odometry [54.602311491827805]
We introduce EDS, a direct monocular visual odometry using events and frames.
Our algorithm leverages the event generation model to track the camera motion in the blind time between frames.
EDS is the first method to perform 6-DOF VO using events and frames with a direct approach.
arXiv Detail & Related papers (2022-04-15T20:40:29Z)
- Learning Spatially Varying Pixel Exposures for Motion Deblurring [49.07867902677453]
We present a novel approach of leveraging spatially varying pixel exposures for motion deblurring.
Our work illustrates the promising role that focal-plane sensor-processors can play in the future of computational imaging.
arXiv Detail & Related papers (2022-04-14T23:41:49Z)
- Motion-from-Blur: 3D Shape and Motion Estimation of Motion-blurred
Objects in Videos [115.71874459429381]
We propose a method for jointly estimating the 3D motion, 3D shape, and appearance of highly motion-blurred objects from a video.
Experiments on benchmark datasets demonstrate that our method outperforms previous methods for fast moving object deblurring and 3D reconstruction.
arXiv Detail & Related papers (2021-11-29T11:25:14Z)
- Spatiotemporal Registration for Event-based Visual Odometry [40.02502611087858]
A useful application of event sensing is visual odometry, especially in settings that require high-temporal resolution.
We propose spatiotemporal registration as a compelling technique for event-based rotational motion estimation.
We also contribute a new event dataset for visual odometry, where motion sequences with large velocity variations were acquired using a high-precision robot arm.
arXiv Detail & Related papers (2021-03-10T09:23:24Z)
- Event-based Motion Segmentation with Spatio-Temporal Graph Cuts [51.17064599766138]
We have developed a method to identify independently moving objects acquired with an event-based camera.
The method performs on par or better than the state of the art without having to predetermine the number of expected moving objects.
arXiv Detail & Related papers (2020-12-16T04:06:02Z)
- Exposure Trajectory Recovery from Motion Blur [90.75092808213371]
Motion blur in dynamic scenes is an important yet challenging research topic.
In this paper, we define exposure trajectories, which represent the motion information contained in a blurry image.
A novel motion offset estimation framework is proposed to model pixel-wise displacements of the latent sharp image.
arXiv Detail & Related papers (2020-10-06T05:23:33Z)
This list is automatically generated from the titles and abstracts of the papers on this site.