Spatiotemporal Registration for Event-based Visual Odometry
- URL: http://arxiv.org/abs/2103.05955v1
- Date: Wed, 10 Mar 2021 09:23:24 GMT
- Title: Spatiotemporal Registration for Event-based Visual Odometry
- Authors: Daqi Liu, Alvaro Parra and Tat-Jun Chin
- Abstract summary: A useful application of event sensing is visual odometry, especially in settings that require high temporal resolution.
We propose spatiotemporal registration as a compelling technique for event-based rotational motion estimation.
We also contribute a new event dataset for visual odometry, where motion sequences with large velocity variations were acquired using a high-precision robot arm.
- Score: 40.02502611087858
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: A useful application of event sensing is visual odometry, especially in
settings that require high temporal resolution. The state-of-the-art method of
contrast maximisation recovers the motion from a batch of events by maximising
the contrast of the image of warped events. However, the cost scales with image
resolution and the temporal resolution can be limited by the need for large
batch sizes to yield sufficient structure in the contrast image. In this work,
we propose spatiotemporal registration as a compelling technique for
event-based rotational motion estimation. We theoretically justify the approach
and establish its fundamental and practical advantages over contrast
maximisation. In particular, spatiotemporal registration also produces feature
tracks as a by-product, which directly supports an efficient visual odometry
pipeline with graph-based optimisation for motion averaging. The simplicity of
our visual odometry pipeline allows it to process more than 1M events/second.
We also contribute a new event dataset for visual odometry, where motion
sequences with large velocity variations were acquired using a high-precision
robot arm.
Related papers
- ESVO2: Direct Visual-Inertial Odometry with Stereo Event Cameras [33.81592783496106]
Event-based visual odometry aims at solving tracking and mapping sub-problems in parallel.
We build an event-based stereo visual-inertial odometry system on top of our previous direct pipeline Event-based Stereo Visual Odometry.
arXiv Detail & Related papers (2024-10-12T05:35:27Z) - Recovering Continuous Scene Dynamics from A Single Blurry Image with
Events [58.7185835546638]
An Implicit Video Function (IVF) is learned to represent a single motion-blurred image with concurrent events.
A dual attention transformer is proposed to efficiently leverage merits from both modalities.
The proposed network is trained only with the supervision of ground-truth images at a limited set of reference timestamps.
arXiv Detail & Related papers (2023-04-05T18:44:17Z) - Asynchronous Optimisation for Event-based Visual Odometry [53.59879499700895]
Event cameras open up new possibilities for robotic perception due to their low latency and high dynamic range.
We focus on event-based visual odometry (VO).
We propose an asynchronous structure-from-motion optimisation back-end.
arXiv Detail & Related papers (2022-03-02T11:28:47Z) - Visual Odometry with an Event Camera Using Continuous Ray Warping and
Volumetric Contrast Maximization [31.627936023222052]
We present a new solution to tracking and mapping with an event camera.
The motion of the camera contains both rotation and translation, and the displacements happen in an arbitrarily structured environment.
We introduce a new solution to this problem by performing contrast maximization in 3D.
The practical validity of our approach is supported by an application to AGV motion estimation and 3D reconstruction with a single vehicle-mounted event camera.
arXiv Detail & Related papers (2021-07-07T04:32:57Z) - MBA-VO: Motion Blur Aware Visual Odometry [99.56896875807635]
Motion blur is one of the major challenges remaining for visual odometry methods.
In low-light conditions where longer exposure times are necessary, motion blur can appear even for relatively slow camera motions.
We present a novel hybrid visual odometry pipeline with a direct approach that explicitly models and estimates the camera's local trajectory within the exposure time.
arXiv Detail & Related papers (2021-03-25T09:02:56Z) - Event-based Motion Segmentation with Spatio-Temporal Graph Cuts [51.17064599766138]
We have developed a method to identify independently moving objects acquired with an event-based camera.
The method performs on par or better than the state of the art without having to predetermine the number of expected moving objects.
arXiv Detail & Related papers (2020-12-16T04:06:02Z) - Learning Monocular Dense Depth from Events [53.078665310545745]
Event cameras output brightness changes in the form of a stream of asynchronous events instead of intensity frames.
Recent learning-based approaches have been applied to event-based data, such as monocular depth prediction.
We propose a recurrent architecture to solve this task and show significant improvement over standard feed-forward methods.
arXiv Detail & Related papers (2020-10-16T12:36:23Z) - Single Image Optical Flow Estimation with an Event Camera [38.92408855196647]
Event cameras are bio-inspired sensors that report intensity changes at microsecond resolution.
We propose an optical flow estimation approach based on a single (potentially blurred) image and events.
arXiv Detail & Related papers (2020-04-01T11:28:30Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the generated content (including all information) and is not responsible for any consequences.