Spatiotemporal Registration for Event-based Visual Odometry
- URL: http://arxiv.org/abs/2103.05955v1
- Date: Wed, 10 Mar 2021 09:23:24 GMT
- Title: Spatiotemporal Registration for Event-based Visual Odometry
- Authors: Daqi Liu, Alvaro Parra, and Tat-Jun Chin
- Abstract summary: A useful application of event sensing is visual odometry, especially in settings that require high temporal resolution.
We propose spatiotemporal registration as a compelling technique for event-based rotational motion estimation.
We also contribute a new event dataset for visual odometry, where motion sequences with large velocity variations were acquired using a high-precision robot arm.
- Score: 40.02502611087858
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: A useful application of event sensing is visual odometry, especially in
settings that require high temporal resolution. The state-of-the-art method of
contrast maximisation recovers the motion from a batch of events by maximising
the contrast of the image of warped events. However, the cost scales with image
resolution and the temporal resolution can be limited by the need for large
batch sizes to yield sufficient structure in the contrast image. In this work,
we propose spatiotemporal registration as a compelling technique for
event-based rotational motion estimation. We theoretically justify the approach
and establish its fundamental and practical advantages over contrast
maximisation. In particular, spatiotemporal registration also produces feature
tracks as a by-product, which directly supports an efficient visual odometry
pipeline with graph-based optimisation for motion averaging. The simplicity of
our visual odometry pipeline allows it to process more than 1 M events/second.
We also contribute a new event dataset for visual odometry, where motion
sequences with large velocity variations were acquired using a high-precision
robot arm.
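To make the contrast-maximisation baseline described in the abstract concrete, the following is a minimal sketch, not the paper's spatiotemporal-registration method. It simplifies the rotational warp to a constant 2D image-plane velocity and uses made-up synthetic events: warping a batch of events to a common reference time and scoring the candidate motion by the variance (contrast) of the resulting event image. All names (`contrast`, the synthetic edge data) are illustrative assumptions.

```python
import numpy as np

def contrast(events, velocity, img_size=(64, 64)):
    """Warp events (x, y, t) to t = 0 under a candidate image-plane
    velocity, accumulate them into an image, and return its variance.
    Higher contrast means the warp better compensates the motion.
    NOTE: a 2D translational simplification of the rotational warp."""
    xy, t = events[:, :2], events[:, 2]
    warped = xy - t[:, None] * velocity            # warp events back to t = 0
    ix = np.clip(warped[:, 0].round().astype(int), 0, img_size[1] - 1)
    iy = np.clip(warped[:, 1].round().astype(int), 0, img_size[0] - 1)
    img = np.zeros(img_size)
    np.add.at(img, (iy, ix), 1.0)                  # event count per pixel
    return img.var()

# Synthetic events: a vertical edge at x = 20 moving right at 5 px/s.
rng = np.random.default_rng(0)
t = rng.uniform(0.0, 1.0, 500)
x = 20.0 + 5.0 * t
y = rng.uniform(10.0, 50.0, 500)
events = np.stack([x, y, t], axis=1)

# The true velocity collapses the edge to one column, maximising contrast.
c_true = contrast(events, np.array([5.0, 0.0]))
c_zero = contrast(events, np.array([0.0, 0.0]))
assert c_true > c_zero
```

A real implementation would search over angular velocity with the full rotational warp; as the abstract notes, the cost of evaluating this objective scales with image resolution, which motivates the paper's registration-based alternative.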
Related papers
- Event-based Motion Deblurring via Multi-Temporal Granularity Fusion [5.58706910566768]
Event cameras, bio-inspired sensors offering continuous visual information, could enhance deblurring performance.
Existing event-based image deblurring methods usually utilize voxel-based event representations.
We introduce point cloud-based event representation into the image deblurring task and propose a Multi-Temporal Granularity Network (MTGNet)
It combines the spatially dense but temporally coarse-grained voxel-based event representation and the temporally fine-grained but spatially sparse point cloud-based event representation.
arXiv Detail & Related papers (2024-12-16T15:20:54Z)
- Event-Based Tracking Any Point with Motion-Augmented Temporal Consistency [58.719310295870024]
This paper presents an event-based framework for tracking any point.
It tackles the challenges posed by spatial sparsity and motion sensitivity in events.
It achieves 150% faster processing with competitive model parameters.
arXiv Detail & Related papers (2024-12-02T09:13:29Z)
- ESVO2: Direct Visual-Inertial Odometry with Stereo Event Cameras [33.81592783496106]
Event-based visual odometry aims at solving the tracking and mapping subproblems (typically in parallel).
We build an event-based stereo visual-inertial odometry system on top of a direct pipeline.
The resulting system scales well with modern high-resolution event cameras.
arXiv Detail & Related papers (2024-10-12T05:35:27Z)
- Asynchronous Optimisation for Event-based Visual Odometry [53.59879499700895]
Event cameras open up new possibilities for robotic perception due to their low latency and high dynamic range.
We focus on event-based visual odometry (VO).
We propose an asynchronous structure-from-motion optimisation back-end.
arXiv Detail & Related papers (2022-03-02T11:28:47Z)
- MBA-VO: Motion Blur Aware Visual Odometry [99.56896875807635]
Motion blur is one of the major challenges remaining for visual odometry methods.
In low-light conditions where longer exposure times are necessary, motion blur can appear even for relatively slow camera motions.
We present a novel hybrid visual odometry pipeline with a direct approach that explicitly models and estimates the camera's local trajectory within the exposure time.
arXiv Detail & Related papers (2021-03-25T09:02:56Z)
- Event-based Motion Segmentation with Spatio-Temporal Graph Cuts [51.17064599766138]
We have developed a method to identify independently moving objects acquired with an event-based camera.
The method performs on par or better than the state of the art without having to predetermine the number of expected moving objects.
arXiv Detail & Related papers (2020-12-16T04:06:02Z)
- Learning Monocular Dense Depth from Events [53.078665310545745]
Event cameras report brightness changes as a stream of asynchronous events instead of intensity frames.
Recent learning-based approaches have been applied to event-based data, such as monocular depth prediction.
We propose a recurrent architecture to solve this task and show significant improvement over standard feed-forward methods.
arXiv Detail & Related papers (2020-10-16T12:36:23Z)
- Single Image Optical Flow Estimation with an Event Camera [38.92408855196647]
Event cameras are bio-inspired sensors that report intensity changes in microsecond resolution.
We propose a single image (potentially blurred) and events based optical flow estimation approach.
arXiv Detail & Related papers (2020-04-01T11:28:30Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences.