Dense Optical Flow from Event Cameras
- URL: http://arxiv.org/abs/2108.10552v1
- Date: Tue, 24 Aug 2021 07:39:08 GMT
- Title: Dense Optical Flow from Event Cameras
- Authors: Mathias Gehrig, Mario Millhäusler, Daniel Gehrig, and Davide Scaramuzza
- Abstract summary: We propose to incorporate feature correlation and sequential processing into dense optical flow estimation from event cameras.
Our proposed approach computes dense optical flow and reduces the end-point error by 23% on MVSEC.
- Score: 55.79329250951028
- License: http://creativecommons.org/licenses/by-sa/4.0/
- Abstract: We propose to incorporate feature correlation and sequential processing into
dense optical flow estimation from event cameras. Modern frame-based optical
flow methods heavily rely on matching costs computed from feature correlation.
In contrast, there exists no optical flow method for event cameras that
explicitly computes matching costs. Instead, learning-based approaches using
events usually resort to the U-Net architecture to estimate optical flow
sparsely. Our key finding is that the introduction of correlation features
significantly improves results compared to previous methods that solely rely on
convolution layers. Compared to the state-of-the-art, our proposed approach
computes dense optical flow and reduces the end-point error by 23% on MVSEC.
Furthermore, we show that all existing optical flow methods developed so far
for event cameras have been evaluated on datasets with very small displacement
fields with a maximum flow magnitude of 10 pixels. Based on this observation,
we introduce a new real-world dataset that exhibits displacement fields with
magnitudes up to 210 pixels and 3 times higher camera resolution. Our proposed
approach reduces the end-point error on this dataset by 66%.
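To make "matching costs computed from feature correlation" concrete, here is a minimal sketch of an all-pairs correlation volume in PyTorch. The shapes, names, and the use of event-derived feature maps are assumptions for illustration, not the authors' implementation.

```python
import torch

def correlation_volume(feat_prev: torch.Tensor, feat_curr: torch.Tensor) -> torch.Tensor:
    """All-pairs matching costs between two feature maps of shape (B, C, H, W),
    e.g. encoded from consecutive event representations. Returns (B, H, W, H, W)."""
    b, c, h, w = feat_prev.shape
    f1 = feat_prev.view(b, c, h * w)
    f2 = feat_curr.view(b, c, h * w)
    corr = torch.einsum("bci,bcj->bij", f1, f2) / c ** 0.5  # scaled dot-product similarities
    return corr.view(b, h, w, h, w)

# Toy usage with random features standing in for an event-feature encoder.
vol = correlation_volume(torch.randn(1, 128, 32, 32), torch.randn(1, 128, 32, 32))
print(vol.shape)  # torch.Size([1, 32, 32, 32, 32])
```

In RAFT-style architectures, such a volume is then sampled around the current flow estimate by a recurrent update operator, which is the kind of sequential processing the abstract refers to.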
Related papers
- Robust Optical Flow Computation: A Higher-Order Differential Approach [0.0]
This research proposes an innovative algorithm for optical flow computation, utilizing the higher precision of second-order Taylor series approximation.
The algorithm's capabilities are demonstrated by its performance on optical flow benchmarks such as KITTI and Middlebury.
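For orientation, a minimal sketch of the expansion such an approach builds on (notation is ours, not taken from the paper): writing brightness constancy as $I(x+u, y+v, t+1) = I(x, y, t)$ and expanding to second order gives

$$I_x u + I_y v + I_t + \tfrac{1}{2}\left(I_{xx}u^2 + 2I_{xy}uv + I_{yy}v^2 + 2I_{xt}u + 2I_{yt}v + I_{tt}\right) \approx 0,$$

a data term that is quadratic rather than linear in the flow $(u, v)$.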
arXiv Detail & Related papers (2024-10-12T15:20:11Z)
- Motion-prior Contrast Maximization for Dense Continuous-Time Motion Estimation [34.529280562470746]
We introduce a novel self-supervised loss combining the Contrast Maximization framework with a non-linear motion prior in the form of pixel-level trajectories.
Its effectiveness is demonstrated in two scenarios; in dense continuous-time motion estimation, for instance, our method improves the zero-shot performance of a synthetically trained model by 29%.
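As a rough illustration of the contrast maximization idea, here is a hedged NumPy sketch using a simple constant-velocity warp; the paper uses pixel-level trajectories and a learned setting, so the names and shapes here are assumptions.

```python
import numpy as np

def contrast_loss(xy, t, flow, height, width):
    """Negative variance of the image of warped events.
    xy: (N, 2) event pixel coordinates, t: (N,) timestamps in seconds,
    flow: (N, 2) per-event velocity in pixels/second (constant-velocity model)."""
    warped = xy - flow * t[:, None]                             # transport events to t = 0
    x = np.clip(np.round(warped[:, 0]).astype(int), 0, width - 1)
    y = np.clip(np.round(warped[:, 1]).astype(int), 0, height - 1)
    iwe = np.zeros((height, width))
    np.add.at(iwe, (y, x), 1.0)                                 # image of warped events
    return -iwe.var()                                           # sharper image -> lower loss
```

Minimizing such a loss with respect to the motion parameters aligns events along their trajectories; the paper's contribution is combining a loss of this kind with a non-linear motion prior rather than this simple model.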
arXiv Detail & Related papers (2024-07-15T15:18:28Z)
- Rethink Predicting the Optical Flow with the Kinetics Perspective [1.7901503554839604]
Optical flow estimation is one of the fundamental tasks in low-level computer vision.
From the appearance perspective, optical flow can be viewed as the correlation between pixels in consecutive frames.
Motivated by this, we propose a method that combines appearance and kinetics information.
arXiv Detail & Related papers (2024-05-21T05:47:42Z)
- Deep Dynamic Scene Deblurring from Optical Flow [53.625999196063574]
Deblurring can provide visually more pleasant pictures and make photography more convenient.
It is difficult to model the non-uniform blur mathematically.
We develop a convolutional neural network (CNN) to restore the sharp images from the deblurred features.
arXiv Detail & Related papers (2023-01-18T06:37:21Z)
- PCA Event-Based Optical Flow for Visual Odometry [0.0]
We present a Principal Component Analysis approach to the problem of event-based optical flow estimation.
We show that the best variant of our proposed method, dedicated to the real-time context of visual odometry, is about two times faster than state-of-the-art implementations.
arXiv Detail & Related papers (2021-05-08T18:30:44Z)
- Learning Optical Flow from a Few Matches [67.83633948984954]
We show that the dense correlation volume representation is redundant and accurate flow estimation can be achieved with only a fraction of elements in it.
Experiments show that our method can reduce computational cost and memory use significantly, while maintaining high accuracy.
arXiv Detail & Related papers (2021-04-05T21:44:00Z)
- Learning Monocular Dense Depth from Events [53.078665310545745]
Event cameras report brightness changes as a stream of asynchronous events instead of intensity frames.
Recent learning-based approaches have been applied to event-based data, such as monocular depth prediction.
We propose a recurrent architecture to solve this task and show significant improvement over standard feed-forward methods.
arXiv Detail & Related papers (2020-10-16T12:36:23Z)
- STaRFlow: A SpatioTemporal Recurrent Cell for Lightweight Multi-Frame Optical Flow Estimation [64.99259320624148]
We present a new lightweight CNN-based algorithm for multi-frame optical flow estimation.
The resulting STaRFlow algorithm gives state-of-the-art performance on MPI Sintel and KITTI 2015.
arXiv Detail & Related papers (2020-07-10T17:01:34Z)
- Single Image Optical Flow Estimation with an Event Camera [38.92408855196647]
Event cameras are bio-inspired sensors that report intensity changes with microsecond resolution.
We propose an optical flow estimation approach based on a single (potentially blurred) image and events.
arXiv Detail & Related papers (2020-04-01T11:28:30Z)
- Joint Unsupervised Learning of Optical Flow and Egomotion with Bi-Level Optimization [59.9673626329892]
We exploit the global relationship between optical flow and camera motion using epipolar geometry.
We use implicit differentiation to enable back-propagation through the lower-level geometric optimization layer independent of its implementation.
arXiv Detail & Related papers (2020-02-26T22:28:00Z)
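To illustrate the flow-egomotion relationship used in the last entry, here is a small hedged NumPy sketch of the epipolar residual; the function name and the static-scene, normalized-coordinate assumptions are ours, not the paper's.

```python
import numpy as np

def epipolar_residuals(pts, flow, E):
    """Per-point epipolar residual |x2^T E x1| between flow-induced matches and
    the camera egomotion encoded by an essential matrix E (static scene assumed).
    pts: (N, 2) normalized image coordinates in frame 1; flow: (N, 2) optical flow."""
    ones = np.ones((len(pts), 1))
    x1 = np.hstack([pts, ones])           # homogeneous points in frame 1
    x2 = np.hstack([pts + flow, ones])    # their flow-induced matches in frame 2
    return np.abs(np.einsum("ni,ij,nj->n", x2, E, x1))  # ~0 for egomotion-consistent flow
```

In a bi-level formulation along these lines, residuals of this kind would enter a lower-level geometric fit whose solution is then differentiated implicitly; this snippet only illustrates the constraint itself.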
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information it provides and is not responsible for any consequences of its use.