Single Image Optical Flow Estimation with an Event Camera
- URL: http://arxiv.org/abs/2004.00347v1
- Date: Wed, 1 Apr 2020 11:28:30 GMT
- Title: Single Image Optical Flow Estimation with an Event Camera
- Authors: Liyuan Pan, Miaomiao Liu and Richard Hartley
- Abstract summary: Event cameras are bio-inspired sensors that report intensity changes with microsecond resolution.
We propose an optical flow estimation approach based on a single (potentially blurred) image and events.
- Score: 38.92408855196647
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Event cameras are bio-inspired sensors that asynchronously report intensity
changes with microsecond resolution. The DAVIS sensor can capture the high dynamics of
a scene, simultaneously outputting high-temporal-resolution events and low-frame-rate
intensity images. In this paper, we propose an optical flow estimation approach based
on a single (potentially blurred) image and events. First, we demonstrate how events
can be used to improve flow estimates. To this end, we effectively encode the relation
between flow and events by presenting an event-based photometric consistency
formulation. Then, we consider the special case of image blur caused by high dynamics
in the visual environment and show that including the blur formation in our model
further constrains flow estimation. This is in sharp contrast to existing works that
ignore blurred images, whereas our formulation naturally handles either blurred or
sharp images to achieve accurate flow estimation. Finally, we reduce flow estimation,
as well as image deblurring, to an alternating optimization of an objective function
using the primal-dual algorithm. Experimental results on both synthetic and real data
(with blurred and non-blurred images) show the superiority of our model over
state-of-the-art approaches.
Related papers
- Motion-prior Contrast Maximization for Dense Continuous-Time Motion Estimation [34.529280562470746]
We introduce a novel self-supervised loss combining the Contrast Maximization framework with a non-linear motion prior in the form of pixel-level trajectories.
Their effectiveness is demonstrated in two scenarios: In dense continuous-time motion estimation, our method improves the zero-shot performance of a synthetically trained model by 29%.
arXiv Detail & Related papers (2024-07-15T15:18:28Z) - Recovering Continuous Scene Dynamics from A Single Blurry Image with Events [58.7185835546638]
An Implicit Video Function (IVF) is learned to represent a single motion blurred image with concurrent events.
A dual attention transformer is proposed to efficiently leverage merits from both modalities.
The proposed network is trained only with the supervision of ground-truth images of limited referenced timestamps.
arXiv Detail & Related papers (2023-04-05T18:44:17Z) - Learning Dense and Continuous Optical Flow from an Event Camera [28.77846425802558]
Event cameras such as DAVIS can simultaneously output high temporal resolution events and low frame-rate intensity images.
Most of the existing optical flow estimation methods are based on two consecutive image frames and can only estimate discrete flow at a fixed time interval.
We propose a novel deep learning-based dense and continuous optical flow estimation framework from a single image with event streams.
arXiv Detail & Related papers (2022-11-16T17:53:18Z) - Event-based Image Deblurring with Dynamic Motion Awareness [10.81953574179206]
We introduce the first dataset containing pairs of real RGB blur images and related events during the exposure time.
Our results show better robustness overall when using events, with improvements in PSNR by up to 1.57 dB on synthetic data and 1.08 dB on real event data.
arXiv Detail & Related papers (2022-08-24T09:39:55Z) - Globally-Optimal Event Camera Motion Estimation [30.79931004393174]
Event cameras are bio-inspired sensors that perform well in HDR conditions and have high temporal resolution.
Event cameras measure asynchronous pixel-level changes and return them in a highly discretised format.
arXiv Detail & Related papers (2022-03-08T08:24:22Z) - Dense Optical Flow from Event Cameras [55.79329250951028]
We propose to incorporate feature correlation and sequential processing into dense optical flow estimation from event cameras.
Our proposed approach computes dense optical flow and reduces the end-point error by 23% on MVSEC.
arXiv Detail & Related papers (2021-08-24T07:39:08Z) - Spatiotemporal Registration for Event-based Visual Odometry [40.02502611087858]
A useful application of event sensing is visual odometry, especially in settings that require high temporal resolution.
We propose spatiotemporal registration as a compelling technique for event-based rotational motion estimation.
We also contribute a new event dataset for visual odometry, where motion sequences with large velocity variations were acquired using a high-precision robot arm.
arXiv Detail & Related papers (2021-03-10T09:23:24Z) - Optical Flow Estimation from a Single Motion-blurred Image [66.2061278123057]
Motion blur in an image may have practical interest for fundamental computer vision problems.
We propose a novel framework to estimate optical flow from a single motion-blurred image in an end-to-end manner.
arXiv Detail & Related papers (2021-03-04T12:45:18Z) - Learning Monocular Dense Depth from Events [53.078665310545745]
Event cameras output brightness changes in the form of a stream of asynchronous events instead of intensity frames.
Recent learning-based approaches have been applied to event-based data, such as monocular depth prediction.
We propose a recurrent architecture to solve this task and show significant improvement over standard feed-forward methods.
arXiv Detail & Related papers (2020-10-16T12:36:23Z) - Self-Supervised Linear Motion Deblurring [112.75317069916579]
Deep convolutional neural networks are state-of-the-art for image deblurring.
We present a differentiable reblur model for self-supervised motion deblurring.
Our experiments demonstrate that self-supervised single image deblurring is feasible.
arXiv Detail & Related papers (2020-02-10T20:15:21Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the information it provides and is not responsible for any consequences of its use.