Exposure Trajectory Recovery from Motion Blur
- URL: http://arxiv.org/abs/2010.02484v2
- Date: Mon, 4 Oct 2021 09:10:55 GMT
- Title: Exposure Trajectory Recovery from Motion Blur
- Authors: Youjian Zhang, Chaoyue Wang, Stephen J. Maybank, Dacheng Tao
- Abstract summary: Motion blur in dynamic scenes is an important yet challenging research topic.
In this paper, we define exposure trajectories, which represent the motion information contained in a blurry image.
A novel motion offset estimation framework is proposed to model pixel-wise displacements of the latent sharp image.
- Score: 90.75092808213371
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Motion blur in dynamic scenes is an important yet challenging research topic.
Recently, deep learning methods have achieved impressive performance for
dynamic scene deblurring. However, the motion information contained in a blurry
image has yet to be fully explored and accurately formulated because: (i) the
ground truth of dynamic motion is difficult to obtain; (ii) the temporal
ordering is destroyed during the exposure; and (iii) the motion estimation from
a blurry image is highly ill-posed. By revisiting the principle of camera
exposure, motion blur can be described by the relative motions of sharp content
with respect to each exposed position. In this paper, we define exposure
trajectories, which represent the motion information contained in a blurry
image and explain the causes of motion blur. A novel motion offset estimation
framework is proposed to model pixel-wise displacements of the latent sharp
image at multiple timepoints. Under mild constraints, our method can recover
dense, (non-)linear exposure trajectories, which significantly reduce temporal
disorder and ill-posed problems. Finally, experiments demonstrate that the
recovered exposure trajectories not only capture accurate and interpretable
motion information from a blurry image, but also benefit motion-aware image
deblurring and warping-based video extraction tasks. Codes are available on
https://github.com/yjzhang96/Motion-ETR.
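The exposure model described above, in which blur is the average of the sharp content displaced along per-pixel trajectories sampled at multiple timepoints, can be sketched in a few lines of NumPy. This is an illustrative toy, not the paper's offset-estimation framework: the function name `blur_from_offsets`, the nearest-neighbour sampling, and the hand-built linear offset field are all assumptions made for the example.

```python
import numpy as np

def blur_from_offsets(sharp, offsets):
    """Synthesize motion blur by averaging the sharp image sampled along
    per-pixel exposure trajectories (nearest-neighbour sampling for brevity).

    sharp:   (H, W) grayscale image
    offsets: (T, H, W, 2) pixel-wise displacements (dy, dx) at T timepoints
    """
    T, H, W, _ = offsets.shape
    ys, xs = np.mgrid[0:H, 0:W]
    acc = np.zeros_like(sharp, dtype=np.float64)
    for t in range(T):
        # Displace each pixel's sampling position by its offset at time t,
        # clipping to the image bounds.
        sy = np.clip(ys + offsets[t, ..., 0], 0, H - 1).round().astype(int)
        sx = np.clip(xs + offsets[t, ..., 1], 0, W - 1).round().astype(int)
        acc += sharp[sy, sx]
    return acc / T  # average over the exposure

# Example: a linear horizontal trajectory smears a one-pixel vertical edge.
sharp = np.zeros((8, 8))
sharp[:, 4] = 1.0
offsets = np.zeros((5, 8, 8, 2))
offsets[..., 1] = np.arange(5)[:, None, None]  # shift 0..4 px along x
blurred = blur_from_offsets(sharp, offsets)
```

Recovering the offset field from `blurred` alone is the ill-posed inverse problem the paper addresses; the forward model here only illustrates why dense, ordered trajectories suffice to explain the blur.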
Related papers
- Motion Blur Decomposition with Cross-shutter Guidance [33.72961622720793]
Motion blur is an artifact that arises under insufficient illumination, where the exposure time must be prolonged to collect enough photons for a sufficiently bright image.
Recent research has aimed at decomposing a blurry image into multiple sharp images with spatial and temporal coherence.
We propose to utilize the ordered scanline-wise delay in a rolling shutter image to make the motion decomposition of a single blurry image more robust.
arXiv Detail & Related papers (2024-04-01T13:55:40Z)
- SMURF: Continuous Dynamics for Motion-Deblurring Radiance Fields [14.681688453270523]
We propose sequential motion understanding radiance fields (SMURF), a novel approach that employs neural ordinary differential equation (Neural-ODE) to model continuous camera motion.
Our model, rigorously evaluated against benchmark datasets, demonstrates state-of-the-art performance both quantitatively and qualitatively.
arXiv Detail & Related papers (2024-03-12T11:32:57Z)
- Recovering Continuous Scene Dynamics from A Single Blurry Image with Events [58.7185835546638]
An Implicit Video Function (IVF) is learned to represent a single motion blurred image with concurrent events.
A dual attention transformer is proposed to efficiently leverage merits from both modalities.
The proposed network is trained only with supervision from ground-truth images at a limited set of reference timestamps.
arXiv Detail & Related papers (2023-04-05T18:44:17Z)
- Learning Spatially Varying Pixel Exposures for Motion Deblurring [49.07867902677453]
We present a novel approach that leverages spatially varying pixel exposures for motion deblurring.
Our work illustrates the promising role that focal-plane sensor--processors can play in the future of computational imaging.
arXiv Detail & Related papers (2022-04-14T23:41:49Z)
- A Method For Adding Motion-Blur on Arbitrary Objects By using Auto-Segmentation and Color Compensation Techniques [6.982738885923204]
In this paper, a unified framework for adding motion blur on a per-object basis is proposed.
In the method, multiple frames are captured without motion blur and accumulated to synthesize motion blur on the target objects.
arXiv Detail & Related papers (2021-09-22T05:52:27Z)
- MBA-VO: Motion Blur Aware Visual Odometry [99.56896875807635]
Motion blur is one of the major challenges remaining for visual odometry methods.
In low-light conditions where longer exposure times are necessary, motion blur can appear even for relatively slow camera motions.
We present a novel hybrid visual odometry pipeline with a direct approach that explicitly models and estimates the camera's local trajectory within the exposure time.
arXiv Detail & Related papers (2021-03-25T09:02:56Z)
- Motion-blurred Video Interpolation and Extrapolation [72.3254384191509]
We present a novel framework for deblurring, interpolating and extrapolating sharp frames from a motion-blurred video in an end-to-end manner.
To ensure temporal coherence across predicted frames and address potential temporal ambiguity, we propose a simple, yet effective flow-based rule.
arXiv Detail & Related papers (2021-03-04T12:18:25Z)
- Event-based Motion Segmentation with Spatio-Temporal Graph Cuts [51.17064599766138]
We have developed a method to identify independently moving objects acquired with an event-based camera.
The method performs on par or better than the state of the art without having to predetermine the number of expected moving objects.
arXiv Detail & Related papers (2020-12-16T04:06:02Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.