Event-based Image Deblurring with Dynamic Motion Awareness
- URL: http://arxiv.org/abs/2208.11398v1
- Date: Wed, 24 Aug 2022 09:39:55 GMT
- Title: Event-based Image Deblurring with Dynamic Motion Awareness
- Authors: Patricia Vitoria, Stamatios Georgoulis, Stepan Tulyakov, Alfredo
Bochicchio, Julius Erbach, Yuanyou Li
- Abstract summary: We introduce the first dataset containing pairs of real RGB blur images and related events during the exposure time.
Our results show better robustness overall when using events, with improvements in PSNR by up to 1.57 dB on synthetic data and 1.08 dB on real event data.
- Score: 10.81953574179206
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Non-uniform image deblurring is a challenging task due to the lack of
temporal and textural information in the blurry image itself. Complementary
information from auxiliary sensors such as event sensors is being explored to
address these limitations. The latter can asynchronously record changes in
logarithmic intensity, called events, with high temporal resolution and high
dynamic range. Current event-based deblurring methods combine the blurry image
with events to jointly estimate per-pixel motion and the deblur operator. In
this paper, we argue that a divide-and-conquer approach is more suitable for
this task. To this end, we propose to use modulated deformable convolutions,
whose kernel offsets and modulation masks are dynamically estimated from events
to encode the motion in the scene, while the deblur operator is learned from
the combination of blurry image and corresponding events. Furthermore, we
employ a coarse-to-fine multi-scale reconstruction approach to cope with the
inherent sparsity of events in low contrast regions. Importantly, we introduce
the first dataset containing pairs of real RGB blur images and related events
during the exposure time. Our results show better overall robustness when using
events, with improvements in PSNR by up to 1.57 dB on synthetic data and 1.08 dB
on real event data.
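The paper's core mechanism is a modulated deformable convolution whose kernel offsets and modulation masks are predicted from events. The following is a minimal, single-channel sketch in plain Python of what such an operator computes; the function names, the nested-list layout, and the bilinear zero-padded sampling are illustrative choices of this sketch, not the authors' implementation (in the paper the offsets and masks would be regressed by a network from the event stream).

```python
import math

def bilinear(img, y, x):
    """Bilinearly sample img at real-valued (y, x); zero outside the image."""
    H, W = len(img), len(img[0])
    y0, x0 = math.floor(y), math.floor(x)
    val = 0.0
    for yy, wy in ((y0, 1.0 - (y - y0)), (y0 + 1, y - y0)):
        for xx, wx in ((x0, 1.0 - (x - x0)), (x0 + 1, x - x0)):
            if 0 <= yy < H and 0 <= xx < W:
                val += wy * wx * img[yy][xx]
    return val

def modulated_deform_conv(x, weight, offsets, mask):
    """Single-channel modulated deformable convolution (correlation form).

    x:       H x W image (nested lists)
    weight:  k x k kernel
    offsets: H x W x k x k list of (dy, dx) shifts, one per kernel tap
    mask:    H x W x k x k per-tap modulation scalars in [0, 1]
    """
    H, W = len(x), len(x[0])
    k = len(weight)
    r = k // 2
    out = [[0.0] * W for _ in range(H)]
    for y in range(H):
        for c in range(W):
            acc = 0.0
            for i in range(k):
                for j in range(k):
                    # Each kernel tap is displaced by its own learned offset
                    # and scaled by its modulation mask before accumulation.
                    dy, dx = offsets[y][c][i][j]
                    acc += (weight[i][j] * mask[y][c][i][j]
                            * bilinear(x, y + i - r + dy, c + j - r + dx))
            out[y][c] = acc
    return out
```

With zero offsets and an all-ones mask this reduces to an ordinary convolution; non-zero, per-pixel offsets let each output location sample along its own motion trajectory, which is how event-derived motion can be encoded separately from the deblur operator.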
Related papers
- Implicit Event-RGBD Neural SLAM [54.74363487009845]
Implicit neural SLAM has achieved remarkable progress recently.
Existing methods face significant challenges in non-ideal scenarios.
We propose EN-SLAM, the first event-RGBD implicit neural SLAM framework.
arXiv Detail & Related papers (2023-11-18T08:48:58Z)
- Learning Parallax for Stereo Event-based Motion Deblurring [8.201943408103995]
Existing approaches rely on the perfect pixel-wise alignment between intensity images and events, which is not always fulfilled in the real world.
We propose a novel coarse-to-fine framework, named NETwork of Event-based motion Deblurring with STereo event and intensity cameras (St-EDNet)
We build a new dataset with STereo Event and Intensity Cameras (StEIC), containing real-world events, intensity images, and dense disparity maps.
arXiv Detail & Related papers (2023-09-18T06:51:41Z)
- Deformable Neural Radiance Fields using RGB and Event Cameras [65.40527279809474]
We develop a novel method to model the deformable neural radiance fields using RGB and event cameras.
The proposed method uses the asynchronous stream of events and sparse RGB frames.
Experiments conducted on both realistically rendered graphics and real-world datasets demonstrate a significant benefit of the proposed method.
arXiv Detail & Related papers (2023-09-15T14:19:36Z)
- Revisiting Event-based Video Frame Interpolation [49.27404719898305]
Dynamic vision sensors or event cameras provide rich complementary information for video frame interpolation.
Estimating optical flow from events is arguably more difficult than from RGB information.
We propose a divide-and-conquer strategy in which event-based intermediate frame synthesis happens incrementally in multiple simplified stages.
arXiv Detail & Related papers (2023-07-24T06:51:07Z)
- Recovering Continuous Scene Dynamics from A Single Blurry Image with Events [58.7185835546638]
An Implicit Video Function (IVF) is learned to represent a single motion blurred image with concurrent events.
A dual attention transformer is proposed to efficiently leverage merits from both modalities.
The proposed network is trained only with supervision from ground-truth images at a limited number of reference timestamps.
arXiv Detail & Related papers (2023-04-05T18:44:17Z)
- Time Lens++: Event-based Frame Interpolation with Parametric Non-linear Flow and Multi-scale Fusion [47.57998625129672]
We introduce multi-scale feature-level fusion and one-shot computation of non-linear inter-frame motion from events and images.
We show that our method improves the reconstruction quality by up to 0.2 dB in terms of PSNR and up to 15% in LPIPS score.
arXiv Detail & Related papers (2022-03-31T17:14:58Z)
- MEFNet: Multi-scale Event Fusion Network for Motion Deblurring [62.60878284671317]
Traditional frame-based cameras inevitably suffer from motion blur due to long exposure times.
As a kind of bio-inspired camera, the event camera records intensity changes asynchronously with high temporal resolution.
In this paper, we rethink the event-based image deblurring problem and unfold it into an end-to-end two-stage image restoration network.
arXiv Detail & Related papers (2021-11-30T23:18:35Z)
- Bridging the Gap between Events and Frames through Unsupervised Domain Adaptation [57.22705137545853]
We propose a task transfer method that allows models to be trained directly with labeled images and unlabeled event data.
We leverage the generative event model to split event features into content and motion features.
Our approach unlocks the vast amount of existing image datasets for the training of event-based neural networks.
arXiv Detail & Related papers (2021-09-06T17:31:37Z)
- An Asynchronous Kalman Filter for Hybrid Event Cameras [13.600773150848543]
Event cameras are ideally suited to capture HDR visual information without blur.
Conventional image sensors measure the absolute intensity of slowly changing scenes effectively, but do poorly on high-dynamic-range or quickly changing scenes.
We present an event-based video reconstruction pipeline for High Dynamic Range scenarios.
arXiv Detail & Related papers (2020-12-10T11:24:07Z)
- Single Image Optical Flow Estimation with an Event Camera [38.92408855196647]
Event cameras are bio-inspired sensors that report intensity changes in microsecond resolution.
We propose a single image (potentially blurred) and events based optical flow estimation approach.
arXiv Detail & Related papers (2020-04-01T11:28:30Z)
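Several results above are quoted in PSNR (e.g. the up-to-1.57 dB gain of the main paper). For reference, a minimal plain-Python sketch of the standard definition, assuming images given as nested lists of values in a 0-255 range:

```python
import math

def psnr(ref, test, peak=255.0):
    """Peak signal-to-noise ratio in dB; higher means closer to the reference."""
    n = sum(len(row) for row in ref)
    mse = sum((a - b) ** 2
              for ra, rb in zip(ref, test)
              for a, b in zip(ra, rb)) / n
    if mse == 0.0:
        return float("inf")  # identical images
    return 10.0 * math.log10(peak ** 2 / mse)
```

Since PSNR is logarithmic in the mean squared error, a 1 dB improvement corresponds to roughly a 21% reduction in MSE (10^(-0.1) ≈ 0.794).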
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences arising from its use.