Event-based Continuous Color Video Decompression from Single Frames
- URL: http://arxiv.org/abs/2312.00113v1
- Date: Thu, 30 Nov 2023 18:59:23 GMT
- Title: Event-based Continuous Color Video Decompression from Single Frames
- Authors: Ziyun Wang, Friedhelm Hamann, Kenneth Chaney, Wen Jiang, Guillermo
Gallego, Kostas Daniilidis
- Abstract summary: We present ContinuityCam, a novel approach to generate a continuous video from a single static RGB image, using an event camera.
Our approach combines continuous long-range motion modeling with a feature-plane-based neural integration model, enabling frame prediction at arbitrary times within the events.
- Score: 38.59798259847563
- License: http://creativecommons.org/licenses/by-sa/4.0/
- Abstract: We present ContinuityCam, a novel approach to generate a continuous video
from a single static RGB image, using an event camera. Conventional cameras
struggle with high-speed motion capture due to bandwidth and dynamic range
limitations. Event cameras are ideal sensors to solve this problem because they
encode compressed change information at high temporal resolution. In this work,
we propose a novel task called event-based continuous color video
decompression, pairing single static color frames and events to reconstruct
temporally continuous videos. Our approach combines continuous long-range
motion modeling with a feature-plane-based neural integration model, enabling
frame prediction at arbitrary times within the event stream. Our method relies
on no frames beyond the initial image, thus increasing robustness to sudden
light changes, minimizing prediction latency, and decreasing bandwidth
requirements. We introduce a novel single-objective beamsplitter setup that
acquires aligned images and events, and a novel, challenging Event Extreme
Decompression Dataset (E2D2) that tests the method under various lighting and
motion profiles. We thoroughly evaluate our method by benchmarking
reconstruction as well as various downstream tasks. Our approach significantly
outperforms event- and image-based baselines on the proposed task.
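For intuition, the task interface can be sketched with a naive per-pixel event-integration baseline. This is a minimal sketch, not ContinuityCam's method; the function name and the contrast threshold C are assumptions, and the standard event-generation model (log-intensity steps of +/-C per event) is assumed to hold:

```python
import numpy as np

def predict_frame(init_rgb, events, t, C=0.2):
    """Naive baseline: accumulate event polarities into log-luminance
    on top of a single initial frame, then query at an arbitrary time t.

    init_rgb : (H, W, 3) float array in [0, 1]
    events   : iterable of (x, y, timestamp, polarity), polarity in {-1, +1}
    t        : query time; only events with timestamp <= t are integrated
    C        : assumed per-event contrast threshold (sensor dependent)
    """
    eps = 1e-6
    # Work in log-luminance, matching the event generation model:
    # log I(x, t) ~= log I(x, t0) + C * (sum of polarities at x in (t0, t]).
    lum = init_rgb.mean(axis=2)
    log_lum = np.log(lum + eps)
    for x, y, ts, p in events:
        if ts <= t:
            log_lum[y, x] += C * p
    new_lum = np.exp(log_lum) - eps
    # Reapply the original chroma so the output stays a color image.
    scale = (new_lum + eps) / (lum + eps)
    return np.clip(init_rgb * scale[..., None], 0.0, 1.0)
```

Pure integration like this drifts with sensor noise and ignores how color is sampled, which is the kind of gap a learned motion model and neural integration model are meant to close.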
Related papers
- EF-3DGS: Event-Aided Free-Trajectory 3D Gaussian Splatting [76.02450110026747]
Event cameras, inspired by biological vision, record pixel-wise intensity changes asynchronously with high temporal resolution.
We propose Event-Aided Free-Trajectory 3DGS, which seamlessly integrates the advantages of event cameras into 3DGS.
We evaluate our method on the public Tanks and Temples benchmark and a newly collected real-world dataset, RealEv-DAVIS.
arXiv Detail & Related papers (2024-10-20T13:44:24Z) - CMTA: Cross-Modal Temporal Alignment for Event-guided Video Deblurring [44.30048301161034]
Video deblurring aims to enhance the quality of restored results in motion-blurred videos by gathering information from adjacent video frames.
We propose two modules: 1) intra-frame feature enhancement, which operates within the exposure time of a single blurred frame, and 2) inter-frame temporal feature alignment, which gathers valuable long-range temporal information for the target frame.
We demonstrate that our proposed methods outperform state-of-the-art frame-based and event-based motion deblurring methods through extensive experiments conducted on both synthetic and real-world deblurring datasets.
arXiv Detail & Related papers (2024-08-27T10:09:17Z) - EventAid: Benchmarking Event-aided Image/Video Enhancement Algorithms
with Real-captured Hybrid Dataset [55.12137324648253]
Event cameras are an emerging imaging technology that offers advantages over conventional frame-based imaging sensors in dynamic range and sensing speed.
This paper focuses on five event-aided image and video enhancement tasks.
arXiv Detail & Related papers (2023-12-13T15:42:04Z) - Revisiting Event-based Video Frame Interpolation [49.27404719898305]
Dynamic vision sensors, or event cameras, provide rich complementary information for video frame interpolation.
However, estimating optical flow from events is arguably more difficult than from RGB information.
We propose a divide-and-conquer strategy in which event-based intermediate frame synthesis happens incrementally in multiple simplified stages.
arXiv Detail & Related papers (2023-07-24T06:51:07Z) - An Asynchronous Intensity Representation for Framed and Event Video
Sources [2.9097303137825046]
We introduce an intensity representation for both framed and non-framed data sources.
We show that our representation can increase intensity precision and greatly reduce the number of samples per pixel.
We argue that our method provides the computational efficiency and temporal granularity necessary to build real-time intensity-based applications for event cameras.
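As a rough sketch of the idea (class and parameter names are assumptions; the paper's actual representation differs), a per-pixel asynchronous state can absorb both absolute frame samples and relative event samples, so intensity can be read out at any time without storing dense frames:

```python
import numpy as np

class AsyncIntensity:
    """Toy per-pixel asynchronous intensity store fed by frames and events."""

    def __init__(self, height, width, C=0.2):
        self.C = C  # assumed contrast threshold
        self.log_i = np.zeros((height, width))  # latest log-intensity per pixel
        self.t = np.zeros((height, width))      # time of each pixel's last sample

    def on_frame(self, frame, t):
        # A frame is an absolute measurement: it overwrites every pixel.
        # frame : (H, W) linear-intensity array.
        self.log_i = np.log(frame + 1e-6)
        self.t[:] = t

    def on_event(self, x, y, t, polarity):
        # An event is a relative measurement: one pixel steps by +/-C.
        self.log_i[y, x] += self.C * polarity
        self.t[y, x] = t

    def intensity(self):
        # Zero-order-hold readout of the most recent per-pixel samples.
        return np.exp(self.log_i) - 1e-6
```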
arXiv Detail & Related papers (2023-01-20T19:46:23Z) - EventNeRF: Neural Radiance Fields from a Single Colour Event Camera [81.19234142730326]
This paper proposes the first approach for 3D-consistent, dense novel view synthesis using just a single colour event stream as input.
At its core is a neural radiance field trained entirely in a self-supervised manner from events while preserving the original resolution of the colour event channels.
We evaluate our method qualitatively and numerically on several challenging synthetic and real scenes and show that it produces significantly denser and more visually appealing renderings.
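The self-supervision can be sketched via the standard event-generation model (a simplified loss with assumed names, not necessarily the paper's exact formulation): render the field at two times along the camera trajectory and match the rendered log-radiance change to the events accumulated between them:

```python
import torch

def event_render_loss(render_fn, pose_t0, pose_t1, event_count, C=0.25):
    """Simplified event supervision for a radiance field.

    render_fn   : callable pose -> (H, W, 3) rendered image tensor
    event_count : (H, W, 3) signed per-channel event counts in (t0, t1]
    C           : assumed contrast threshold of the colour event camera
    """
    eps = 1e-6
    log_i0 = torch.log(render_fn(pose_t0) + eps)
    log_i1 = torch.log(render_fn(pose_t1) + eps)
    # Event model predicts log I(t1) - log I(t0) = C * counts; penalize deviation.
    return torch.mean((log_i1 - log_i0 - C * event_count) ** 2)
```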
arXiv Detail & Related papers (2022-06-23T17:59:53Z) - TimeLens: Event-based Video Frame Interpolation [54.28139783383213]
We introduce Time Lens, a novel method that leverages the advantages of both synthesis-based and flow-based approaches.
We show an up to 5.21 dB improvement in terms of PSNR over state-of-the-art frame-based and event-based methods.
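The fusion of the two branches can be illustrated as a learned per-pixel blend of a flow-warped candidate and a directly synthesized candidate. This is a simplification with assumed tensor names; the actual method fuses multiple candidates with a learned attention module:

```python
import torch

def fuse_candidates(warped, synthesized, weight_logits):
    """Blend a flow-based candidate with a synthesis-based candidate.

    warped        : (B, 3, H, W) frame warped to time t by event-based flow
    synthesized   : (B, 3, H, W) frame regressed directly from frames + events
    weight_logits : (B, 1, H, W) per-pixel scores from a small fusion network
    """
    w = torch.sigmoid(weight_logits)  # w -> 1 trusts the warped branch
    return w * warped + (1.0 - w) * synthesized
```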
arXiv Detail & Related papers (2021-06-14T10:33:47Z) - An Asynchronous Kalman Filter for Hybrid Event Cameras [13.600773150848543]
Event cameras are ideally suited to capture HDR visual information without blur.
Conventional image sensors measure the absolute intensity of slowly changing scenes effectively but do poorly on high dynamic range or quickly changing scenes.
We present an event-based video reconstruction pipeline for High Dynamic Range scenarios.
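Per pixel, the idea can be sketched as a scalar Kalman filter on log-intensity (noise parameters below are assumptions; the paper's filter is asynchronous and more elaborate): events drive the prediction step between frames, and each frame pixel acts as a noisy absolute measurement:

```python
class PixelKalman:
    """Scalar Kalman filter for one pixel's log-intensity."""

    def __init__(self, x0=0.0, P0=1.0, q=1e-4, r=1e-2, C=0.2):
        self.x, self.P = x0, P0  # state estimate and its variance
        self.q = q               # process noise added per event
        self.r = r               # frame measurement noise variance
        self.C = C               # assumed contrast threshold

    def predict_with_event(self, polarity):
        # Events shift the state by +/-C and grow the uncertainty.
        self.x += self.C * polarity
        self.P += self.q

    def update_with_frame(self, log_pixel):
        # Standard Kalman update against an absolute frame measurement.
        K = self.P / (self.P + self.r)
        self.x += K * (log_pixel - self.x)
        self.P *= (1.0 - K)
```

In this toy version, running predict_with_event on every incoming event and update_with_frame on every exposure keeps the blur-free temporal detail of the events while anchoring the absolute level to the frames.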
arXiv Detail & Related papers (2020-12-10T11:24:07Z)