EventAid: Benchmarking Event-aided Image/Video Enhancement Algorithms
with Real-captured Hybrid Dataset
- URL: http://arxiv.org/abs/2312.08220v1
- Date: Wed, 13 Dec 2023 15:42:04 GMT
- Title: EventAid: Benchmarking Event-aided Image/Video Enhancement Algorithms
with Real-captured Hybrid Dataset
- Authors: Peiqi Duan, Boyu Li, Yixin Yang, Hanyue Lou, Minggui Teng, Yi Ma,
Boxin Shi
- Abstract summary: Event cameras are an emerging imaging technology that offers advantages over conventional frame-based imaging sensors in dynamic range and sensing speed.
This paper focuses on five event-aided image and video enhancement tasks.
- Score: 55.12137324648253
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Event cameras are an emerging imaging technology that offers advantages
over conventional frame-based imaging sensors in dynamic range and sensing speed.
Complementing the rich texture and color perception of traditional image
frames, the hybrid camera system of event and frame-based cameras enables
high-performance imaging. With the assistance of event cameras, high-quality
image/video enhancement methods make it possible to break the limits of
traditional frame-based cameras, especially exposure time, resolution, dynamic
range, and frame rate limits. This paper focuses on five event-aided image and
video enhancement tasks (i.e., event-based video reconstruction, event-aided
high frame rate video reconstruction, image deblurring, image super-resolution,
and high dynamic range image reconstruction), provides an analysis of the
effects of different event properties, a real-captured and ground truth labeled
benchmark dataset, a unified benchmarking of state-of-the-art methods, and an
evaluation for two mainstream event simulators. In detail, this paper collects
a real-captured evaluation dataset EventAid for five event-aided image/video
enhancement tasks using an "Event-RGB" multi-camera hybrid system, taking into
account scene diversity and spatiotemporal synchronization. We further perform
quantitative and visual comparisons for state-of-the-art algorithms, provide a
controlled experiment to analyze the performance limit of event-aided image
deblurring methods, and discuss open problems to inspire future research.
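As a minimal illustration of the first task listed above (event-based video reconstruction), the sketch below uses the standard event-camera generative model: each event signals that the log intensity at its pixel changed by a fixed contrast threshold, so naively integrating event polarities yields a relative intensity image. The function name, event format `(x, y, polarity)`, and threshold value are illustrative assumptions, not the method of any paper listed here.

```python
import numpy as np

def accumulate_events(events, height, width, contrast=0.2):
    """Naive event integration: each event (x, y, polarity) adds or
    subtracts one contrast step of log intensity at its pixel."""
    log_frame = np.zeros((height, width), dtype=np.float64)
    for x, y, p in events:
        log_frame[y, x] += contrast if p > 0 else -contrast
    # Exponentiate to return to (relative) linear intensity.
    return np.exp(log_frame)

# Two positive events at pixel (1, 1) and one negative event at (0, 0).
events = [(1, 1, +1), (1, 1, +1), (0, 0, -1)]
frame = accumulate_events(events, height=2, width=2, contrast=0.2)
```

Real reconstruction methods replace this direct integration with learned networks to suppress noise and threshold drift, but the log-intensity event model above is the common starting point.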
Related papers
- EF-3DGS: Event-Aided Free-Trajectory 3D Gaussian Splatting [76.02450110026747]
Event cameras, inspired by biological vision, record pixel-wise intensity changes asynchronously with high temporal resolution.
We propose Event-Aided Free-Trajectory 3DGS, which seamlessly integrates the advantages of event cameras into 3DGS.
We evaluate our method on the public Tanks and Temples benchmark and a newly collected real-world dataset, RealEv-DAVIS.
arXiv Detail & Related papers (2024-10-20T13:44:24Z) - E2HQV: High-Quality Video Generation from Event Camera via
Theory-Inspired Model-Aided Deep Learning [53.63364311738552]
Bio-inspired event cameras or dynamic vision sensors are capable of capturing per-pixel brightness changes (called event-streams) in high temporal resolution and high dynamic range.
It calls for events-to-video (E2V) solutions which take event-streams as input and generate high quality video frames for intuitive visualization.
We propose E2HQV, a novel E2V paradigm designed to produce high-quality video frames from events.
arXiv Detail & Related papers (2024-01-16T05:10:50Z) - Event-based Continuous Color Video Decompression from Single Frames [38.59798259847563]
We present ContinuityCam, a novel approach to generate a continuous video from a single static RGB image, using an event camera.
Our approach combines continuous long-range motion modeling with a feature-plane-based neural integration model, enabling frame prediction at arbitrary times within the events.
arXiv Detail & Related papers (2023-11-30T18:59:23Z) - Event-based Background-Oriented Schlieren [18.2247510082534]
Schlieren imaging is an optical technique to observe the flow of transparent media, such as air or water, without any particle seeding.
Event cameras offer potential advantages (high dynamic range, high temporal resolution, and data efficiency) to overcome such limitations due to their bio-inspired sensing principle.
This paper presents a novel technique for perceiving air convection using events and frames by providing the first theoretical analysis that connects event data and schlieren.
arXiv Detail & Related papers (2023-11-01T10:57:20Z) - An Asynchronous Linear Filter Architecture for Hybrid Event-Frame Cameras [9.69495347826584]
We present an asynchronous linear filter architecture, fusing event and frame camera data, for HDR video reconstruction and spatial convolution.
The proposed AKF pipeline outperforms other state-of-the-art methods in both absolute intensity error (69.4% reduction) and image similarity indexes (average 35.5% improvement).
arXiv Detail & Related papers (2023-09-03T12:37:59Z) - Revisiting Event-based Video Frame Interpolation [49.27404719898305]
Dynamic vision sensors or event cameras provide rich complementary information for video frame interpolation.
Estimating optical flow from events is arguably more difficult than from RGB information.
We propose a divide-and-conquer strategy in which event-based intermediate frame synthesis happens incrementally in multiple simplified stages.
arXiv Detail & Related papers (2023-07-24T06:51:07Z) - EVREAL: Towards a Comprehensive Benchmark and Analysis Suite for Event-based Video Reconstruction [16.432164340779266]
Event cameras offer advantages over traditional frame-based cameras such as high dynamic range and minimal motion blur.
Their output is not easily understandable by humans, making reconstruction of intensity images from event streams a fundamental task in event-based vision.
Recent deep learning-based methods have shown promise in video reconstruction from events, but this problem is not completely solved yet.
arXiv Detail & Related papers (2023-04-30T09:28:38Z) - MEFNet: Multi-scale Event Fusion Network for Motion Deblurring [62.60878284671317]
Traditional frame-based cameras inevitably suffer from motion blur due to long exposure times.
As a kind of bio-inspired camera, the event camera records the intensity changes in an asynchronous way with high temporal resolution.
In this paper, we rethink the event-based image deblurring problem and unfold it into an end-to-end two-stage image restoration network.
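The event-aided deblurring setting above can be sketched with a widely used double-integral formulation (not necessarily the exact model of the paper summarized here): a blurry frame is the temporal average of latent sharp frames, each of which is the reference frame scaled by the exponential of contrast-weighted integrated events. Variable names and the contrast value below are illustrative assumptions.

```python
import numpy as np

def sharp_from_blur(blurry, event_integrals, contrast=0.2):
    """Event-based double-integral sketch: a blurry frame B is the
    temporal average of latent frames L(t) = L0 * exp(c * E(t)),
    where E(t) integrates event polarities from the reference
    instant. Hence L0 = B / mean_t exp(c * E(t))."""
    weights = np.exp(contrast * event_integrals)  # shape (T, H, W)
    return blurry / weights.mean(axis=0)          # reference frame L0
```

Given per-pixel event integrals sampled at T instants inside the exposure, dividing the blur by the mean exponential weight recovers the latent frame at the reference instant; learned two-stage networks like the one above refine this closed-form estimate.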
arXiv Detail & Related papers (2021-11-30T23:18:35Z) - Learning to Detect Objects with a 1 Megapixel Event Camera [14.949946376335305]
Event cameras encode visual information with high temporal precision, low data-rate, and high-dynamic range.
Due to the novelty of the field, the performance of event-based systems on many vision tasks is still lower compared to conventional frame-based solutions.
arXiv Detail & Related papers (2020-09-28T16:03:59Z) - Reducing the Sim-to-Real Gap for Event Cameras [64.89183456212069]
Event cameras are paradigm-shifting novel sensors that report asynchronous, per-pixel brightness changes called 'events' with unparalleled low latency.
Recent work has demonstrated impressive results using Convolutional Neural Networks (CNNs) for video reconstruction and optic flow with events.
We present strategies for improving training data for event based CNNs that result in 20-40% boost in performance of existing video reconstruction networks.
arXiv Detail & Related papers (2020-03-20T02:44:29Z)
This list is automatically generated from the titles and abstracts of the papers in this site.