From Events to Enhancement: A Survey on Event-Based Imaging Technologies
- URL: http://arxiv.org/abs/2505.05488v1
- Date: Wed, 30 Apr 2025 00:42:06 GMT
- Title: From Events to Enhancement: A Survey on Event-Based Imaging Technologies
- Authors: Yunfan Lu, Xiaogang Xu, Pengteng Li, Yusheng Wang, Yi Cui, Huizai Yao, Hui Xiong,
- Abstract summary: Event cameras offering high dynamic range and low latency have emerged as disruptive technologies in imaging. Despite growing research on leveraging these benefits for different imaging tasks, a comprehensive study of recent advances and challenges is still lacking. In this survey, we first introduce a physical model and the characteristics of different event sensors as the foundation. Following this, we highlight the advancement and interaction of image/video enhancement tasks with events.
- Score: 25.91883220911079
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Event cameras, offering high dynamic range and low latency, have emerged as disruptive technologies in imaging. Despite growing research on leveraging these benefits for different imaging tasks, a comprehensive study of recent advances and challenges is still lacking. This limits the broader understanding of how to utilize events in universal imaging applications. In this survey, we first introduce a physical model and the characteristics of different event sensors as the foundation. Following this, we highlight the advancement and interaction of image/video enhancement tasks with events. Additionally, we explore advanced tasks that capture richer light information with events, e.g., light field estimation, multi-view generation, and photometric. Finally, we discuss new challenges and open questions, offering a perspective for this rapidly evolving field. Continuously updated resources are available at: https://github.com/yunfanLu/Awesome-Event-Imaging
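The physical model the survey builds on is the standard event-generation rule: a pixel asynchronously fires an event whenever its log intensity has changed by more than a contrast threshold since that pixel's last event. A minimal simulation sketch of this rule (the threshold value, function name, and toy frame data are illustrative assumptions, not taken from the survey):

```python
import numpy as np

def generate_events(log_frames, timestamps, C=0.2):
    """Simulate events from a stack of log-intensity frames (T, H, W).

    A pixel emits an event with polarity +1/-1 when its log intensity
    deviates from its per-pixel reference by at least the contrast
    threshold C; the reference then steps by one threshold toward the
    new value. Returns a list of (t, y, x, polarity) tuples.
    """
    ref = log_frames[0].copy()  # per-pixel reference log intensity
    events = []
    for t, frame in zip(timestamps[1:], log_frames[1:]):
        diff = frame - ref
        ys, xs = np.nonzero(np.abs(diff) >= C)
        for y, x in zip(ys, xs):
            pol = 1 if diff[y, x] > 0 else -1
            events.append((t, int(y), int(x), pol))
            ref[y, x] += pol * C  # advance reference by one threshold step
    return events

# Toy example: a single pixel brightening steadily across three frames.
frames = np.log(np.array([[[1.0]], [[1.5]], [[2.5]]]))
evts = generate_events(frames, timestamps=[0.0, 0.01, 0.02], C=0.2)
```

Note that a real sensor can fire several events between two sampling instants when the intensity change spans multiple thresholds; this sketch emits at most one event per pixel per frame for brevity.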
Related papers
- Event-based Solutions for Human-centered Applications: A Comprehensive Review [3.112384742740621]
Event cameras capture changes in light intensity asynchronously, offering exceptional temporal resolution and energy efficiency.
Despite growing interest, research in human-centered applications of event cameras remains scattered.
This survey bridges that gap by being the first to unify these domains.
arXiv Detail & Related papers (2025-02-17T13:15:19Z)
- Infrared and Visible Image Fusion: From Data Compatibility to Task Adaption [65.06388526722186]
Infrared-visible image fusion is a critical task in computer vision.
There is a lack of recent comprehensive surveys that address this rapidly expanding domain.
We introduce a multi-dimensional framework to elucidate common learning-based IVIF methods.
arXiv Detail & Related papers (2025-01-18T13:17:34Z)
- Recent Event Camera Innovations: A Survey [44.34401412004975]
Event-based vision, inspired by the human visual system, offers transformative capabilities such as low latency, high dynamic range, and reduced power consumption.
This paper presents a comprehensive survey of event cameras, tracing their evolution over time.
The survey covers various event camera models from leading manufacturers, key technological milestones, and influential research contributions.
arXiv Detail & Related papers (2024-08-24T16:48:25Z)
- Research, Applications and Prospects of Event-Based Pedestrian Detection: A Survey [10.494414329120909]
Event-based cameras, inspired by the biological retina, have evolved into cutting-edge sensors distinguished by their minimal power requirements, negligible latency, superior temporal resolution, and expansive dynamic range.
Event-based cameras address the limitations of conventional frame-based sensors by eschewing extraneous data transmission and avoiding motion blur in high-speed imaging scenarios.
This paper offers an exhaustive review of research and applications particularly in the autonomous driving context.
arXiv Detail & Related papers (2024-07-05T06:17:00Z)
- Deformable Neural Radiance Fields using RGB and Event Cameras [65.40527279809474]
We develop a novel method to model the deformable neural radiance fields using RGB and event cameras.
The proposed method uses the asynchronous stream of events and sparse RGB frames.
Experiments conducted on both realistically rendered graphics and real-world datasets demonstrate a significant benefit of the proposed method.
arXiv Detail & Related papers (2023-09-15T14:19:36Z)
- Event-based Simultaneous Localization and Mapping: A Comprehensive Survey [52.73728442921428]
This review examines event-based vSLAM algorithms that exploit the benefits of asynchronous and irregular event streams for localization and mapping tasks.
The paper categorizes event-based vSLAM methods into four main categories: feature-based, direct, motion-compensation, and deep learning methods.
arXiv Detail & Related papers (2023-04-19T16:21:14Z)
- Deep Learning for Event-based Vision: A Comprehensive Survey and Benchmarks [55.81577205593956]
Event cameras are bio-inspired sensors that capture the per-pixel intensity changes asynchronously.
Deep learning (DL) has been brought to this emerging field and inspired active research endeavors in mining its potential.
arXiv Detail & Related papers (2023-02-17T14:19:28Z)
- Matching Neuromorphic Events and Color Images via Adversarial Learning [49.447580124957966]
We propose the Event-Based Image Retrieval (EBIR) problem as a cross-modal matching task.
We address the EBIR problem by proposing neuromorphic Events-Color image Feature Learning (ECFL).
We also contribute the N-UKbench and EC180 datasets to the community to promote development on the EBIR problem.
arXiv Detail & Related papers (2020-03-02T02:48:56Z)
This list is automatically generated from the titles and abstracts of the papers on this site.