An Event-based Fast Intensity Reconstruction Scheme for UAV Real-time Perception
- URL: http://arxiv.org/abs/2508.02238v1
- Date: Mon, 04 Aug 2025 09:37:00 GMT
- Title: An Event-based Fast Intensity Reconstruction Scheme for UAV Real-time Perception
- Authors: Xin Dong, Yiwei Zhang, Yangjie Cui, Jinwu Xiang, Daochun Li, Zhan Tu
- Abstract summary: Event cameras offer a wide dynamic range, high temporal resolution, and immunity to motion blur. Extracting and utilizing effective information from asynchronous event streams is essential for the onboard implementation of event cameras. We propose an event-based intensity reconstruction scheme, event-based single integration (ESI), to address such implementation challenges.
- Score: 14.744985063618884
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Event cameras offer significant advantages, including a wide dynamic range, high temporal resolution, and immunity to motion blur, making them highly promising for challenging visual conditions. Extracting and utilizing effective information from asynchronous event streams is essential for the onboard implementation of event cameras. In this paper, we propose a streamlined event-based intensity reconstruction scheme, event-based single integration (ESI), to address such implementation challenges. This method guarantees the portability of conventional frame-based vision methods to event-based scenarios while maintaining the intrinsic advantages of event cameras. ESI reconstructs intensity images by performing a single integration of the event streams combined with an enhanced decay algorithm, which enables real-time intensity reconstruction at a high frame rate, typically 100 FPS. Furthermore, the relatively low computation load of ESI suits onboard implementation, such as UAV-based visual tracking scenarios. Extensive experiments compare ESI against state-of-the-art algorithms: ESI demonstrates remarkable runtime efficiency improvements, superior reconstruction quality, and a high frame rate, significantly enhancing UAV onboard perception in visually adverse surroundings. In in-flight tests, ESI achieves effective UAV onboard visual tracking under extremely low illumination (2-10 lux), whereas comparative algorithms fail due to insufficient frame rate, poor image quality, or limited real-time performance.
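The abstract specifies ESI only at a high level: each event is integrated once into a per-pixel intensity estimate, a decay term keeps the accumulated state from drifting, and frames are emitted at roughly 100 FPS. The sketch below is a minimal illustration of that idea in Python, assuming a plain per-pixel exponential decay and a fixed per-event contrast step; the function name, parameter values, and tone mapping are assumptions for illustration, not the authors' implementation (the paper's "enhanced decay" is more elaborate than this).

```python
import numpy as np

def esi_reconstruct(events, height, width, frame_interval=0.01,
                    contrast=0.1, decay_rate=1.0):
    """Single-integration intensity reconstruction with exponential decay.

    events: iterable of (t, x, y, p) with t in seconds and p in {-1, +1}.
    frame_interval: 0.01 s corresponds to the ~100 FPS quoted in the paper.
    contrast, decay_rate: assumed values, not taken from the paper.
    """
    log_I = np.zeros((height, width), dtype=np.float32)   # log-intensity state
    last_t = np.zeros((height, width), dtype=np.float32)  # last update per pixel
    next_frame_t, frames = frame_interval, []

    for t, x, y, p in events:
        # Decay the pixel's accumulated state over the time since its last
        # event, pulling stale pixels back toward neutral and limiting drift.
        log_I[y, x] *= np.exp(-decay_rate * (t - last_t[y, x]))
        last_t[y, x] = t
        # Single integration: exactly one additive update per event.
        log_I[y, x] += contrast * p

        if t >= next_frame_t:
            # Tone-map the log-intensity state into an 8-bit frame.
            frame = np.clip(128.0 * (1.0 + np.tanh(log_I)), 0, 255)
            frames.append(frame.astype(np.uint8))
            next_frame_t += frame_interval

    return frames
```

Because each event triggers only one multiply-accumulate on a single pixel, the per-event cost is constant, which is consistent with the low onboard computation load the abstract claims.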
Related papers
- Event Signal Filtering via Probability Flux Estimation [58.31652473933809]
Events offer a novel paradigm for capturing scene dynamics via asynchronous sensing, but their inherent randomness often leads to degraded signal quality. Event signal filtering is thus essential for enhancing fidelity by reducing this internal randomness and ensuring consistent outputs across diverse acquisition conditions. This paper introduces a generative, online filtering framework called Event Density Flow Filter (EDFilter). Experiments validate EDFilter's performance across tasks like event filtering, super-resolution, and direct event-based blob tracking.
arXiv Detail & Related papers (2025-04-10T07:03:08Z)
- SuperEIO: Self-Supervised Event Feature Learning for Event Inertial Odometry [6.552812892993662]
Event cameras asynchronously output low-latency event streams, promising for state estimation in high-speed motion and challenging lighting conditions. We propose SuperEIO, a novel framework that leverages learning-based event-only detection and IMU measurements to achieve event-inertial odometry. We evaluate our method extensively on multiple public datasets, demonstrating its superior accuracy and robustness compared to other state-of-the-art event-based methods.
arXiv Detail & Related papers (2025-03-29T03:58:15Z)
- EventSplat: 3D Gaussian Splatting from Moving Event Cameras for Real-time Rendering [7.392798832833857]
Event cameras offer exceptional temporal resolution and a high dynamic range. We introduce a method for using event camera data in novel view synthesis via Gaussian Splatting.
arXiv Detail & Related papers (2024-12-10T08:23:58Z)
- Self-supervised Event-based Monocular Depth Estimation using Cross-modal Consistency [18.288912105820167]
We propose a self-supervised event-based monocular depth estimation framework named EMoDepth.
EMoDepth constrains the training process using the cross-modal consistency from intensity frames that are aligned with events in the pixel coordinate.
In inference, only events are used for monocular depth prediction.
arXiv Detail & Related papers (2024-01-14T07:16:52Z)
- EventAid: Benchmarking Event-aided Image/Video Enhancement Algorithms with Real-captured Hybrid Dataset [55.12137324648253]
Event cameras are an emerging imaging technology that offers advantages over conventional frame-based imaging sensors in dynamic range and sensing speed.
This paper focuses on five event-aided image and video enhancement tasks.
arXiv Detail & Related papers (2023-12-13T15:42:04Z)
- Implicit Event-RGBD Neural SLAM [54.74363487009845]
Implicit neural SLAM has achieved remarkable progress recently.
Existing methods face significant challenges in non-ideal scenarios.
We propose EN-SLAM, the first event-RGBD implicit neural SLAM framework.
arXiv Detail & Related papers (2023-11-18T08:48:58Z)
- Revisiting Event-based Video Frame Interpolation [49.27404719898305]
Dynamic vision sensors or event cameras provide rich complementary information for video frame interpolation.
Estimating optical flow from events is arguably more difficult than from RGB information.
We propose a divide-and-conquer strategy in which event-based intermediate frame synthesis happens incrementally in multiple simplified stages.
arXiv Detail & Related papers (2023-07-24T06:51:07Z)
- Ego-motion Estimation Based on Fusion of Images and Events [0.0]
The event camera is a novel bio-inspired vision sensor that outputs an event stream.
We propose a novel data fusion algorithm called EAS to fuse conventional intensity images with the event stream.
arXiv Detail & Related papers (2022-07-12T15:10:28Z)
- Asynchronous Optimisation for Event-based Visual Odometry [53.59879499700895]
Event cameras open up new possibilities for robotic perception due to their low latency and high dynamic range.
We focus on event-based visual odometry (VO).
We propose an asynchronous structure-from-motion optimisation back-end.
arXiv Detail & Related papers (2022-03-02T11:28:47Z)
- Learning Monocular Dense Depth from Events [53.078665310545745]
Event cameras report brightness changes in the form of a stream of asynchronous events instead of intensity frames.
Recent learning-based approaches have been applied to event-based data for tasks such as monocular depth prediction.
We propose a recurrent architecture to solve this task and show significant improvement over standard feed-forward methods.
arXiv Detail & Related papers (2020-10-16T12:36:23Z)
- Event-based visual place recognition with ensembles of temporal windows [29.6328152991222]
Event cameras are bio-inspired sensors capable of providing a continuous stream of events with low latency and high dynamic range.
We develop a novel ensemble-based scheme for combining temporal windows of varying lengths that are processed in parallel.
We show that our proposed ensemble scheme significantly outperforms all single-window baselines and conventional model-based ensembles.
arXiv Detail & Related papers (2020-05-22T05:33:35Z)