Research, Applications and Prospects of Event-Based Pedestrian Detection: A Survey
- URL: http://arxiv.org/abs/2407.04277v1
- Date: Fri, 5 Jul 2024 06:17:00 GMT
- Title: Research, Applications and Prospects of Event-Based Pedestrian Detection: A Survey
- Authors: Han Wang, Yuman Nie, Yun Li, Hongjie Liu, Min Liu, Wen Cheng, Yaoxiong Wang
- Abstract summary: Event-based cameras, inspired by the biological retina, have evolved into cutting-edge sensors distinguished by their minimal power requirements, negligible latency, superior temporal resolution, and expansive dynamic range.
Event-based cameras address the slow response times and data redundancy of frame-based sensors by eschewing extraneous data transmissions and obviating motion blur in high-speed imaging scenarios.
This paper offers an exhaustive review of research and applications, particularly in the autonomous driving context.
- Score: 10.494414329120909
- License: http://creativecommons.org/licenses/by-nc-sa/4.0/
- Abstract: Event-based cameras, inspired by the biological retina, have evolved into cutting-edge sensors distinguished by their minimal power requirements, negligible latency, superior temporal resolution, and expansive dynamic range. At present, cameras used for pedestrian detection are mainly frame-based imaging sensors, which have suffered from lethargic response times and hefty data redundancy. In contrast, event-based cameras address these limitations by eschewing extraneous data transmissions and obviating motion blur in high-speed imaging scenarios. On pedestrian detection via event-based cameras, this paper offers an exhaustive review of research and applications, particularly in the autonomous driving context. Through methodically scrutinizing relevant literature, the paper outlines the foundational principles, developmental trajectory, and the comparative merits and demerits of event-based detection relative to traditional frame-based methodologies. This review conducts thorough analyses of various event stream inputs and their corresponding network models to evaluate their applicability across diverse operational environments. It also delves into pivotal elements such as crucial datasets and data acquisition techniques essential for advancing this technology, as well as advanced algorithms for processing event stream data. Culminating with a synthesis of the extant landscape, the review accentuates the unique advantages and persistent challenges inherent in event-based pedestrian detection, offering a prognostic view on potential future developments in this fast-progressing field.
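The abstract's mention of "various event stream inputs" can be made concrete with a small sketch. Event cameras emit a stream of (x, y, timestamp, polarity) tuples rather than frames, and a common preprocessing step before feeding a conventional detection network is to accumulate a time window of events into a signed 2D histogram. The function and field layout below are illustrative assumptions, not a representation taken from the surveyed papers:

```python
import numpy as np

def events_to_frame(events, height, width):
    """Accumulate a chunk of events into a signed 2D event-count frame.

    `events` is an (N, 4) array of (x, y, t, p) rows, where polarity
    p is +1 (brightness increase) or -1 (brightness decrease).
    """
    frame = np.zeros((height, width), dtype=np.float32)
    x = events[:, 0].astype(int)
    y = events[:, 1].astype(int)
    p = events[:, 3]
    # ufunc.at performs unbuffered in-place addition, so repeated
    # events at the same pixel all contribute to the count.
    np.add.at(frame, (y, x), p)
    return frame

# Example: three events on a 4x4 sensor
events = np.array([
    [1, 2, 0.001, +1.0],
    [1, 2, 0.002, +1.0],
    [3, 0, 0.003, -1.0],
])
frame = events_to_frame(events, height=4, width=4)
```

This frame-like representation is the simplest of the input encodings the survey discusses; richer encodings additionally preserve timing information.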
Related papers
- Event-based Sensor Fusion and Application on Odometry: A Survey [2.3717744547851627]
Event cameras offer advantages in environments characterized by high-speed motion, low lighting, or wide dynamic range.
These properties render event cameras particularly effective for sensor fusion in robotics and computer vision.
arXiv Detail & Related papers (2024-10-20T19:32:25Z) - Descriptor: Face Detection Dataset for Programmable Threshold-Based Sparse-Vision [0.8271394038014485]
This dataset is an annotated, temporal-threshold-based vision dataset for face detection tasks derived from the same videos used for Aff-Wild2.
We anticipate that this resource will significantly support the development of robust vision systems based on smart sensors that can process based on temporal-difference thresholds.
arXiv Detail & Related papers (2024-10-01T03:42:03Z) - DailyDVS-200: A Comprehensive Benchmark Dataset for Event-Based Action Recognition [51.96660522869841]
DailyDVS-200 is a benchmark dataset tailored for the event-based action recognition community.
It covers 200 action categories across real-world scenarios, recorded by 47 participants, and comprises more than 22,000 event sequences.
DailyDVS-200 is annotated with 14 attributes, ensuring a detailed characterization of the recorded actions.
arXiv Detail & Related papers (2024-07-06T15:25:10Z) - Deep Event-based Object Detection in Autonomous Driving: A Survey [7.197775088663435]
Event cameras have emerged as promising sensors for autonomous driving due to their low latency, high dynamic range, and low power consumption.
This paper provides an overview of object detection using event data in autonomous driving, showcasing the competitive benefits of event cameras.
arXiv Detail & Related papers (2024-05-07T04:17:04Z) - OOSTraj: Out-of-Sight Trajectory Prediction With Vision-Positioning Denoising [49.86409475232849]
Trajectory prediction is fundamental in computer vision and autonomous driving.
Existing approaches in this field often assume precise and complete observational data.
We present a novel method for out-of-sight trajectory prediction that leverages a vision-positioning technique.
arXiv Detail & Related papers (2024-04-02T18:30:29Z) - SpikeMOT: Event-based Multi-Object Tracking with Sparse Motion Features [52.213656737672935]
SpikeMOT is an event-based multi-object tracker.
SpikeMOT uses spiking neural networks to extract sparse spatiotemporal features from event streams associated with objects.
arXiv Detail & Related papers (2023-09-29T05:13:43Z) - On the Generation of a Synthetic Event-Based Vision Dataset for Navigation and Landing [69.34740063574921]
This paper presents a methodology for generating event-based vision datasets from optimal landing trajectories.
We construct sequences of photorealistic images of the lunar surface with the Planet and Asteroid Natural Scene Generation Utility.
We demonstrate that the pipeline can generate realistic event-based representations of surface features by constructing a dataset of 500 trajectories.
arXiv Detail & Related papers (2023-08-01T09:14:20Z) - FEDORA: Flying Event Dataset fOr Reactive behAvior [9.470870778715689]
Event-based sensors have emerged as low latency and low energy alternatives to standard frame-based cameras for capturing high-speed motion.
We present Flying Event dataset fOr Reactive behAviour (FEDORA) - a fully synthetic dataset for perception tasks.
arXiv Detail & Related papers (2023-05-22T22:59:05Z) - Event-based Simultaneous Localization and Mapping: A Comprehensive Survey [52.73728442921428]
This survey reviews event-based vSLAM algorithms that exploit the benefits of asynchronous and irregular event streams for localization and mapping tasks.
It categorizes event-based vSLAM methods into four main categories: feature-based, direct, motion-compensation, and deep learning methods.
arXiv Detail & Related papers (2023-04-19T16:21:14Z) - Learning Monocular Dense Depth from Events [53.078665310545745]
Event cameras report brightness changes as a stream of asynchronous events instead of intensity frames.
Recent learning-based approaches have been applied to event-based data, such as monocular depth prediction.
We propose a recurrent architecture to solve this task and show significant improvement over standard feed-forward methods.
arXiv Detail & Related papers (2020-10-16T12:36:23Z)
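Learning-based pipelines such as the monocular depth work above typically need a representation that preserves event timing, which a single accumulated frame discards. A common choice is a time-binned voxel grid; the sketch below is a hedged illustration with an assumed bin count and naming, not the representation used in that paper:

```python
import numpy as np

def events_to_voxel_grid(events, num_bins, height, width):
    """Discretize an event stream into a (num_bins, H, W) voxel grid.

    Events are (x, y, t, p) rows; each event's polarity is deposited
    into the temporal bin containing its normalized timestamp.
    """
    grid = np.zeros((num_bins, height, width), dtype=np.float32)
    t = events[:, 2]
    # Normalize timestamps to [0, num_bins) over the chunk's duration
    t_norm = (t - t.min()) / max(t.max() - t.min(), 1e-9) * num_bins
    b = np.clip(t_norm.astype(int), 0, num_bins - 1)
    x = events[:, 0].astype(int)
    y = events[:, 1].astype(int)
    p = events[:, 3]
    np.add.at(grid, (b, y, x), p)
    return grid

events = np.array([
    [0, 0, 0.00, +1.0],
    [1, 1, 0.05, -1.0],
    [2, 2, 0.10, +1.0],
])
grid = events_to_voxel_grid(events, num_bins=2, height=3, width=3)
```

The temporal bins give a recurrent or convolutional network access to ordering within the window, which is what distinguishes this encoding from a plain event-count frame.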
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of this list (including all information) and is not responsible for any consequences.