Towards Real-Time Fast Unmanned Aerial Vehicle Detection Using Dynamic Vision Sensors
- URL: http://arxiv.org/abs/2403.11875v1
- Date: Mon, 18 Mar 2024 15:27:58 GMT
- Title: Towards Real-Time Fast Unmanned Aerial Vehicle Detection Using Dynamic Vision Sensors
- Authors: Jakub Mandula, Jonas Kühne, Luca Pascarella, Michele Magno
- Abstract summary: Unmanned Aerial Vehicles (UAVs) are gaining popularity in civil and military applications.
Prevention and detection of UAVs are therefore pivotal to guarantee confidentiality and safety.
This paper presents F-UAV-D (Fast Unmanned Aerial Vehicle Detector), an embedded system that enables fast-moving drone detection.
- Score: 6.03212980984729
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Unmanned Aerial Vehicles (UAVs) are gaining popularity in civil and military applications. However, uncontrolled access to restricted areas threatens privacy and security. Thus, prevention and detection of UAVs are pivotal to guarantee confidentiality and safety. Although active scanning, mainly based on radars, is one of the most accurate technologies, it can be expensive and less versatile than passive inspections, e.g., object recognition. Dynamic vision sensors (DVS) are bio-inspired event-based vision models that leverage timestamped pixel-level brightness changes in fast-moving scenes and adapt well to low-latency object detection. This paper presents F-UAV-D (Fast Unmanned Aerial Vehicle Detector), an embedded system that enables fast-moving drone detection. In particular, we propose a setup to exploit DVS as an alternative to RGB cameras in a real-time and low-power configuration. Our approach leverages the high-dynamic range (HDR) and background suppression of DVS and, when trained with various fast-moving drones, outperforms RGB input in suboptimal ambient conditions such as low illumination and fast-moving scenes. Our results show that F-UAV-D can (i) detect drones by using less than 15 W on average and (ii) perform real-time inference (i.e., <50 ms) by leveraging the CPU and GPU nodes of our edge computer.
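The abstract describes the pipeline only at a high level and no code accompanies this listing, so the sketch below is a minimal, hypothetical illustration of the preprocessing such a DVS-based detector implies: accumulating timestamped, pixel-level brightness-change events into fixed-time frames that a conventional detector can consume on an edge device. The (timestamp, x, y, polarity) event format, the 640x480 resolution, the 10 ms window, and the `detector` callable are all assumptions for illustration, not details taken from F-UAV-D.

```python
# Minimal sketch (not the authors' code): accumulate DVS events into a
# two-channel frame over a fixed time window, a common preprocessing step
# before feeding an event stream to a CNN/YOLO-style detector on an edge device.
# Assumed event format: (timestamp_us, x, y, polarity) with polarity in {0, 1}.
import numpy as np

SENSOR_W, SENSOR_H = 640, 480   # assumed sensor resolution
WINDOW_US = 10_000              # 10 ms accumulation window (assumption)

def events_to_frame(events, width=SENSOR_W, height=SENSOR_H):
    """Histogram OFF/ON events into an (H, W, 2) float frame."""
    frame = np.zeros((height, width, 2), dtype=np.float32)
    for _, x, y, pol in events:
        frame[y, x, int(pol)] += 1.0
    peak = frame.max()
    # Normalize so the detector sees a bounded input regardless of event rate.
    return frame / peak if peak > 0 else frame

def stream_inference(event_stream, detector):
    """Group events into fixed windows and run `detector` once per window.

    `detector` is a hypothetical callable taking an (H, W, 2) array and
    returning bounding boxes; it stands in for the deployed drone detector.
    """
    window, window_start = [], None
    for t_us, x, y, pol in event_stream:
        if window_start is None:
            window_start = t_us
        if t_us - window_start >= WINDOW_US:
            yield detector(events_to_frame(window))
            window, window_start = [], t_us
        window.append((t_us, x, y, pol))
    if window:                      # flush the final partial window
        yield detector(events_to_frame(window))
```

Under these assumptions, the window length directly bounds per-detection latency: shorter windows reduce latency at the cost of sparser frames, which is consistent with the sub-50 ms inference budget quoted in the abstract.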
Related papers
- Neuromorphic Drone Detection: an Event-RGB Multimodal Approach [25.26674905726921]
Neuromorphic cameras can retain precise and rich-temporal information in situations that are challenging for RGB cameras.
We present a novel model for integrating both domains together, leveraging multimodal data.
We also release NeRDD (Neuromorphic-RGB Drone Detection), a novel temporally synchronized Event-RGB drone detection dataset.
arXiv Detail & Related papers (2024-09-24T13:53:20Z)
- UAVDB: Trajectory-Guided Adaptable Bounding Boxes for UAV Detection [0.03464344220266879]
Patch Intensity Convergence (PIC) technique generates high-fidelity bounding boxes for UAV detection without manual labeling.
This technique forms the foundation of UAVDB, a dedicated database designed specifically for UAV detection.
We benchmark UAVDB using state-of-the-art (SOTA) YOLO series detectors, providing a comprehensive performance analysis.
arXiv Detail & Related papers (2024-09-09T13:27:53Z)
- Radar Fields: Frequency-Space Neural Scene Representations for FMCW Radar [62.51065633674272]
We introduce Radar Fields - a neural scene reconstruction method designed for active radar imagers.
Our approach unites an explicit, physics-informed sensor model with an implicit neural geometry and reflectance model to directly synthesize raw radar measurements.
We validate the effectiveness of the method across diverse outdoor scenarios, including urban scenes with dense vehicles and infrastructure.
arXiv Detail & Related papers (2024-05-07T20:44:48Z)
- Evidential Detection and Tracking Collaboration: New Problem, Benchmark and Algorithm for Robust Anti-UAV System [56.51247807483176]
Unmanned Aerial Vehicles (UAVs) have been widely used in many areas, including transportation, surveillance, and military.
Previous works have simplified such an anti-UAV task as a tracking problem, where prior information of UAVs is always provided.
In this paper, we first formulate a new and practical anti-UAV problem featuring UAV perception in complex scenes without prior UAV information.
arXiv Detail & Related papers (2023-06-27T19:30:23Z)
- TF-Net: Deep Learning Empowered Tiny Feature Network for Night-time UAV Detection [10.43480599406243]
This paper uses a deep learning-based TinyFeatureNet (TF-Net) to accurately detect UAVs during the night using infrared (IR) images.
The results showed better performance for the proposed TF-Net in terms of precision, IoU, GFLOPS, model size, and FPS.
arXiv Detail & Related papers (2022-11-29T15:58:36Z)
- TransVisDrone: Spatio-Temporal Transformer for Vision-based Drone-to-Drone Detection in Aerial Videos [57.92385818430939]
Drone-to-drone detection using visual feed has crucial applications, such as detecting drone collisions, detecting drone attacks, or coordinating flight with other drones.
Existing methods are computationally costly, follow non-end-to-end optimization, and have complex multi-stage pipelines, making them less suitable for real-time deployment on edge devices.
We propose a simple yet effective framework, TransVisDrone, that provides an end-to-end solution with higher computational efficiency.
arXiv Detail & Related papers (2022-10-16T03:05:13Z)
- Drone Detection and Tracking in Real-Time by Fusion of Different Sensing Modalities [66.4525391417921]
We design and evaluate a multi-sensor drone detection system.
Our solution also integrates a fish-eye camera to monitor a wider part of the sky and steer the other cameras towards objects of interest.
The thermal camera is shown to be a feasible solution, performing as well as the video camera even though the unit employed here has a lower resolution.
arXiv Detail & Related papers (2022-07-05T10:00:58Z)
- Fail-Safe Human Detection for Drones Using a Multi-Modal Curriculum Learning Approach [1.094245191265935]
We present KUL-UAVSAFE, a first-of-its-kind dataset for the study of safety-critical people detection by drones.
We propose a CNN architecture with cross-fusion highways and introduce a curriculum learning strategy for multi-modal data.
arXiv Detail & Related papers (2021-09-28T12:34:13Z)
- UAV-ReID: A Benchmark on Unmanned Aerial Vehicle Re-identification [21.48667873335246]
Recent development in deep learning allows vision-based counter-UAV systems to detect and track UAVs with a single camera.
The coverage of a single camera is limited, necessitating multi-camera configurations to match UAVs across cameras.
We propose the first UAV re-identification dataset, UAV-reID, which facilitates the development of machine learning solutions in this emerging area.
arXiv Detail & Related papers (2021-04-13T14:13:09Z)
- Perceiving Traffic from Aerial Images [86.994032967469]
We propose an object detection method called Butterfly Detector that is tailored to detect objects in aerial images.
We evaluate our Butterfly Detector on two publicly available UAV datasets (UAVDT and VisDrone 2019) and show that it outperforms previous state-of-the-art methods while remaining real-time.
arXiv Detail & Related papers (2020-09-16T11:37:43Z)
- Drone-based RGB-Infrared Cross-Modality Vehicle Detection via Uncertainty-Aware Learning [59.19469551774703]
Drone-based vehicle detection aims at finding the vehicle locations and categories in an aerial image.
We construct a large-scale drone-based RGB-Infrared vehicle detection dataset, termed DroneVehicle.
Our DroneVehicle collects 28,439 RGB-Infrared image pairs, covering urban roads, residential areas, parking lots, and other scenarios from day to night.
arXiv Detail & Related papers (2020-03-05T05:29:44Z)