Dogfight: Detecting Drones from Drones Videos
- URL: http://arxiv.org/abs/2103.17242v1
- Date: Wed, 31 Mar 2021 17:43:31 GMT
- Title: Dogfight: Detecting Drones from Drones Videos
- Authors: Muhammad Waseem Ashraf, Waqas Sultani, Mubarak Shah
- Abstract summary: This paper addresses the problem of detecting drones from other flying drones.
The erratic movement of the source and target drones, their small size, arbitrary shape, large intensity variations, and occlusion make this problem quite challenging.
To handle this, instead of using region-proposal based methods, we propose to use a two-stage segmentation-based approach.
- Score: 58.158988162743825
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: As airborne vehicles are becoming more autonomous and ubiquitous, it has
become vital to develop the capability to detect the objects in their
surroundings. This paper attempts to address the problem of drone detection
from other flying drones. The erratic movement of the source and target drones,
small size, arbitrary shape, large intensity variations, and occlusion make
this problem quite challenging. In this scenario, region-proposal based methods
are not able to capture sufficient discriminative foreground-background
information. Also, due to the extremely small size and complex motion of the
source and target drones, feature aggregation based methods are unable to
perform well. To handle this, instead of using region-proposal based methods,
we propose to use a two-stage segmentation-based approach employing
spatio-temporal attention cues. During the first stage, given the overlapping
frame regions, detailed contextual information is captured over convolution
feature maps using pyramid pooling. After that, pixel- and channel-wise attention
is enforced on the feature maps to ensure accurate drone localization. In the
second stage, first stage detections are verified and new probable drone
locations are explored. To discover new drone locations, motion boundaries are
used. This is followed by tracking candidate drone detections for a few frames,
cuboid formation, extraction of the 3D convolution feature map, and drones
detection within each cuboid. The proposed approach is evaluated on two
publicly available drone detection datasets and outperforms several competitive
baselines.
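To make the first stage more concrete, the sketch below shows one plausible way to realize pyramid pooling followed by channel-wise and pixel-wise attention over backbone feature maps. It is a minimal, hypothetical PyTorch example rather than the authors' released code; the module names, channel sizes, pooling bins, and input shapes are assumptions.

```python
# Hypothetical first-stage head (assumption, not the paper's exact architecture):
# pyramid pooling for context, then channel-wise and pixel-wise attention,
# then a per-pixel drone/background classifier.
import torch
import torch.nn as nn
import torch.nn.functional as F


class PyramidPooling(nn.Module):
    """Pool the feature map at several scales and concatenate the upsampled results."""

    def __init__(self, in_ch, bins=(1, 2, 3, 6)):
        super().__init__()
        self.stages = nn.ModuleList([
            nn.Sequential(
                nn.AdaptiveAvgPool2d(b),
                nn.Conv2d(in_ch, in_ch // len(bins), kernel_size=1, bias=False),
                nn.ReLU(inplace=True),
            )
            for b in bins
        ])

    def forward(self, x):
        h, w = x.shape[2:]
        pooled = [
            F.interpolate(stage(x), size=(h, w), mode="bilinear", align_corners=False)
            for stage in self.stages
        ]
        return torch.cat([x] + pooled, dim=1)  # context-enriched feature map


class ChannelPixelAttention(nn.Module):
    """Channel-wise attention followed by pixel-wise (spatial) attention."""

    def __init__(self, ch):
        super().__init__()
        self.channel_fc = nn.Sequential(
            nn.Linear(ch, ch // 8), nn.ReLU(inplace=True),
            nn.Linear(ch // 8, ch), nn.Sigmoid(),
        )
        self.pixel_conv = nn.Sequential(nn.Conv2d(ch, 1, kernel_size=7, padding=3), nn.Sigmoid())

    def forward(self, x):
        w = self.channel_fc(x.mean(dim=(2, 3)))  # (N, C) channel weights
        x = x * w[:, :, None, None]              # re-weight channels
        return x * self.pixel_conv(x)            # emphasize likely drone pixels


class FirstStageHead(nn.Module):
    """Pyramid pooling + attention + per-pixel drone/background classifier."""

    def __init__(self, in_ch=256):
        super().__init__()
        self.ppm = PyramidPooling(in_ch)
        self.attn = ChannelPixelAttention(in_ch * 2)  # in_ch + 4 * (in_ch // 4)
        self.classifier = nn.Conv2d(in_ch * 2, 1, kernel_size=1)

    def forward(self, feats):
        return self.classifier(self.attn(self.ppm(feats)))


if __name__ == "__main__":
    feats = torch.randn(1, 256, 64, 64)   # backbone features of one frame region
    logits = FirstStageHead()(feats)      # (1, 1, 64, 64) segmentation logits
    print(logits.shape)
```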
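The second stage reasons over short tracks rather than single frames. The following hypothetical sketch illustrates the cuboid step described in the abstract: a candidate location tracked over a few frames is cropped from each frame, the crops are stacked into a spatio-temporal cuboid, and a small 3D-convolutional classifier scores the cuboid. The patch size, clip length, and layer widths are illustrative assumptions.

```python
# Hypothetical second-stage verification (assumption): stack tracked candidate
# patches into a cuboid and classify it with a small 3D-convolutional network.
import torch
import torch.nn as nn


class CuboidClassifier(nn.Module):
    """3D-conv classifier over a (C, T, H, W) cuboid cropped around a candidate."""

    def __init__(self, in_ch=3):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv3d(in_ch, 32, kernel_size=3, padding=1), nn.ReLU(inplace=True),
            nn.MaxPool3d((1, 2, 2)),
            nn.Conv3d(32, 64, kernel_size=3, padding=1), nn.ReLU(inplace=True),
            nn.AdaptiveAvgPool3d(1),
        )
        self.fc = nn.Linear(64, 1)  # drone / not-drone logit

    def forward(self, cuboid):                  # cuboid: (N, C, T, H, W)
        return self.fc(self.features(cuboid).flatten(1))


def build_cuboid(frames, corners, patch=64):
    """Crop the tracked candidate patch from each frame and stack along time."""
    patches = []
    for frame, (x, y) in zip(frames, corners):  # frame: (C, H, W); (x, y): top-left corner
        crop = frame.new_zeros(frame.shape[0], patch, patch)
        region = frame[:, y:y + patch, x:x + patch]
        crop[:, :region.shape[1], :region.shape[2]] = region  # zero-pad near borders
        patches.append(crop)
    return torch.stack(patches, dim=1)          # (C, T, patch, patch)


if __name__ == "__main__":
    clip = [torch.rand(3, 480, 640) for _ in range(8)]   # 8 consecutive frames
    track = [(100 + t, 200) for t in range(8)]            # tracked top-left corners
    cuboid = build_cuboid(clip, track).unsqueeze(0)       # (1, 3, 8, 64, 64)
    score = CuboidClassifier()(cuboid)
    print(torch.sigmoid(score))                            # probability the cuboid contains a drone
```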
Related papers
- Drone-type-Set: Drone types detection benchmark for drone detection and tracking [0.6294091730968154]
In this paper, we provide a dataset of various drones as well as a comparison of recognized object detection models.
The experimental results of different models are provided along with a description of each method.
arXiv Detail & Related papers (2024-05-16T18:56:46Z) - C2FDrone: Coarse-to-Fine Drone-to-Drone Detection using Vision Transformer Networks [23.133250476580038]
A vision-based drone-to-drone detection system is crucial for various applications like collision avoidance, countering hostile drones, and search-and-rescue operations.
However, detecting drones presents unique challenges, including small object sizes, distortion, and real-time processing requirements.
We propose a novel coarse-to-fine detection strategy based on vision transformers.
arXiv Detail & Related papers (2024-04-30T05:51:21Z) - A Vision-based Autonomous Perching Approach for Nano Aerial Vehicles [0.0]
A vision-based autonomous perching approach for nano quadcopters is proposed.
The drone can successfully perch on the center of markers within two centimeters of precision.
arXiv Detail & Related papers (2023-06-16T02:34:50Z) - Unauthorized Drone Detection: Experiments and Prototypes [0.8294692832460543]
We present a novel encryption-based drone detection scheme that uses a two-stage verification of the drone's received signal strength indicator (RSSI) and the encryption key generated from the drone's position coordinates.
arXiv Detail & Related papers (2022-12-02T20:43:29Z) - TransVisDrone: Spatio-Temporal Transformer for Vision-based
Drone-to-Drone Detection in Aerial Videos [57.92385818430939]
Drone-to-drone detection using visual feed has crucial applications, such as detecting drone collisions, detecting drone attacks, or coordinating flight with other drones.
Existing methods are computationally costly, follow non-end-to-end optimization, and have complex multi-stage pipelines, making them less suitable for real-time deployment on edge devices.
We propose a simple yet effective framework, TransVisDrone, that provides an end-to-end solution with higher computational efficiency.
arXiv Detail & Related papers (2022-10-16T03:05:13Z) - A dataset for multi-sensor drone detection [67.75999072448555]
The use of small and remotely controlled unmanned aerial vehicles (UAVs) has increased in recent years.
Most studies on drone detection fail to specify the type of acquisition device, the drone type, the detection range, or the dataset.
We contribute with an annotated multi-sensor database for drone detection that includes infrared and visible videos and audio files.
arXiv Detail & Related papers (2021-11-02T20:52:03Z) - Vision-based Drone Flocking in Outdoor Environments [9.184987303791292]
This letter proposes a vision-based detection and tracking algorithm for drone swarms.
We employ a convolutional neural network onboard the quadcopters to detect and localize nearby agents in real time.
We show that the drones can safely navigate in an outdoor environment despite substantial background clutter and difficult lighting conditions.
arXiv Detail & Related papers (2020-12-02T14:44:40Z) - Real-Time Drone Detection and Tracking With Visible, Thermal and
Acoustic Sensors [66.4525391417921]
A thermal infrared camera is shown to be a feasible solution to the drone detection task.
The detector performance as a function of the sensor-to-target distance is also investigated.
A novel video dataset containing 650 annotated infrared and visible videos of drones, birds, airplanes and helicopters is also presented.
arXiv Detail & Related papers (2020-07-14T23:06:42Z) - University-1652: A Multi-view Multi-source Benchmark for Drone-based
Geo-localization [87.74121935246937]
We introduce a new multi-view benchmark for drone-based geo-localization, named University-1652.
University-1652 contains data from three platforms, i.e., synthetic drones, satellites and ground cameras of 1,652 university buildings around the world.
Experiments show that University-1652 helps the model to learn the viewpoint-invariant features and also has good generalization ability in the real-world scenario.
arXiv Detail & Related papers (2020-02-27T15:24:15Z) - Detection and Tracking Meet Drones Challenge [131.31749447313197]
This paper presents a review of object detection and tracking datasets and benchmarks, and discusses the challenges of collecting large-scale drone-based object detection and tracking datasets with manual annotations.
We describe our VisDrone dataset, which is captured over various urban/suburban areas of 14 different cities across China from North to South.
We provide a detailed analysis of the current state of the field of large-scale object detection and tracking on drones, and conclude the challenge as well as propose future directions.
arXiv Detail & Related papers (2020-01-16T00:11:56Z)
This list is automatically generated from the titles and abstracts of the papers in this site.