3D Trajectory Reconstruction of Drones using a Single Camera
- URL: http://arxiv.org/abs/2309.02801v1
- Date: Wed, 6 Sep 2023 07:39:51 GMT
- Title: 3D Trajectory Reconstruction of Drones using a Single Camera
- Authors: Seobin Hwang, Hanyoung Kim, Chaeyeon Heo, Youkyoung Na, Cheongeun Lee, and Yeongjun Cho
- Abstract summary: We propose a novel framework for reconstructing 3D trajectories of drones using a single camera.
We automatically track the drones in 2D images using the drone tracker and estimate their 2D rotations.
To address the lack of public drone datasets, we also create synthetic 2D and 3D drone datasets.
- Score: 0.5937476291232799
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Drones have been widely utilized in various fields, but the number of drones
being used illegally and for hazardous purposes has increased recently. To
prevent those illegal drones, in this work, we propose a novel framework for
reconstructing 3D trajectories of drones using a single camera. By leveraging
calibrated cameras, we exploit the relationship between 2D and 3D spaces. We
automatically track the drones in 2D images using the drone tracker and
estimate their 2D rotations. By combining the estimated 2D drone positions with
their actual length information and camera parameters, we geometrically infer
the 3D trajectories of the drones. To address the lack of public drone
datasets, we also create synthetic 2D and 3D drone datasets. The experimental
results show that the proposed methods accurately reconstruct drone
trajectories in 3D space, and demonstrate the potential of our framework for
single camera-based surveillance systems.
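The geometric inference described in the abstract (combining the tracked 2D drone position, the drone's known physical length, and calibrated camera parameters) can be illustrated with a minimal pinhole-camera sketch. This is not the authors' implementation: the function name, intrinsics, and drone dimensions below are illustrative assumptions, and the paper additionally uses the estimated 2D rotation to relate the apparent pixel length to the drone's actual length.

```python
import numpy as np

def backproject_drone(bbox_center_px, apparent_length_px, drone_length_m, K):
    """Sketch of depth-from-known-size back-projection (pinhole model).

    bbox_center_px     : (u, v) pixel coordinates of the tracked drone center
    apparent_length_px : drone length in pixels (ideally corrected for the
                         estimated 2D rotation so it reflects the full span)
    drone_length_m     : physical drone length in meters (assumed known)
    K                  : 3x3 camera intrinsic matrix from calibration
    """
    fx, fy = K[0, 0], K[1, 1]
    cx, cy = K[0, 2], K[1, 2]

    # Similar triangles: depth scales the physical length down to pixels.
    depth = fx * drone_length_m / apparent_length_px

    # Back-project the pixel center to a 3D point in the camera frame.
    u, v = bbox_center_px
    x = (u - cx) * depth / fx
    y = (v - cy) * depth / fy
    return np.array([x, y, depth])

# Hypothetical example: a 0.35 m drone spanning 80 px, centered at pixel (700, 300).
K = np.array([[1200.0, 0.0, 640.0],
              [0.0, 1200.0, 360.0],
              [0.0, 0.0, 1.0]])
point_cam = backproject_drone((700.0, 300.0), 80.0, 0.35, K)
print(point_cam)  # 3D position in the camera coordinate frame (meters)
```

Repeating this per frame and transforming the camera-frame points into a world frame (using the calibrated extrinsics) would yield a 3D trajectory in the spirit of the proposed framework.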
Related papers
- Chasing the Intruder: A Reinforcement Learning Approach for Tracking Intruder Drones [0.08192907805418582]
We propose a reinforcement learning based approach for identifying and tracking any intruder drone using a chaser drone.
Our proposed solution uses computer vision techniques interleaved with the policy learning framework of reinforcement learning.
The results show that the reinforcement learning based policy converges to identify and track the intruder drone.
arXiv Detail & Related papers (2023-09-10T16:31:40Z)
- Sound-based drone fault classification using multitask learning [7.726132010393797]
This paper proposes a sound-based deep neural network (DNN) fault classifier and drone sound dataset.
The dataset was constructed by collecting the operating sounds of drones from microphones mounted on three different drones in an anechoic chamber.
Using the acquired dataset, we train a classifier, 1DCNN-ResNet, that classifies the types of mechanical faults and their locations from short-time input waveforms.
arXiv Detail & Related papers (2023-04-23T17:55:40Z)
- Unauthorized Drone Detection: Experiments and Prototypes [0.8294692832460543]
We present a novel encryption-based drone detection scheme that uses a two-stage verification of the drone's received signal strength indicator (RSSI) and the encryption key generated from the drone's position coordinates.
arXiv Detail & Related papers (2022-12-02T20:43:29Z)
- TransVisDrone: Spatio-Temporal Transformer for Vision-based Drone-to-Drone Detection in Aerial Videos [57.92385818430939]
Drone-to-drone detection using visual feed has crucial applications, such as detecting drone collisions, detecting drone attacks, or coordinating flight with other drones.
Existing methods are computationally costly, follow non-end-to-end optimization, and have complex multi-stage pipelines, making them less suitable for real-time deployment on edge devices.
We propose a simple yet effective framework, TransVisDrone, that provides an end-to-end solution with higher computational efficiency.
arXiv Detail & Related papers (2022-10-16T03:05:13Z)
- A dataset for multi-sensor drone detection [67.75999072448555]
The use of small and remotely controlled unmanned aerial vehicles (UAVs) has increased in recent years.
Most studies on drone detection fail to specify the type of acquisition device, the drone type, the detection range, or the dataset.
We contribute with an annotated multi-sensor database for drone detection that includes infrared and visible videos and audio files.
arXiv Detail & Related papers (2021-11-02T20:52:03Z)
- EVPropNet: Detecting Drones By Finding Propellers For Mid-Air Landing And Following [11.79762223888294]
Drone propellers are the fastest moving parts of a drone and cannot be directly "seen" by a classical camera without severe motion blur.
We train a deep neural network called EVPropNet to detect propellers from the data of an event camera.
We present two applications of our network: (a) tracking and following an unmarked drone and (b) landing on a near-hover drone.
arXiv Detail & Related papers (2021-06-29T01:16:01Z)
- Dogfight: Detecting Drones from Drones Videos [58.158988162743825]
This paper attempts to address the problem of detecting drones from other flying drones.
The erratic movement of the source and target drones, their small size, arbitrary shape, large intensity variations, and occlusion make this problem quite challenging.
To handle this, instead of using region-proposal based methods, we propose to use a two-stage segmentation-based approach.
arXiv Detail & Related papers (2021-03-31T17:43:31Z)
- Relative Drone-Ground Vehicle Localization using LiDAR and Fisheye Cameras through Direct and Indirect Observations [0.0]
We present a LiDAR-camera-based relative pose estimation method between a drone and a ground vehicle.
We propose a dynamically adaptive kernel-based method for drone detection and tracking using the LiDAR.
In our experiments, we were able to achieve very fast initial detection and real-time tracking of the drone.
arXiv Detail & Related papers (2020-11-13T16:41:55Z)
- Multi-Drone based Single Object Tracking with Agent Sharing Network [74.8198920355117]
The Multi-Drone Single Object Tracking dataset consists of 92 groups of video clips with 113,918 high-resolution frames taken by two drones and 63 groups of video clips with 145,875 high-resolution frames taken by three drones.
An agent sharing network (ASNet) is proposed, using self-supervised template sharing and view-aware fusion of the target from multiple drones.
arXiv Detail & Related papers (2020-03-16T03:27:04Z)
- University-1652: A Multi-view Multi-source Benchmark for Drone-based Geo-localization [87.74121935246937]
We introduce a new multi-view benchmark for drone-based geo-localization, named University-1652.
University-1652 contains data from three platforms, i.e., synthetic drones, satellites and ground cameras of 1,652 university buildings around the world.
Experiments show that University-1652 helps the model to learn the viewpoint-invariant features and also has good generalization ability in the real-world scenario.
arXiv Detail & Related papers (2020-02-27T15:24:15Z)
- Detection and Tracking Meet Drones Challenge [131.31749447313197]
This paper presents a review of object detection and tracking datasets and benchmarks, and discusses the challenges of collecting large-scale drone-based object detection and tracking datasets with manual annotations.
We describe our VisDrone dataset, which is captured over various urban/suburban areas of 14 different cities across China from North to South.
We provide a detailed analysis of the current state of the field of large-scale object detection and tracking on drones, and conclude the challenge as well as propose future directions.
arXiv Detail & Related papers (2020-01-16T00:11:56Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences of its use.