Relative Drone-Ground Vehicle Localization using LiDAR and Fisheye
Cameras through Direct and Indirect Observations
- URL: http://arxiv.org/abs/2011.07008v3
- Date: Tue, 17 Nov 2020 10:37:49 GMT
- Title: Relative Drone-Ground Vehicle Localization using LiDAR and Fisheye
Cameras through Direct and Indirect Observations
- Authors: Jan Hausberg, Ryoichi Ishikawa, Menandro Roxas, Takeshi Oishi
- Abstract summary: We present a LiDAR-camera-based relative pose estimation method between a drone and a ground vehicle.
We propose a dynamically adaptive kernel-based method for drone detection and tracking using the LiDAR.
In our experiments, we were able to achieve very fast initial detection and real-time tracking of the drone.
- Score: 0.0
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Estimating the pose of an unmanned aerial vehicle (UAV) or drone is a
challenging task. It is useful for many applications such as navigation,
surveillance, tracking objects on the ground, and 3D reconstruction. In this
work, we present a LiDAR-camera-based relative pose estimation method between a
drone and a ground vehicle, using a LiDAR sensor and a fisheye camera on the
vehicle's roof and another fisheye camera mounted under the drone. The LiDAR
sensor directly observes the drone and measures its position, and the two
cameras estimate the relative orientation using indirect observation of the
surrounding objects. We propose a dynamically adaptive kernel-based method for
drone detection and tracking using the LiDAR. We detect vanishing points in
both cameras and find their correspondences to estimate the relative
orientation. Additionally, we propose a rotation correction technique by
relying on the observed motion of the drone through the LiDAR. In our
experiments, we were able to achieve very fast initial detection and real-time
tracking of the drone. Our method is fully automatic.
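As a concrete illustration of the indirect-observation step, the sketch below shows one standard way to recover a relative rotation from matched vanishing-point directions: back-project each vanishing point onto the unit sphere of its fisheye camera and solve Wahba's problem with an SVD (the Kabsch algorithm). This is a minimal sketch of that general technique, not the paper's implementation; the matched unit direction vectors (`v_vehicle`, `v_drone`) are assumed to be given by the vanishing-point detection and matching stages.

```python
import numpy as np

def rotation_from_vanishing_points(v_vehicle, v_drone):
    """Estimate the relative rotation R such that v_drone ~ R @ v_vehicle.

    v_vehicle, v_drone: (N, 3) arrays of matched unit direction vectors
    (vanishing points back-projected onto each fisheye camera's unit
    sphere). Solves Wahba's problem via SVD (Kabsch algorithm).
    """
    # Cross-covariance of the two matched direction sets.
    H = v_vehicle.T @ v_drone
    U, _, Vt = np.linalg.svd(H)
    # Guard against a reflection: force det(R) = +1.
    d = np.sign(np.linalg.det(Vt.T @ U.T))
    D = np.diag([1.0, 1.0, d])
    return Vt.T @ D @ U.T

# Toy usage: three orthogonal scene directions seen by both cameras.
rng = np.random.default_rng(0)
R_true, _ = np.linalg.qr(rng.normal(size=(3, 3)))
if np.linalg.det(R_true) < 0:        # keep it a proper rotation
    R_true[:, 0] *= -1
v_vehicle = np.eye(3)                # e.g. two street directions + vertical
v_drone = (R_true @ v_vehicle.T).T   # the same directions in the drone frame
R_est = rotation_from_vanishing_points(v_vehicle, v_drone)
assert np.allclose(R_est, R_true, atol=1e-9)
```

Combined with the drone position measured directly by the LiDAR, such a rotation estimate would complete the relative pose between the vehicle and the drone.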
Related papers
- 3D Trajectory Reconstruction of Drones using a Single Camera [0.5937476291232799]
We propose a novel framework for reconstructing 3D trajectories of drones using a single camera.
We automatically track the drones in 2D images using a drone tracker and estimate their 2D rotations.
To address the lack of public drone datasets, we also create synthetic 2D and 3D drone datasets.
arXiv Detail & Related papers (2023-09-06T07:39:51Z)
- A Vision-based Autonomous Perching Approach for Nano Aerial Vehicles [0.0]
A vision-based autonomous perching approach for nano quadcopters is proposed.
The drone can successfully perch on the center of markers within two centimeters of precision.
arXiv Detail & Related papers (2023-06-16T02:34:50Z)
- Vision-based Target Pose Estimation with Multiple Markers for the Perching of UAVs [0.0]
In this paper, a vision-based target pose estimation method using multiple markers is proposed.
A perching target with a small marker inside a larger one is designed to improve detection capability at both wide and close ranges.
The poses are then sent to the position controller to align the drone with the marker's center and steer it to perch on the target.
arXiv Detail & Related papers (2023-04-25T16:51:10Z)
- TransVisDrone: Spatio-Temporal Transformer for Vision-based Drone-to-Drone Detection in Aerial Videos [57.92385818430939]
Drone-to-drone detection using visual feed has crucial applications, such as detecting drone collisions, detecting drone attacks, or coordinating flight with other drones.
Existing methods are computationally costly, follow non-end-to-end optimization, and have complex multi-stage pipelines, making them less suitable for real-time deployment on edge devices.
We propose a simple yet effective framework, TransVisDrone, that provides an end-to-end solution with higher computational efficiency.
arXiv Detail & Related papers (2022-10-16T03:05:13Z)
- A dataset for multi-sensor drone detection [67.75999072448555]
The use of small and remotely controlled unmanned aerial vehicles (UAVs) has increased in recent years.
Most studies on drone detection fail to specify the type of acquisition device, the drone type, the detection range, or the dataset.
We contribute with an annotated multi-sensor database for drone detection that includes infrared and visible videos and audio files.
arXiv Detail & Related papers (2021-11-02T20:52:03Z)
- Dogfight: Detecting Drones from Drones Videos [58.158988162743825]
This paper addresses the problem of detecting drones from other flying drones.
The erratic movement of the source and target drones, small size, arbitrary shape, large intensity variations, and occlusion make this problem quite challenging.
To handle this, instead of using region-proposal based methods, we propose to use a two-stage segmentation-based approach.
arXiv Detail & Related papers (2021-03-31T17:43:31Z)
- Learn by Observation: Imitation Learning for Drone Patrolling from Videos of A Human Navigator [22.06785798356346]
We propose to let the drone learn patrolling in the air by observing and imitating how a human navigator does it on the ground.
The observation process enables the automatic collection and annotation of data using inter-frame geometric consistency.
A newly designed neural network is trained based on the annotated data to predict appropriate directions and translations.
arXiv Detail & Related papers (2020-08-30T15:20:40Z)
- Real-Time Drone Detection and Tracking With Visible, Thermal and Acoustic Sensors [66.4525391417921]
A thermal infrared camera is shown to be a feasible solution to the drone detection task.
The detector performance as a function of the sensor-to-target distance is also investigated.
A novel video dataset containing 650 annotated infrared and visible videos of drones, birds, airplanes and helicopters is also presented.
arXiv Detail & Related papers (2020-07-14T23:06:42Z)
- Multi-Drone based Single Object Tracking with Agent Sharing Network [74.8198920355117]
The Multi-Drone Single Object Tracking dataset consists of 92 groups of video clips with 113,918 high-resolution frames taken by two drones and 63 groups of video clips with 145,875 high-resolution frames taken by three drones.
An agent sharing network (ASNet) is proposed, with self-supervised template sharing and view-aware fusion of the target from multiple drones.
arXiv Detail & Related papers (2020-03-16T03:27:04Z)
- Detection and Tracking Meet Drones Challenge [131.31749447313197]
This paper presents a review of object detection and tracking datasets and benchmarks, and discusses the challenges of collecting large-scale drone-based object detection and tracking datasets with manual annotations.
We describe our VisDrone dataset, which is captured over various urban/suburban areas of 14 different cities across China from North to South.
We provide a detailed analysis of the current state of the field of large-scale object detection and tracking on drones, and conclude the challenge as well as propose future directions.
arXiv Detail & Related papers (2020-01-16T00:11:56Z)