Vision-based Target Pose Estimation with Multiple Markers for the
Perching of UAVs
- URL: http://arxiv.org/abs/2304.14838v1
- Date: Tue, 25 Apr 2023 16:51:10 GMT
- Title: Vision-based Target Pose Estimation with Multiple Markers for the
Perching of UAVs
- Authors: Truong-Dong Do, Nguyen Xuan-Mung and Sung-Kyung Hong
- Abstract summary: In this paper, a vision-based target pose estimation method using multiple markers is proposed.
A perching target with a small marker inside a larger one is designed to improve detection capability at wide and close ranges.
The poses are then sent to the position controller to align the drone with the marker's center and steer it to perch on the target.
- Score: 0.0
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Autonomous Nano Aerial Vehicles have become increasingly popular in
surveillance and monitoring operations due to their efficiency and
maneuverability. Once a target location has been reached, drones do not have to
remain active during the mission. It is possible for the vehicle to perch and
stop its motors in such situations to conserve energy, as well as maintain a
static position in unfavorable flying conditions. In the perching target
estimation phase, obtaining steady and accurate pose estimates from a visual
camera observing markers is a significant challenge: a large marker is
readily detectable from afar, but as the drone approaches it quickly falls
out of the camera's field of view. In this paper, a vision-based target pose
estimation method using
multiple markers is proposed to deal with the above-mentioned problems. First,
a perching target with a small marker inside a larger one is designed to
improve detection capability at wide and close ranges. Second, the relative
poses of the flying vehicle are calculated from detected markers using a
monocular camera. Next, a Kalman filter is applied to provide a more stable and
reliable pose estimation, especially when measurement data is missing for
unexpected reasons. Finally, we introduce an algorithm for merging the pose
data from multiple markers. The merged pose is then sent to the position
controller to align the drone with the marker's center and steer it to perch on
the target. The experimental results demonstrated the effectiveness and
feasibility of the adopted approach. The drone can perch successfully onto the
center of the markers with an attached 25 mm-diameter round magnet.
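
As a concrete illustration of the detection and pose-calculation steps, the
sketch below finds both ArUco markers of a nested target with OpenCV and
recovers each marker's pose with a perspective-n-point solve. The dictionary,
marker IDs, side lengths, and camera calibration are illustrative assumptions;
the abstract does not specify them.

```python
import cv2
import numpy as np

# Assumed nested-target layout: outer marker id 0 (12 cm), inner marker
# id 1 (3 cm) printed at its centre. IDs, sizes, dictionary, and camera
# calibration are illustrative, not taken from the paper.
MARKER_SIDE_M = {0: 0.12, 1: 0.03}
K = np.array([[600.0, 0.0, 320.0],
              [0.0, 600.0, 240.0],
              [0.0, 0.0, 1.0]])           # placeholder intrinsics
DIST = np.zeros(5)                        # assume negligible distortion

detector = cv2.aruco.ArucoDetector(       # OpenCV >= 4.7 API
    cv2.aruco.getPredefinedDictionary(cv2.aruco.DICT_4X4_50),
    cv2.aruco.DetectorParameters())

def marker_pose(img_pts, side):
    """Recover one marker's pose with a planar-square PnP solve."""
    h = side / 2.0
    obj = np.array([[-h, h, 0], [h, h, 0],
                    [h, -h, 0], [-h, -h, 0]], dtype=np.float32)
    ok, rvec, tvec = cv2.solvePnP(obj, img_pts, K, DIST,
                                  flags=cv2.SOLVEPNP_IPPE_SQUARE)
    return (rvec, tvec) if ok else None

def detect_poses(frame):
    """Return {marker_id: (rvec, tvec)} for every known marker seen."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    corners, ids, _ = detector.detectMarkers(gray)
    poses = {}
    if ids is not None:
        for pts, mid in zip(corners, ids.flatten()):
            side = MARKER_SIDE_M.get(int(mid))
            if side is not None:
                p = marker_pose(pts.reshape(4, 2).astype(np.float32), side)
                if p is not None:
                    poses[int(mid)] = p
    return poses
```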
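The abstract applies a Kalman filter so the estimate stays stable when marker
detections drop out. A minimal constant-velocity filter on the 3-D
translation, sketched below, captures that behaviour: it predicts every frame
and corrects only when a measurement arrives. The state model and noise
levels (dt, q, r) are illustrative assumptions.

```python
import numpy as np

class PoseKalman:
    """Constant-velocity Kalman filter on the 3-D translation.

    A minimal sketch of the filtering step described in the abstract;
    the process/measurement noise and frame rate are assumed values.
    """
    def __init__(self, dt=1/30, q=1e-3, r=1e-2):
        self.x = np.zeros(6)              # state: [px, py, pz, vx, vy, vz]
        self.P = np.eye(6)                # state covariance
        self.F = np.eye(6)
        self.F[:3, 3:] = dt * np.eye(3)   # position integrates velocity
        self.H = np.hstack([np.eye(3), np.zeros((3, 3))])  # observe position
        self.Q = q * np.eye(6)            # process noise
        self.R = r * np.eye(3)            # measurement noise

    def step(self, z=None):
        # Predict every frame so the estimate keeps evolving.
        self.x = self.F @ self.x
        self.P = self.F @ self.P @ self.F.T + self.Q
        # Correct only when a detection arrived; on dropouts the
        # prediction alone carries the pose forward.
        if z is not None:
            y = np.asarray(z, dtype=float) - self.H @ self.x
            S = self.H @ self.P @ self.H.T + self.R
            K = self.P @ self.H.T @ np.linalg.inv(S)
            self.x = self.x + K @ y
            self.P = (np.eye(6) - K @ self.H) @ self.P
        return self.x[:3]                 # smoothed translation estimate
```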
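The merging algorithm itself is not detailed in the abstract. Since the small
marker sits at the large marker's center, both estimate the same target
point, so one plausible stand-in is a distance-based hand-off between them;
the 0.5 m threshold below is an assumption, not a value from the paper.

```python
import numpy as np

def merged_translation(poses, near_m=0.5):
    """Fuse the two markers' estimates into one target translation.

    poses: {marker_id: (rvec, tvec)} as returned by detect_poses().
    Returns a length-3 camera-frame offset, or None if nothing was seen.
    """
    inner, outer = poses.get(1), poses.get(0)
    if inner is not None and np.linalg.norm(inner[1]) < near_m:
        return inner[1].ravel()   # up close, only the small marker stays in view
    if outer is not None:
        return outer[1].ravel()   # at range, the large marker detects reliably
    if inner is not None:
        return inner[1].ravel()
    return None

# Per-frame pipeline: detect, merge, filter; the smoothed offset is what a
# position controller would consume (the controller itself is not shown).
kf = PoseKalman()
# smoothed = kf.step(merged_translation(detect_poses(frame)))
```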
Related papers
- YoloTag: Vision-based Robust UAV Navigation with Fiducial Markers [2.7855886538423182]
We propose YoloTag, a real-time fiducial marker-based localization system.
YoloTag uses a lightweight YOLO v8 object detector to accurately detect fiducial markers in images.
The detected markers are then used by an efficient perspective-n-point algorithm to estimate UAV states.
arXiv Detail & Related papers (2024-09-03T23:42:19Z)
- Detection and tracking of MAVs using a LiDAR with rosette scanning pattern [2.062195473318468]
This work presents a method for the detection and tracking of MAVs using a novel, low-cost rosette scanning LiDAR on a pan-tilt turret.
The tracking makes it possible to keep the MAV in the center, maximizing the density of 3D points measured on the target by the LiDAR sensor.
arXiv Detail & Related papers (2024-08-16T06:40:20Z)
- A Vision-based Autonomous Perching Approach for Nano Aerial Vehicles [0.0]
A vision-based autonomous perching approach for nano quadcopters is proposed.
The drone can successfully perch on the center of markers within two centimeters of precision.
arXiv Detail & Related papers (2023-06-16T02:34:50Z)
- A dataset for multi-sensor drone detection [67.75999072448555]
The use of small and remotely controlled unmanned aerial vehicles (UAVs) has increased in recent years.
Most studies on drone detection fail to specify the type of acquisition device, the drone type, the detection range, or the dataset.
We contribute with an annotated multi-sensor database for drone detection that includes infrared and visible videos and audio files.
arXiv Detail & Related papers (2021-11-02T20:52:03Z)
- Towards Robust Monocular Visual Odometry for Flying Robots on Planetary Missions [49.79068659889639]
Ingenuity, which just landed on Mars, will mark the beginning of a new era of exploration unhindered by traversability constraints.
We present an advanced robust monocular odometry algorithm that uses efficient optical flow tracking.
We also present a novel approach to estimate the current risk of scale drift based on a principal component analysis of the relative translation information matrix.
arXiv Detail & Related papers (2021-09-12T12:52:20Z)
- Dogfight: Detecting Drones from Drones Videos [58.158988162743825]
This paper attempts to address the problem of detecting drones from other flying drones.
The erratic movement of the source and target drones, small size, arbitrary shape, large intensity variations, and occlusion make this problem quite challenging.
To handle this, instead of using region-proposal based methods, we propose to use a two-stage segmentation-based approach.
arXiv Detail & Related papers (2021-03-31T17:43:31Z)
- Detecting Invisible People [58.49425715635312]
We re-purpose tracking benchmarks and propose new metrics for the task of detecting invisible objects.
We demonstrate that current detection and tracking systems perform dramatically worse on this task.
We then build dynamic models that explicitly reason in 3D, making use of observations produced by state-of-the-art monocular depth estimation networks.
arXiv Detail & Related papers (2020-12-15T16:54:45Z)
- Relative Drone-Ground Vehicle Localization using LiDAR and Fisheye Cameras through Direct and Indirect Observations [0.0]
We present a LiDAR-camera-based relative pose estimation method between a drone and a ground vehicle.
We propose a dynamically adaptive kernel-based method for drone detection and tracking using the LiDAR.
In our experiments, we were able to achieve very fast initial detection and real-time tracking of the drone.
arXiv Detail & Related papers (2020-11-13T16:41:55Z)
- Perceiving Traffic from Aerial Images [86.994032967469]
We propose an object detection method called Butterfly Detector that is tailored to detect objects in aerial images.
We evaluate our Butterfly Detector on two publicly available UAV datasets (UAVDT and VisDrone 2019) and show that it outperforms previous state-of-the-art methods while remaining real-time.
arXiv Detail & Related papers (2020-09-16T11:37:43Z)
- Multi-Drone based Single Object Tracking with Agent Sharing Network [74.8198920355117]
The Multi-Drone Single Object Tracking dataset consists of 92 groups of video clips with 113,918 high-resolution frames taken by two drones and 63 groups of video clips with 145,875 high-resolution frames taken by three drones.
An agent sharing network (ASNet) is proposed, featuring self-supervised template sharing and view-aware fusion of the target from multiple drones.
arXiv Detail & Related papers (2020-03-16T03:27:04Z)