A Vision-based Autonomous Perching Approach for Nano Aerial Vehicles
- URL: http://arxiv.org/abs/2306.09591v1
- Date: Fri, 16 Jun 2023 02:34:50 GMT
- Title: A Vision-based Autonomous Perching Approach for Nano Aerial Vehicles
- Authors: Truong-Dong Do, Sung Kyung Hong
- Abstract summary: A vision-based autonomous perching approach for nano quadcopters is proposed.
The drone can successfully perch on the center of markers within two centimeters of precision.
- Score: 0.0
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Over the past decades, quadcopters have been investigated, due to their
mobility and flexibility to operate in a wide range of environments. They have
been used in various areas, including surveillance and monitoring. During a
mission, drones do not have to remain active once they have reached a target
location. To conserve energy and maintain a static position, it is possible to
perch and stop the motors in such situations. Achieving a reliable and highly
accurate perching method, however, remains an open challenge.
In this paper, a vision-based autonomous perching approach for nano quadcopters
onto a predefined perching target on horizontal surfaces is proposed. First, a
perching target with a small marker inside a larger one is designed to improve
detection capability at a variety of ranges. Second, a monocular camera is used
to calculate the relative poses of the flying vehicle from the markers
detected. Then, a Kalman filter is applied to determine the pose more reliably,
especially when measurement data is missing. Next, we introduce an algorithm
for merging the pose data from multiple markers. Finally, the poses are sent to
the perching planner to conduct the real flight test to align the drone with
the target's center and steer it there. Based on the experimental results, the
approach proved to be effective and feasible. The drone can successfully perch
on the center of markers within two centimeters of precision.
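The abstract's filtering step can be sketched minimally. The snippet below shows only the Kalman-filter stage for one position axis under an assumed constant-velocity model; the class name `PoseKalman1D`, the time step, and the noise values are illustrative stand-ins, not the authors' actual parameters. The behavior matching the abstract is that the predict step still runs when a marker measurement is missing, so the pose estimate stays available during detection dropouts.

```python
import numpy as np

class PoseKalman1D:
    """Track position and velocity along one axis; tolerate dropouts."""

    def __init__(self, dt=0.02, q=1e-3, r=4e-4):
        self.x = np.zeros(2)                         # state: [position, velocity]
        self.P = np.eye(2)                           # state covariance
        self.F = np.array([[1.0, dt], [0.0, 1.0]])   # constant-velocity model
        self.Q = q * np.eye(2)                       # process noise
        self.H = np.array([[1.0, 0.0]])              # only position is measured
        self.R = np.array([[r]])                     # measurement noise

    def step(self, z=None):
        # Predict: always runs, so the estimate keeps evolving even
        # when the marker is not detected (z is None).
        self.x = self.F @ self.x
        self.P = self.F @ self.P @ self.F.T + self.Q
        if z is not None:
            # Update: fuse the new camera-derived position measurement.
            y = np.array([z]) - self.H @ self.x          # innovation
            S = self.H @ self.P @ self.H.T + self.R      # innovation covariance
            K = self.P @ self.H.T @ np.linalg.inv(S)     # Kalman gain
            self.x = self.x + K @ y
            self.P = (np.eye(2) - K @ self.H) @ self.P
        return self.x[0]
```

In use, one such filter per axis would be fed the marker-relative position each frame, with `None` passed on frames where no marker is detected.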
Related papers
- Detection and tracking of MAVs using a LiDAR with rosette scanning pattern [2.062195473318468]
This work presents a method for the detection and tracking of MAVs using a novel, low-cost rosette scanning LiDAR on a pan-tilt turret.
The tracking makes it possible to keep the MAV in the center, maximizing the density of 3D points measured on the target by the LiDAR sensor.
arXiv Detail & Related papers (2024-08-16T06:40:20Z) - Vision-based Target Pose Estimation with Multiple Markers for the
Perching of UAVs [0.0]
In this paper, a vision-based target pose estimation method using multiple markers is proposed.
A perching target with a small marker inside a larger one is designed to improve detection capability at wide and close ranges.
The poses are then sent to the position controller to align the drone and the marker's center and steer it to perch on the target.
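The multi-marker fusion this paper and the one above describe can be sketched as a weighted average of per-marker pose estimates. The weighting scheme here is a hypothetical stand-in (e.g., inverse reprojection error per marker); the summaries do not state the actual merging rule.

```python
import numpy as np

def merge_poses(poses, weights):
    """Fuse relative-pose estimates from several detected markers.

    poses   -- list of (x, y, z) drone-to-target translations, one per marker
    weights -- confidence per marker (hypothetical choice: inverse
               reprojection error); a higher weight means more trust
    """
    poses = np.asarray(poses, dtype=float)
    w = np.asarray(weights, dtype=float)
    w = w / w.sum()                        # normalize to a convex combination
    return (w[:, None] * poses).sum(axis=0)
```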
arXiv Detail & Related papers (2023-04-25T16:51:10Z) - TransVisDrone: Spatio-Temporal Transformer for Vision-based
Drone-to-Drone Detection in Aerial Videos [57.92385818430939]
Drone-to-drone detection using visual feed has crucial applications, such as detecting drone collisions, detecting drone attacks, or coordinating flight with other drones.
Existing methods are computationally costly, follow non-end-to-end optimization, and have complex multi-stage pipelines, making them less suitable for real-time deployment on edge devices.
We propose a simple yet effective framework, TransVisDrone, that provides an end-to-end solution with higher computational efficiency.
arXiv Detail & Related papers (2022-10-16T03:05:13Z) - A dataset for multi-sensor drone detection [67.75999072448555]
The use of small and remotely controlled unmanned aerial vehicles (UAVs) has increased in recent years.
Most studies on drone detection fail to specify the type of acquisition device, the drone type, the detection range, or the dataset.
We contribute with an annotated multi-sensor database for drone detection that includes infrared and visible videos and audio files.
arXiv Detail & Related papers (2021-11-02T20:52:03Z) - Towards Robust Monocular Visual Odometry for Flying Robots on Planetary
Missions [49.79068659889639]
Ingenuity, which just landed on Mars, will mark the beginning of a new era of exploration unhindered by traversability.
We present an advanced robust monocular odometry algorithm that uses efficient optical flow tracking.
We also present a novel approach to estimate the current risk of scale drift based on a principal component analysis of the relative translation information matrix.
arXiv Detail & Related papers (2021-09-12T12:52:20Z) - Dogfight: Detecting Drones from Drones Videos [58.158988162743825]
This paper attempts to address the problem of detecting drones from videos captured by other flying drones.
The erratic movement of the source and target drones, small size, arbitrary shape, large intensity variations, and occlusion make this problem quite challenging.
To handle this, instead of using region-proposal based methods, we propose to use a two-stage segmentation-based approach.
arXiv Detail & Related papers (2021-03-31T17:43:31Z) - Relative Drone-Ground Vehicle Localization using LiDAR and Fisheye
Cameras through Direct and Indirect Observations [0.0]
We present a LiDAR-camera-based relative pose estimation method between a drone and a ground vehicle.
We propose a dynamically adaptive kernel-based method for drone detection and tracking using the LiDAR.
In our experiments, we were able to achieve very fast initial detection and real-time tracking of the drone.
arXiv Detail & Related papers (2020-11-13T16:41:55Z) - Perceiving Traffic from Aerial Images [86.994032967469]
We propose an object detection method called Butterfly Detector that is tailored to detect objects in aerial images.
We evaluate our Butterfly Detector on two publicly available UAV datasets (UAVDT and VisDrone 2019) and show that it outperforms previous state-of-the-art methods while remaining real-time.
arXiv Detail & Related papers (2020-09-16T11:37:43Z) - Distributed Variable-Baseline Stereo SLAM from two UAVs [17.513645771137178]
In this article, we employ two UAVs equipped with one monocular camera and one IMU each, to exploit their view overlap and relative distance measurements.
In order to control the UAV agents autonomously, we propose a decentralized collaborative estimation scheme.
We demonstrate the effectiveness of the approach at high altitude flights of up to 160m, going significantly beyond the capabilities of state-of-the-art VIO methods.
arXiv Detail & Related papers (2020-09-10T12:16:10Z) - Detection and Tracking Meet Drones Challenge [131.31749447313197]
This paper presents a review of object detection and tracking datasets and benchmarks, and discusses the challenges of collecting large-scale drone-based object detection and tracking datasets with manual annotations.
We describe our VisDrone dataset, which is captured over various urban/suburban areas of 14 different cities across China from North to South.
We provide a detailed analysis of the current state of the field of large-scale object detection and tracking on drones, and conclude the challenge as well as propose future directions.
arXiv Detail & Related papers (2020-01-16T00:11:56Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.