Robust Autonomous Landing of UAV in Non-Cooperative Environments based
on Dynamic Time Camera-LiDAR Fusion
- URL: http://arxiv.org/abs/2011.13761v1
- Date: Fri, 27 Nov 2020 14:47:02 GMT
- Title: Robust Autonomous Landing of UAV in Non-Cooperative Environments based
on Dynamic Time Camera-LiDAR Fusion
- Authors: Lyujie Chen, Xiaming Yuan, Yao Xiao, Yiding Zhang and Jihong Zhu
- Abstract summary: We construct a UAV system equipped with low-cost LiDAR and binocular cameras to realize autonomous landing in non-cooperative environments.
Exploiting the non-repetitive scanning and high FOV coverage of LiDAR, we propose a dynamic-time depth completion algorithm.
Based on the depth map, high-level terrain information such as slope, roughness, and the size of the safe area is derived.
- Score: 11.407952542799526
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Selecting safe landing sites in non-cooperative environments is a key step
towards the full autonomy of UAVs. However, the existing methods have the
common problems of poor generalization and robustness: their performance
degrades significantly in unknown environments, and errors cannot be
self-detected or corrected. In this paper, we construct a UAV system
equipped with low-cost LiDAR and binocular cameras to realize autonomous
landing in non-cooperative environments by detecting the flat and safe ground
area. Taking advantage of the non-repetitive scanning and high FOV coverage
characteristics of LiDAR, we propose a dynamic-time depth completion
algorithm. In conjunction with the proposed self-evaluation method of the depth
map, our model can dynamically select the LiDAR accumulation time at the
inference phase to ensure an accurate prediction result. Based on the depth
map, high-level terrain information such as slope, roughness, and the size
of the safe area is derived. We have conducted extensive autonomous landing
experiments in a variety of familiar or completely unknown environments,
verifying that our model can adaptively balance accuracy and speed and that the
UAV can robustly select a safe landing site.
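The terrain-assessment step described above (deriving slope, roughness, and a safety decision from a depth map) can be illustrated with a minimal sketch. This is not the authors' implementation: the least-squares plane fit, the slope and roughness thresholds, and the constant `cell_size` grid spacing are all assumptions made here for illustration (a real system would back-project pixels with the camera intrinsics).

```python
import numpy as np

def assess_patch(depth_patch, cell_size=0.05,
                 max_slope_deg=15.0, max_roughness_m=0.05):
    """Fit a plane to a depth-map patch and derive slope and roughness.

    depth_patch : HxW array of metric depths (meters).
    cell_size   : assumed constant spacing between adjacent pixels (m).
    Returns (slope in degrees, RMS roughness in meters, is_safe flag).
    """
    h, w = depth_patch.shape
    ys, xs = np.mgrid[0:h, 0:w]
    # Design matrix for the plane z = a*x + b*y + c
    X = np.column_stack([
        xs.ravel() * cell_size,
        ys.ravel() * cell_size,
        np.ones(h * w),
    ])
    z = depth_patch.ravel()
    coeffs, *_ = np.linalg.lstsq(X, z, rcond=None)
    a, b, _ = coeffs
    # Slope: angle of the fitted plane relative to the sensor plane
    slope_deg = np.degrees(np.arctan(np.hypot(a, b)))
    # Roughness: RMS deviation of the depths from the fitted plane
    roughness = np.sqrt(np.mean((z - X @ coeffs) ** 2))
    is_safe = slope_deg < max_slope_deg and roughness < max_roughness_m
    return slope_deg, roughness, is_safe
```

A flat patch yields near-zero slope and roughness and is flagged safe; a steep or cluttered patch fails one of the two thresholds, which is the kind of per-site screening the paper's landing-site selection relies on.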
Related papers
- An Efficient Approach to Generate Safe Drivable Space by LiDAR-Camera-HDmap Fusion [13.451123257796972]
We propose an accurate and robust perception module for Autonomous Vehicles (AVs) for drivable space extraction.
Our work introduces a robust easy-to-generalize perception module that leverages LiDAR, camera, and HD map data fusion.
Our approach is tested on a real dataset, and its reliability has been verified during daily operation (including in harsh snowy weather) of our autonomous shuttle, WATonoBus.
arXiv Detail & Related papers (2024-10-29T17:54:02Z) - Long-Range Vision-Based UAV-assisted Localization for Unmanned Surface Vehicles [7.384309568198598]
Global positioning system (GPS) has become an indispensable navigation method for field operations with unmanned surface vehicles (USVs) in marine environments.
GPS may not always be available outdoors because it is vulnerable to natural interference and malicious jamming attacks.
We present a novel method that utilizes an Unmanned Aerial Vehicle (UAV) to assist in localizing USVs in restricted marine environments.
arXiv Detail & Related papers (2024-08-21T08:37:37Z) - UFO: Uncertainty-aware LiDAR-image Fusion for Off-road Semantic Terrain
Map Estimation [2.048226951354646]
This paper presents a learning-based fusion method for generating dense terrain classification maps in BEV.
Our approach enhances the accuracy of semantic maps generated from an RGB image and a single-sweep LiDAR scan.
arXiv Detail & Related papers (2024-03-05T04:20:03Z) - Angle Robustness Unmanned Aerial Vehicle Navigation in GNSS-Denied
Scenarios [66.05091704671503]
We present a novel angle navigation paradigm to deal with flight deviation in point-to-point navigation tasks.
We also propose a model that includes the Adaptive Feature Enhance Module, Cross-knowledge Attention-guided Module and Robust Task-oriented Head Module.
arXiv Detail & Related papers (2024-02-04T08:41:20Z) - Visual Environment Assessment for Safe Autonomous Quadrotor Landing [8.538463567092297]
We present a novel approach for detection and assessment of potential landing sites for safe quadrotor landing.
Our solution efficiently integrates 2D and 3D environmental information, eliminating the need for external aids such as GPS.
Our approach runs in real-time on quadrotors equipped with limited computational capabilities.
arXiv Detail & Related papers (2023-11-16T18:02:10Z) - VAPOR: Legged Robot Navigation in Outdoor Vegetation Using Offline
Reinforcement Learning [53.13393315664145]
We present VAPOR, a novel method for autonomous legged robot navigation in unstructured, densely vegetated outdoor environments.
Our method trains a novel RL policy using an actor-critic network and arbitrary data collected in real outdoor vegetation.
We observe that VAPOR's actions improve success rates by up to 40%, decrease the average current consumption by up to 2.9%, and decrease the normalized trajectory length by up to 11.2%.
arXiv Detail & Related papers (2023-09-14T16:21:27Z) - Efficient Real-time Smoke Filtration with 3D LiDAR for Search and Rescue
with Autonomous Heterogeneous Robotic Systems [56.838297900091426]
Smoke and dust degrade the performance of any mobile robotic platform that relies on onboard perception systems.
This paper proposes a novel modular computation filtration pipeline based on intensity and spatial information.
arXiv Detail & Related papers (2023-08-14T16:48:57Z) - Large-scale Autonomous Flight with Real-time Semantic SLAM under Dense
Forest Canopy [48.51396198176273]
We propose an integrated system that can perform large-scale autonomous flights and real-time semantic mapping in challenging under-canopy environments.
We detect and model tree trunks and ground planes from LiDAR data, which are associated across scans and used to constrain robot poses as well as tree trunk models.
A drift-compensation mechanism is designed to minimize the odometry drift using semantic SLAM outputs in real time, while maintaining planner optimality and controller stability.
arXiv Detail & Related papers (2021-09-14T07:24:53Z) - A Multi-UAV System for Exploration and Target Finding in Cluttered and
GPS-Denied Environments [68.31522961125589]
We propose a framework for a team of UAVs to cooperatively explore and find a target in complex GPS-denied environments with obstacles.
The team of UAVs autonomously navigates, explores, detects, and finds the target in a cluttered environment with a known map.
Results indicate that the proposed multi-UAV system has improvements in terms of time-cost, the proportion of search area surveyed, as well as successful rates for search and rescue missions.
arXiv Detail & Related papers (2021-07-19T12:54:04Z) - Towards robust sensing for Autonomous Vehicles: An adversarial
perspective [82.83630604517249]
It is of primary importance that the resulting decisions are robust to perturbations.
Adversarial perturbations are purposefully crafted alterations of the environment or of the sensory measurements.
A careful evaluation of the vulnerabilities of their sensing system(s) is necessary in order to build and deploy safer systems.
arXiv Detail & Related papers (2020-07-14T05:25:15Z) - Reinforcement Learning for UAV Autonomous Navigation, Mapping and Target
Detection [36.79380276028116]
We study a joint detection, mapping and navigation problem for a single unmanned aerial vehicle (UAV) equipped with a low complexity radar and flying in an unknown environment.
The goal is to optimize its trajectory with the purpose of maximizing the mapping accuracy and to avoid areas where measurements might not be sufficiently informative from the perspective of a target detection.
arXiv Detail & Related papers (2020-05-05T20:39:18Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed content (including all information) and is not responsible for any consequences.