Angle Robustness Unmanned Aerial Vehicle Navigation in GNSS-Denied
Scenarios
- URL: http://arxiv.org/abs/2402.02405v1
- Date: Sun, 4 Feb 2024 08:41:20 GMT
- Title: Angle Robustness Unmanned Aerial Vehicle Navigation in GNSS-Denied
Scenarios
- Authors: Yuxin Wang, Zunlei Feng, Haofei Zhang, Yang Gao, Jie Lei, Li Sun,
Mingli Song
- Abstract summary: We present a novel angle navigation paradigm to deal with flight deviation in point-to-point navigation tasks.
We also propose a model that includes the Adaptive Feature Enhance Module, Cross-knowledge Attention-guided Module and Robust Task-oriented Head Module.
- Score: 66.05091704671503
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Due to the inability to receive signals from the Global Navigation Satellite
System (GNSS) in extreme conditions, achieving accurate and robust navigation
for Unmanned Aerial Vehicles (UAVs) is a challenging task. Vision-based
navigation has recently emerged as a promising and feasible alternative to
GNSS-based navigation. However, existing vision-based techniques are inadequate
in addressing flight deviation caused by environmental disturbances and
inaccurate position predictions in practical settings. In this paper, we
present a novel angle robustness navigation paradigm to deal with flight
deviation in point-to-point navigation tasks. Additionally, we propose a model
that includes the Adaptive Feature Enhance Module, Cross-knowledge
Attention-guided Module and Robust Task-oriented Head Module to accurately
predict direction angles for high-precision navigation. To evaluate the
vision-based navigation methods, we collect a new dataset termed UAV_AR368.
Furthermore, we design the Simulation Flight Testing Instrument (SFTI) using
Google Earth to simulate different flight environments, thereby reducing the
expenses associated with real flight testing. Experimental results demonstrate
that the proposed model outperforms the state-of-the-art by achieving
improvements of 26.0% and 45.6% in the success rate of arrival under ideal and
disturbed circumstances, respectively.
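The paradigm steers by a predicted direction angle rather than a predicted position. As a minimal sketch of that idea (not the paper's actual controller; the function name and turn limit are hypothetical), a predicted direction angle can be converted into a bounded heading correction so that a single noisy prediction cannot swing the UAV off course:

```python
import math

def heading_correction(predicted_angle_deg, current_heading_deg, max_turn_deg=15.0):
    """Turn toward the model's predicted direction angle, clamping the turn rate."""
    # Signed angular error in (-180, 180] degrees.
    error = (predicted_angle_deg - current_heading_deg + 180.0) % 360.0 - 180.0
    turn = max(-max_turn_deg, min(max_turn_deg, error))
    return current_heading_deg + turn

# Example: the model predicts the destination bears 40 deg; the UAV heads 10 deg.
new_heading = heading_correction(40.0, 10.0)  # clamped turn of 15 deg -> 25.0
```

Clamping per step is one simple way to obtain the "angle robustness" behavior the abstract describes: transient mispredictions only perturb the heading by a bounded amount.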
Related papers
- MPVO: Motion-Prior based Visual Odometry for PointGoal Navigation [3.9974562667271507]
Visual odometry (VO) is essential for enabling accurate point-goal navigation of embodied agents in indoor environments.
Recent deep-learned VO methods show robust performance but suffer from sample inefficiency during training.
We propose a robust and sample-efficient VO pipeline based on motion priors available while an agent is navigating an environment.
arXiv Detail & Related papers (2024-11-07T15:36:49Z)
- Long-distance Geomagnetic Navigation in GNSS-denied Environments with Deep Reinforcement Learning [62.186340267690824]
Existing studies on geomagnetic navigation rely on pre-stored map or extensive searches, leading to limited applicability or reduced navigation efficiency in unexplored areas.
This paper develops a deep reinforcement learning (DRL)-based mechanism, especially for long-distance geomagnetic navigation.
The designed mechanism trains an agent to learn and gain the magnetoreception capacity for geomagnetic navigation, rather than using any pre-stored map or extensive and expensive searching approaches.
arXiv Detail & Related papers (2024-10-21T09:57:42Z)
- A Bionic Data-driven Approach for Long-distance Underwater Navigation with Anomaly Resistance [59.21686775951903]
Various animals exhibit accurate navigation using environmental cues.
Inspired by animal navigation, this work proposes a bionic and data-driven approach for long-distance underwater navigation.
The proposed approach uses measured geomagnetic data for the navigation, and requires no GPS systems or geographical maps.
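Navigating by measured geomagnetic data without a map or GPS can be illustrated with a toy sketch (not the paper's method; the function names and the probing strategy are assumptions): the agent knows the geomagnetic signature of its destination and repeatedly picks the heading that most reduces the mismatch between the locally measured field and that target signature.

```python
import math

def field_mismatch(measured, target):
    """Squared error between the measured and destination geomagnetic components."""
    return sum((m - t) ** 2 for m, t in zip(measured, target))

def choose_heading(position, target_field, read_field, candidates=8, step=1.0):
    """Probe evenly spaced headings and pick the one that most reduces mismatch."""
    best_heading, best_cost = None, float("inf")
    for k in range(candidates):
        theta = 2 * math.pi * k / candidates
        probe = (position[0] + step * math.cos(theta),
                 position[1] + step * math.sin(theta))
        cost = field_mismatch(read_field(probe), target_field)
        if cost < best_cost:
            best_heading, best_cost = theta, cost
    return best_heading
```

With a toy field that varies smoothly in space, this greedy probing walks toward the location whose field matches the target, which is the map-free intuition behind such approaches.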
arXiv Detail & Related papers (2024-02-06T13:20:56Z)
- Vision-Based Autonomous Navigation for Unmanned Surface Vessel in Extreme Marine Conditions [2.8983738640808645]
This paper presents an autonomous vision-based navigation framework for tracking target objects in extreme marine conditions.
The proposed framework has been thoroughly tested in simulation under extremely reduced visibility due to sandstorms and fog.
The results are compared with state-of-the-art de-hazing methods across the benchmarked MBZIRC simulation dataset.
arXiv Detail & Related papers (2023-08-08T14:25:13Z)
- ETPNav: Evolving Topological Planning for Vision-Language Navigation in Continuous Environments [56.194988818341976]
Vision-language navigation is a task that requires an agent to follow instructions to navigate in environments.
We propose ETPNav, which focuses on two critical skills: 1) the capability to abstract environments and generate long-range navigation plans, and 2) the ability of obstacle-avoiding control in continuous environments.
ETPNav yields more than 10% and 20% improvements over prior state-of-the-art on R2R-CE and RxR-CE datasets.
arXiv Detail & Related papers (2023-04-06T13:07:17Z)
- Unsupervised Visual Odometry and Action Integration for PointGoal Navigation in Indoor Environment [14.363948775085534]
PointGoal navigation in indoor environment is a fundamental task for personal robots to navigate to a specified point.
To improve the PointGoal navigation accuracy without a GPS signal, we use visual odometry (VO) and propose a novel action integration module (AIM) trained in an unsupervised manner.
Experiments show that the proposed system achieves satisfactory results and outperforms the partially supervised learning algorithms on the popular Gibson dataset.
arXiv Detail & Related papers (2022-10-02T03:12:03Z)
- The Surprising Effectiveness of Visual Odometry Techniques for Embodied PointGoal Navigation [100.08270721713149]
PointGoal navigation has been introduced in simulated Embodied AI environments.
Recent advances solve this PointGoal navigation task with near-perfect accuracy (99.6% success).
We show that integrating visual odometry techniques into navigation policies improves the state-of-the-art on the popular Habitat PointNav benchmark by a large margin.
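Integrating VO into a navigation policy amounts to dead reckoning: each egocentric motion estimate is composed onto the running global pose, replacing the GPS signal. A minimal sketch (the function name and tuple layout are illustrative, not from the paper):

```python
import math

def integrate_vo(pose, delta):
    """Compose an egocentric VO estimate (dx, dy, dtheta) onto a global pose (x, y, theta)."""
    x, y, theta = pose
    dx, dy, dtheta = delta
    # Rotate the egocentric translation into the global frame, then advance heading.
    gx = x + dx * math.cos(theta) - dy * math.sin(theta)
    gy = y + dx * math.sin(theta) + dy * math.cos(theta)
    return (gx, gy, theta + dtheta)

# Chaining per-step VO estimates yields the GPS-free position fed to the policy.
pose = (0.0, 0.0, 0.0)
for delta in [(1.0, 0.0, math.pi / 2), (1.0, 0.0, 0.0)]:
    pose = integrate_vo(pose, delta)
# pose is approximately (1.0, 1.0, pi/2)
```

Because errors accumulate multiplicatively through the chain, the accuracy of each per-step VO estimate directly bounds how well such a policy can localize, which is why better VO translates into the benchmark gains reported above.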
arXiv Detail & Related papers (2021-08-26T02:12:49Z)
- Occupancy Anticipation for Efficient Exploration and Navigation [97.17517060585875]
We propose occupancy anticipation, where the agent uses its egocentric RGB-D observations to infer the occupancy state beyond the visible regions.
By exploiting context in both the egocentric views and top-down maps, our model successfully anticipates a broader map of the environment.
Our approach is the winning entry in the 2020 Habitat PointNav Challenge.
arXiv Detail & Related papers (2020-08-21T03:16:51Z)
- Virtual Testbed for Monocular Visual Navigation of Small Unmanned Aircraft Systems [0.0]
This work presents a virtual testbed for conducting simulated flight tests over real-world terrain.
It analyzes the real-time performance of visual navigation algorithms at 31 Hz.
This tool was created to find a visual odometry algorithm appropriate for further GPS-denied navigation research on fixed-wing aircraft.
arXiv Detail & Related papers (2020-07-01T20:35:26Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.