Autonomous Systems: Indoor Drone Navigation
- URL: http://arxiv.org/abs/2304.08893v1
- Date: Tue, 18 Apr 2023 10:40:00 GMT
- Title: Autonomous Systems: Indoor Drone Navigation
- Authors: Aswin Iyer, Santosh Narayan, Naren M, Manoj kumar Rajagopal
- Abstract summary: The system creates a simulated quadcopter capable of travelling autonomously in an indoor environment.
The goal is to use the SLAM Toolbox for ROS and the Nav2 navigation framework to construct a simulated drone.
- Score: 0.0
- License: http://creativecommons.org/licenses/by-nc-sa/4.0/
- Abstract: Drones are a promising technology for autonomous data collection and indoor
sensing. In situations where human-controlled UAVs may not be practical or
dependable, such as in uncharted or dangerous locations, autonomous UAVs offer
flexibility, cost savings, and reduced risk. The system creates a simulated
quadcopter capable of travelling autonomously in an indoor environment using
the Gazebo simulation tool and the ROS navigation framework known as
Navigation2 (Nav2). While Nav2 has successfully demonstrated autonomous
navigation in terrestrial robots and vehicles, the same has not yet been
accomplished with unmanned aerial vehicles. The goal is to use the SLAM
Toolbox for ROS and the Nav2 navigation framework to construct a simulated
drone that can move autonomously in an indoor (GPS-less) environment.
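In a GPS-less indoor setting, the core of the mapping step is fusing range readings into an occupancy grid. The sketch below shows the standard log-odds cell update that grid-based SLAM front ends such as SLAM Toolbox build on; the sensor-model probabilities and the single-cell view are illustrative assumptions, not slam_toolbox's actual implementation.

```python
import math

# Minimal log-odds occupancy-grid update, a simplified sketch of the
# mapping step a grid-based SLAM front end performs. A real system also
# does scan matching and pose-graph optimization; the sensor-model
# probabilities below (0.7 / 0.3) are illustrative assumptions.

L_OCC = math.log(0.7 / 0.3)    # log-odds increment for an "occupied" reading
L_FREE = math.log(0.3 / 0.7)   # log-odds decrement for a "free" reading

def update_cell(logodds, hit):
    """Fuse one beam reading into a cell's log-odds value."""
    return logodds + (L_OCC if hit else L_FREE)

def probability(logodds):
    """Convert log-odds back to an occupancy probability."""
    return 1.0 - 1.0 / (1.0 + math.exp(logodds))

# A cell repeatedly observed as occupied converges toward probability 1.
l = 0.0
for _ in range(5):
    l = update_cell(l, hit=True)
print(round(probability(l), 3))
```

The log-odds form makes each update a simple addition, which is why it is the standard representation for grid mapping.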
Related papers
- Indoor Localization for Autonomous Robot Navigation [0.0]
This paper explores using indoor positioning systems (IPSs) for the indoor navigation of an autonomous robot.
We developed an A* path-planning algorithm so that our robot could navigate itself using predicted directions.
After testing different network structures, our robot was able to successfully navigate corners around 50 percent of the time.
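A grid-based A* planner of the kind this paper describes can be sketched in a few lines. The grid, 4-connectivity, and Manhattan heuristic below are illustrative assumptions, not the authors' implementation (their network-predicted directions are not modeled here).

```python
import heapq

# Minimal A* path planner on a 2-D occupancy grid (0 = free, 1 = wall).
# 4-connected moves with unit cost and a Manhattan-distance heuristic;
# these choices are illustrative assumptions.

def astar(grid, start, goal):
    """Return a list of cells from start to goal, or None if unreachable."""
    rows, cols = len(grid), len(grid[0])
    h = lambda c: abs(c[0] - goal[0]) + abs(c[1] - goal[1])  # admissible heuristic
    frontier = [(h(start), 0, start, None)]   # (f, g, cell, parent)
    came_from, cost = {}, {start: 0}
    while frontier:
        _, g, cur, parent = heapq.heappop(frontier)
        if cur in came_from:
            continue                           # already expanded with a better g
        came_from[cur] = parent
        if cur == goal:                        # reconstruct by walking parents
            path = []
            while cur is not None:
                path.append(cur)
                cur = came_from[cur]
            return path[::-1]
        r, c = cur
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if 0 <= nr < rows and 0 <= nc < cols and grid[nr][nc] == 0:
                ng = g + 1
                if ng < cost.get((nr, nc), float("inf")):
                    cost[(nr, nc)] = ng
                    heapq.heappush(frontier, (ng + h((nr, nc)), ng, (nr, nc), cur))
    return None

grid = [[0, 0, 0],
        [1, 1, 0],
        [0, 0, 0]]
print(astar(grid, (0, 0), (2, 0)))  # routes around the wall via the right column
```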
arXiv Detail & Related papers (2025-02-28T05:25:04Z)
- Enhancing Autonomous Navigation by Imaging Hidden Objects using Single-Photon LiDAR [12.183773707869069]
We present a novel approach that leverages Non-Line-of-Sight (NLOS) sensing using single-photon LiDAR to improve visibility and enhance autonomous navigation.
Our method enables mobile robots to "see around corners" by utilizing multi-bounce light information.
arXiv Detail & Related papers (2024-10-04T16:03:13Z)
- Relative Positioning for Aerial Robot Path Planning in GPS Denied Environment [0.0]
This paper tackles one of the most important factors in autonomous UAV navigation, namely Initial Positioning, sometimes called Localisation.
It enables a team of autonomous UAVs to establish their position relative to their base of operation so that they can commence a team search and reconnaissance in a bushfire-affected area.
arXiv Detail & Related papers (2024-09-16T11:35:39Z)
- AI and Machine Learning Driven Indoor Localization and Navigation with Mobile Embedded Systems [3.943289808718775]
This article provides an overview of the challenges facing state-of-the-art indoor navigation solutions.
It then describes how AI algorithms deployed on mobile embedded systems can overcome these challenges.
arXiv Detail & Related papers (2024-08-09T00:30:22Z)
- Learning Robust Autonomous Navigation and Locomotion for Wheeled-Legged Robots [50.02055068660255]
Navigating urban environments poses unique challenges for robots, necessitating innovative solutions for locomotion and navigation.
This work introduces a fully integrated system comprising adaptive locomotion control, mobility-aware local navigation planning, and large-scale path planning within the city.
Using model-free reinforcement learning (RL) techniques and privileged learning, we develop a versatile locomotion controller.
Our controllers are integrated into a large-scale urban navigation system and validated by autonomous, kilometer-scale navigation missions conducted in Zurich, Switzerland, and Seville, Spain.
arXiv Detail & Related papers (2024-05-03T00:29:20Z)
- ETPNav: Evolving Topological Planning for Vision-Language Navigation in Continuous Environments [56.194988818341976]
Vision-language navigation is a task that requires an agent to follow instructions to navigate in environments.
We propose ETPNav, which focuses on two critical skills: 1) the capability to abstract environments and generate long-range navigation plans, and 2) obstacle-avoiding control in continuous environments.
ETPNav yields more than 10% and 20% improvements over prior state-of-the-art on R2R-CE and RxR-CE datasets.
arXiv Detail & Related papers (2023-04-06T13:07:17Z)
- Towards a Fully Autonomous UAV Controller for Moving Platform Detection and Landing [2.7909470193274593]
We present an autonomous UAV system for landing on a moving platform.
The proposed system relies only on the camera sensor and has been designed to be as lightweight as possible.
The system was evaluated with an average deviation of 15cm from the center of the target, for 40 landing attempts.
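A camera-only landing system of this kind typically maps the pixel offset of the detected platform into a lateral velocity command. The proportional sketch below illustrates that idea; the gains, image size, and descent rule are assumptions, not the paper's actual controller.

```python
# Proportional sketch of a camera-only landing correction: convert the
# detected platform centre's pixel offset into a lateral velocity command.
# The gains, image resolution, and guarded-descent rule are illustrative
# assumptions, not the paper's controller.

IMG_W, IMG_H = 640, 480   # assumed camera resolution
KP = 0.002                # assumed proportional gain, m/s per pixel of error

def landing_command(cx, cy):
    """Map the platform centre (pixels) to a (vx, vy, vz) body-frame command."""
    ex = cx - IMG_W / 2    # horizontal pixel error from image centre
    ey = cy - IMG_H / 2    # vertical pixel error from image centre
    vx, vy = KP * ex, KP * ey
    # Descend only when roughly centred, mimicking a guarded landing phase.
    vz = -0.3 if abs(ex) < 20 and abs(ey) < 20 else 0.0
    return vx, vy, vz

print(landing_command(330, 245))  # slightly off-centre: correct laterally and descend
```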
arXiv Detail & Related papers (2022-09-30T09:16:04Z)
- Autonomous Aerial Robot for High-Speed Search and Intercept Applications [86.72321289033562]
A fully-autonomous aerial robot for high-speed object grasping has been proposed.
As an additional sub-task, our system is able to autonomously pierce balloons located on poles close to the surface.
Our approach has been validated in a challenging international competition and has shown outstanding results.
arXiv Detail & Related papers (2021-12-10T11:49:51Z)
- Coupling Vision and Proprioception for Navigation of Legged Robots [65.59559699815512]
We exploit the complementary strengths of vision and proprioception to achieve point goal navigation in a legged robot.
We show superior performance compared to wheeled robot (LoCoBot) baselines.
We also show the real-world deployment of our system on a quadruped robot with onboard sensors and compute.
arXiv Detail & Related papers (2021-12-03T18:59:59Z)
- A Multi-UAV System for Exploration and Target Finding in Cluttered and GPS-Denied Environments [68.31522961125589]
We propose a framework for a team of UAVs to cooperatively explore and find a target in complex GPS-denied environments with obstacles.
The team of UAVs autonomously navigates, explores, detects, and finds the target in a cluttered environment with a known map.
Results indicate that the proposed multi-UAV system has improvements in terms of time-cost, the proportion of search area surveyed, as well as successful rates for search and rescue missions.
arXiv Detail & Related papers (2021-07-19T12:54:04Z)
- AutoSOS: Towards Multi-UAV Systems Supporting Maritime Search and Rescue with Lightweight AI and Edge Computing [27.15999421608932]
This paper presents the research directions of the AutoSOS project, where we work in the development of an autonomous multi-robot search and rescue assistance platform.
The platform is meant to perform reconnaissance missions for initial assessment of the environment using novel adaptive deep learning algorithms.
When drones find potential objects, they will send their sensor data to the vessel to verify the findings with increased accuracy.
arXiv Detail & Related papers (2020-05-07T12:22:15Z)
- APPLD: Adaptive Planner Parameter Learning from Demonstration [48.63930323392909]
We introduce APPLD, Adaptive Planner Parameter Learning from Demonstration, which allows existing navigation systems to be successfully applied to new complex environments.
APPLD is verified on two robots running different navigation systems in different environments.
Experimental results show that APPLD can outperform navigation systems with the default and expert-tuned parameters, and even the human demonstrator themselves.
arXiv Detail & Related papers (2020-03-31T21:15:16Z)
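The core APPLD idea, choosing planner parameters whose rollouts best match a human demonstration, can be illustrated with a toy black-box search. The one-dimensional stand-in planner, parameter grid, and demonstration below are assumptions for illustration, not the APPLD method itself (which segments demonstrations and uses black-box optimization per segment).

```python
import itertools

# Toy sketch of learning planner parameters from demonstration: evaluate
# each candidate parameter setting by how closely its rollout matches a
# demonstrated trace, and keep the best match. The stand-in "planner",
# parameter grid, and demonstration are illustrative assumptions.

def rollout(speed, lookahead, steps=5):
    """Stand-in planner: position trace produced by one parameter setting."""
    return [speed * t + 0.1 * lookahead for t in range(steps)]

def loss(trace, demo):
    """Sum of squared deviations from the demonstrated trace."""
    return sum((a - b) ** 2 for a, b in zip(trace, demo))

demo = rollout(0.5, 2.0)  # pretend this trace came from a human demonstration

best = min(
    itertools.product([0.25, 0.5, 0.75], [1.0, 2.0, 3.0]),
    key=lambda p: loss(rollout(*p), demo),
)
print(best)  # the parameter pair whose rollout reproduces the demonstration
```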
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences.