AutoDrone: Shortest Optimized Obstacle-Free Path Planning for Autonomous
Drones
- URL: http://arxiv.org/abs/2111.00200v1
- Date: Sat, 30 Oct 2021 07:52:57 GMT
- Title: AutoDrone: Shortest Optimized Obstacle-Free Path Planning for Autonomous
Drones
- Authors: Prithwish Jana, Debasish Jana
- Abstract summary: We propose a method to find an obstacle-free shortest path in a GPS-guided coordinate system.
This can be especially beneficial for rescue operations and for fast, energy-efficient delivery or pick-up.
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: With technological advancement, the drone has emerged as an unmanned
aerial vehicle that can be piloted by humans to fly to a destination. It may
also be autonomous, where the drone itself is intelligent enough to find a
shortest obstacle-free path from a designated source to the destination.
Whether in a planned smart city or at a wreckage site struck by natural
calamity, we may regard buildings, surface-erected structures and other
blockages as obstacles that prevent the drone from flying in a direct
line-of-sight path. The whole bird's-eye view of the landscape can thus be
transformed into a graph of grid cells, in which occupied cells mark obstacles
and free cells mark traversable space. The autonomous drone (AutoDrone) can
then find the shortest hindrance-free path while travelling in two-dimensional
space, moving from one place to another. In this paper, we propose a method to
find an obstacle-free shortest path in a GPS-guided coordinate system. This can
be especially beneficial for rescue operations and for fast, energy-efficient
delivery or pick-up, where our algorithm helps find the shortest path and the
angle along which the drone should fly. Our work illustrates different
path-tracing scenarios along the shortest feasible path computed by the
autonomous drone.
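The grid-cell formulation in the abstract can be sketched as a standard shortest-path search over an occupancy grid. The snippet below is a minimal illustration, not the paper's actual algorithm: it uses plain breadth-first search on a 4-connected grid (the authors' optimized method and GPS coordinate handling may differ), plus a small helper for the flight angle between consecutive cells. All function names and the sample grid are hypothetical.

```python
import math
from collections import deque

def shortest_path(grid, start, goal):
    """BFS over a 2-D occupancy grid: 0 = free cell, 1 = obstacle.
    Returns a list of (row, col) cells on a shortest 4-connected
    path from start to goal, or None if the goal is unreachable."""
    rows, cols = len(grid), len(grid[0])
    prev = {start: None}          # predecessor map, doubles as visited set
    queue = deque([start])
    while queue:
        r, c = queue.popleft()
        if (r, c) == goal:
            # Reconstruct the path by walking back through predecessors.
            path, cell = [], goal
            while cell is not None:
                path.append(cell)
                cell = prev[cell]
            return path[::-1]
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if (0 <= nr < rows and 0 <= nc < cols
                    and grid[nr][nc] == 0 and (nr, nc) not in prev):
                prev[(nr, nc)] = (r, c)
                queue.append((nr, nc))
    return None

def heading_deg(frm, to):
    """Angle of travel between two cells, in degrees,
    measured counter-clockwise from the +column axis."""
    return math.degrees(math.atan2(to[0] - frm[0], to[1] - frm[1]))

grid = [
    [0, 0, 0, 0],
    [0, 1, 1, 0],   # 1s are obstacles (buildings, debris, etc.)
    [0, 0, 0, 0],
]
path = shortest_path(grid, (0, 0), (2, 3))
print(path, "first heading:", heading_deg(path[0], path[1]))
```

BFS suffices here because every grid move has unit cost; with weighted moves or a distance heuristic, A* would be the natural drop-in replacement.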
Related papers
- Exploring Jamming and Hijacking Attacks for Micro Aerial Drones [14.970216072065861]
The Crazyflie ecosystem is one of the most popular Micro Aerial Drones and has the potential to be deployed worldwide.
In this paper, we empirically investigate two interference attacks against the Crazy Real Time Protocol (CRTP) implemented within the Crazyflie drones.
Our experimental results demonstrate the effectiveness of such attacks in both autonomous and non-autonomous flight modes.
arXiv Detail & Related papers (2024-03-06T17:09:27Z) - ORBSLAM3-Enhanced Autonomous Toy Drones: Pioneering Indoor Exploration [30.334482597992455]
Navigating toy drones through uncharted GPS-denied indoor spaces poses significant difficulties.
We introduce a real-time autonomous indoor exploration system tailored for drones equipped with a monocular RGB camera.
Our system utilizes ORB-SLAM3, a state-of-the-art vision-feature-based SLAM system, to handle both the localization of toy drones and the mapping of unmapped indoor terrains.
arXiv Detail & Related papers (2023-12-20T19:20:26Z) - POA: Passable Obstacles Aware Path-planning Algorithm for Navigation of
a Two-wheeled Robot in Highly Cluttered Environments [53.41594627336511]
Passable Obstacles Aware (POA) planner is a novel navigation method for two-wheeled robots in a cluttered environment.
Our algorithm allows two-wheeled robots to find a path through passable obstacles.
arXiv Detail & Related papers (2023-07-16T19:44:27Z) - TransVisDrone: Spatio-Temporal Transformer for Vision-based
Drone-to-Drone Detection in Aerial Videos [57.92385818430939]
Drone-to-drone detection using visual feed has crucial applications, such as detecting drone collisions, detecting drone attacks, or coordinating flight with other drones.
Existing methods are computationally costly, follow non-end-to-end optimization, and have complex multi-stage pipelines, making them less suitable for real-time deployment on edge devices.
We propose a simple yet effective framework, TransVisDrone, that provides an end-to-end solution with higher computational efficiency.
arXiv Detail & Related papers (2022-10-16T03:05:13Z) - Coupling Vision and Proprioception for Navigation of Legged Robots [65.59559699815512]
We exploit the complementary strengths of vision and proprioception to achieve point goal navigation in a legged robot.
We show superior performance compared to wheeled robot (LoCoBot) baselines.
We also show the real-world deployment of our system on a quadruped robot with onboard sensors and compute.
arXiv Detail & Related papers (2021-12-03T18:59:59Z) - Dogfight: Detecting Drones from Drones Videos [58.158988162743825]
This paper addresses the problem of detecting drones in videos captured by other flying drones.
The erratic movement of the source and target drones, small size, arbitrary shape, large intensity variations, and occlusion make this problem quite challenging.
To handle this, instead of using region-proposal based methods, we propose to use a two-stage segmentation-based approach.
arXiv Detail & Related papers (2021-03-31T17:43:31Z) - AlphaPilot: Autonomous Drone Racing [47.205375478625776]
The system has successfully been deployed at the first autonomous drone racing world championship: the 2019 AlphaPilot Challenge.
The proposed system has been demonstrated to successfully guide the drone through tight race courses, reaching speeds of up to 8 m/s.
arXiv Detail & Related papers (2020-05-26T15:45:05Z) - University-1652: A Multi-view Multi-source Benchmark for Drone-based
Geo-localization [87.74121935246937]
We introduce a new multi-view benchmark for drone-based geo-localization, named University-1652.
University-1652 contains data from three platforms, i.e., synthetic drones, satellites and ground cameras of 1,652 university buildings around the world.
Experiments show that University-1652 helps the model to learn the viewpoint-invariant features and also has good generalization ability in the real-world scenario.
arXiv Detail & Related papers (2020-02-27T15:24:15Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.