DroneWiS: Automated Simulation Testing of small Unmanned Aerial Systems in Realistic Windy Conditions
- URL: http://arxiv.org/abs/2408.16559v2
- Date: Wed, 25 Sep 2024 14:23:50 GMT
- Authors: Bohan Zhang, Ankit Agrawal
- Abstract summary: DroneWiS allows sUAS developers to automatically simulate realistic windy conditions and test the resilience of sUAS against wind.
Unlike current state-of-the-art simulation tools such as Gazebo and AirSim, DroneWiS leverages Computational Fluid Dynamics (CFD) to compute the unique wind flows caused by objects in the environment.
This simulation capability gives developers deeper insight into the navigation capability of sUAS in challenging and realistic windy conditions.
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: The continuous evolution of small Unmanned Aerial Systems (sUAS) demands advanced testing methodologies to ensure their safe and reliable operation in the real world. To push the boundaries of sUAS simulation testing in realistic environments, we previously developed the DroneReqValidator (DRV) platform, allowing developers to automatically conduct simulation testing in a digital twin of the Earth. In this paper, we present DRV 2.0, which introduces a novel component called DroneWiS (Drone Wind Simulation). DroneWiS allows sUAS developers to automatically simulate realistic windy conditions and test the resilience of sUAS against wind. Unlike current state-of-the-art simulation tools such as Gazebo and AirSim that only simulate basic wind conditions, DroneWiS leverages Computational Fluid Dynamics (CFD) to compute the unique wind flows caused by the interaction of wind with objects in the environment such as buildings and uneven terrain. This simulation capability gives developers deeper insight into the navigation capability of sUAS in challenging and realistic windy conditions. DroneWiS equips sUAS developers with a powerful tool to test, debug, and improve the reliability and safety of sUAS in the real world. A working demonstration is available at https://youtu.be/khBHEBST8Wc
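To make the CFD idea concrete: a typical pattern is to precompute wind vectors on a spatial grid offline and then query that field at the vehicle's position during each simulation step. The sketch below is purely illustrative — the `WindField` class, its grid layout, and the nearest-cell lookup are assumptions for demonstration, not DroneWiS's actual API.

```python
# Illustrative sketch (not DroneWiS code): querying a precomputed
# CFD-style wind field during a simulation step.
from dataclasses import dataclass


@dataclass
class WindField:
    """Uniform grid of precomputed wind vectors (m/s)."""
    origin: tuple    # (x0, y0, z0) world position of the grid corner
    spacing: float   # grid cell size in meters
    vectors: dict    # (i, j, k) cell index -> (wx, wy, wz) wind vector

    def wind_at(self, x, y, z):
        # Nearest-cell lookup for simplicity; a real tool would
        # interpolate (e.g. trilinearly) between neighboring cells.
        i = round((x - self.origin[0]) / self.spacing)
        j = round((y - self.origin[1]) / self.spacing)
        k = round((z - self.origin[2]) / self.spacing)
        return self.vectors.get((i, j, k), (0.0, 0.0, 0.0))


# Toy field: calm everywhere except one "gust" cell, standing in for
# accelerated flow around a building corner.
field = WindField(origin=(0.0, 0.0, 0.0), spacing=10.0,
                  vectors={(1, 0, 0): (6.0, -2.0, 0.0)})

print(field.wind_at(12.0, 3.0, 0.0))   # inside the gust cell
print(field.wind_at(55.0, 55.0, 5.0))  # elsewhere: ambient (zero) wind
```

In a simulator loop, the returned vector would be added as a disturbance force or velocity offset on the vehicle each physics step; uniform-wind tools effectively replace `wind_at` with a single constant vector.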
Related papers
- FlightForge: Advancing UAV Research with Procedural Generation of High-Fidelity Simulation and Integrated Autonomy
We propose the novel FlightForge UAV open-source simulator.
It offers advanced rendering capabilities, diverse control modalities, and, foremost, procedural generation of environments.
The simulator is already integrated with a fully autonomous UAV system capable of long-range flights in cluttered unknown environments.
arXiv Detail & Related papers (2025-02-07T16:05:17Z)
- NeRF-To-Real Tester: Neural Radiance Fields as Test Image Generators for Vision of Autonomous Systems
Overfitting of controllers to simulation conditions leads to poor performance in the operation environment.
We address the challenge of generating perception test data for autonomous systems by leveraging Neural Radiance Fields.
Our tool, N2R-Tester, allows training models of custom scenes and rendering test images from perturbed positions.
arXiv Detail & Related papers (2024-12-20T18:40:53Z)
- DrivingSphere: Building a High-fidelity 4D World for Closed-loop Simulation
We propose DrivingSphere, a realistic and closed-loop simulation framework.
Its core idea is to build 4D world representation and generate real-life and controllable driving scenarios.
By providing a dynamic and realistic simulation environment, DrivingSphere enables comprehensive testing and validation of autonomous driving algorithms.
arXiv Detail & Related papers (2024-11-18T03:00:33Z)
- Lander.AI: Adaptive Landing Behavior Agent for Expertise in 3D Dynamic Platform Landings
This study introduces an advanced Deep Reinforcement Learning (DRL) agent, Lander.AI, designed to navigate and land on platforms in the presence of windy conditions.
Lander.AI is rigorously trained within the gym-pybullet-drones simulation, an environment that mirrors real-world complexities, including wind turbulence.
The experimental results showcased Lander.AI's high-precision landing and its ability to adapt to moving platforms, even under wind-induced disturbances.
arXiv Detail & Related papers (2024-03-11T10:20:44Z)
- UniSim: A Neural Closed-Loop Sensor Simulator
We present UniSim, a neural sensor simulator that takes as input a single recorded log captured by a sensor-equipped vehicle.
UniSim builds neural feature grids to reconstruct both the static background and dynamic actors in the scene.
We incorporate learnable priors for dynamic objects, and leverage a convolutional network to complete unseen regions.
arXiv Detail & Related papers (2023-08-03T17:56:06Z)
- DroneReqValidator: Facilitating High Fidelity Simulation Testing for Uncrewed Aerial Systems Developers
sUAS developers aim to validate the reliability and safety of their applications through simulation testing.
The dynamic nature of the real-world environment causes unique software faults that may only be revealed through field testing.
DroneReqValidator (DRV) offers a comprehensive small Unmanned Aerial Vehicle (sUAV) simulation ecosystem.
arXiv Detail & Related papers (2023-07-31T22:13:57Z)
- Residual Physics Learning and System Identification for Sim-to-real Transfer of Policies on Buoyancy Assisted Legged Robots
In this work, we demonstrate robust sim-to-real transfer of control policies on the BALLU robots via system identification.
Rather than relying on standard supervised learning formulations, we utilize deep reinforcement learning to train an external force policy.
We analyze the improved simulation fidelity by comparing the simulation trajectories against the real-world ones.
arXiv Detail & Related papers (2023-03-16T18:49:05Z)
- Generative AI-empowered Simulation for Autonomous Driving in Vehicular Mixed Reality Metaverses
In the vehicular mixed reality (MR) Metaverse, the distance between physical and virtual entities can be overcome.
Large-scale traffic and driving simulation via realistic data collection and fusion from the physical world is difficult and costly.
We propose an autonomous driving architecture, where generative AI is leveraged to synthesize unlimited conditioned traffic and driving data in simulations.
arXiv Detail & Related papers (2023-02-16T16:54:10Z)
- Neural-Fly Enables Rapid Learning for Agile Flight in Strong Winds
We present a learning-based approach that allows rapid online adaptation by incorporating pretrained representations through deep learning.
Neural-Fly achieves precise flight control with substantially smaller tracking error than state-of-the-art nonlinear and adaptive controllers.
arXiv Detail & Related papers (2022-05-13T21:55:28Z)
- VISTA 2.0: An Open, Data-driven Simulator for Multimodal Sensing and Policy Learning for Autonomous Vehicles
We present VISTA, an open source, data-driven simulator that integrates multiple types of sensors for autonomous vehicles.
Using high fidelity, real-world datasets, VISTA represents and simulates RGB cameras, 3D LiDAR, and event-based cameras.
We demonstrate the ability to train and test perception-to-control policies across each of the sensor types and showcase the power of this approach via deployment on a full scale autonomous vehicle.
arXiv Detail & Related papers (2021-11-23T18:58:10Z)
- DriveGAN: Towards a Controllable High-Quality Neural Simulation
We introduce a novel high-quality neural simulator referred to as DriveGAN.
DriveGAN achieves controllability by disentangling different components without supervision.
We train DriveGAN on multiple datasets, including 160 hours of real-world driving data.
arXiv Detail & Related papers (2021-04-30T15:30:05Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
The site does not guarantee the quality of this information and is not responsible for any consequences of its use.