NeRF-To-Real Tester: Neural Radiance Fields as Test Image Generators for Vision of Autonomous Systems
- URL: http://arxiv.org/abs/2412.16141v1
- Date: Fri, 20 Dec 2024 18:40:53 GMT
- Title: NeRF-To-Real Tester: Neural Radiance Fields as Test Image Generators for Vision of Autonomous Systems
- Authors: Laura Weihl, Bilal Wehbe, Andrzej Wąsowski
- Abstract summary: Overfitting of controllers to simulation conditions leads to poor performance in the operational environment.
We address the challenge of generating perception test data for autonomous systems by leveraging Neural Radiance Fields.
Our tool, N2R-Tester, allows training models of custom scenes and rendering test images from perturbed positions.
- Score: 3.031375888004876
- Abstract: Autonomous inspection of infrastructure on land and in water is a quickly growing market, with applications including surveying constructions, monitoring plants, and tracking environmental changes in on- and off-shore wind energy farms. For Autonomous Underwater Vehicles and Unmanned Aerial Vehicles, overfitting of controllers to simulation conditions fundamentally leads to poor performance in the operational environment. There is a pressing need for more diverse and realistic test data that accurately represents the challenges faced by these systems. We address the challenge of generating perception test data for autonomous systems by leveraging Neural Radiance Fields to generate realistic and diverse test images, and integrating them into a metamorphic testing framework for vision components such as vSLAM and object detection. Our tool, N2R-Tester, allows training models of custom scenes and rendering test images from perturbed positions. An experimental evaluation of N2R-Tester on eight different vision components in AUVs and UAVs demonstrates the efficacy and versatility of the approach.
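N2R-Tester itself is not reproduced in this listing; the sketch below only illustrates the metamorphic-testing idea described in the abstract. `render_view` (a trained NeRF renderer), `detect` (the vision component under test), and the perturbation magnitude `eps` are hypothetical stand-ins, not the tool's actual API.

```python
"""Minimal sketch of NeRF-based metamorphic testing, assuming injected
render/detect callables rather than N2R-Tester's real interface."""
from dataclasses import dataclass
import random

@dataclass
class Pose:
    x: float
    y: float
    z: float
    yaw: float

def perturb(pose: Pose, eps: float, rng: random.Random) -> Pose:
    """Metamorphic transformation: a small random offset of the camera pose."""
    return Pose(pose.x + rng.uniform(-eps, eps),
                pose.y + rng.uniform(-eps, eps),
                pose.z + rng.uniform(-eps, eps),
                pose.yaw + rng.uniform(-eps, eps))

def metamorphic_test(render_view, detect, pose: Pose,
                     eps: float = 0.05, trials: int = 20) -> list[Pose]:
    """Relation checked: the set of object classes detected in the nominal
    view should survive a small viewpoint perturbation. Poses that break
    the relation are candidate failures of the vision component."""
    rng = random.Random(0)
    baseline = set(detect(render_view(pose)))
    failures = []
    for _ in range(trials):
        candidate = perturb(pose, eps, rng)
        if set(detect(render_view(candidate))) != baseline:
            failures.append(candidate)
    return failures

# Toy usage with dummy components; swap in a real NeRF renderer and detector.
if __name__ == "__main__":
    fails = metamorphic_test(render_view=lambda p: p,       # dummy "image"
                             detect=lambda img: ["pipe"],   # dummy detector
                             pose=Pose(0.0, 0.0, 1.0, 0.0))
    print(f"{len(fails)} violating perturbations found")
```

The relation shown (detected classes survive small viewpoint changes) is one plausible choice for object detection; a vSLAM component would instead use a pose-consistency relation between the applied and the estimated camera motion.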
Related papers
- DroneWiS: Automated Simulation Testing of small Unmanned Aerial Systems in Realistic Windy Conditions [8.290044674335473]
DroneWiS allows sUAS developers to automatically simulate realistic windy conditions and test the resilience of sUAS against wind.
Unlike current state-of-the-art simulation tools such as Gazebo and AirSim, DroneWiS leverages Computational Fluid Dynamics (CFD) to compute unique wind flows.
This simulation capability gives developers deeper insight into the navigation capability of sUAS in challenging and realistic windy conditions; a sketch of consuming such a precomputed wind field follows.
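DroneWiS's internals are not shown in this listing; purely to illustrate the CFD-wind idea, the sketch below samples a precomputed wind grid (random data stands in for a real CFD solve) with a nearest-cell lookup and applies it as a disturbance in a toy point-mass update. The grid layout, resolution, and drag model are all assumptions, not DroneWiS's implementation.

```python
import numpy as np

# Assumed layout: a precomputed CFD result stored as an (NX, NY, NZ, 3)
# grid of wind vectors in m/s over a box; random data replaces a real solve.
CELL = 2.0                                    # assumed grid resolution (m)
wind = np.random.default_rng(0).normal(0, 3, size=(50, 50, 20, 3))

def wind_at(pos: np.ndarray) -> np.ndarray:
    """Nearest-cell lookup of the wind vector at a world position."""
    idx = np.clip((pos / CELL).astype(int), 0, np.array(wind.shape[:3]) - 1)
    return wind[tuple(idx)]

def step(pos, vel, cmd_acc, dt=0.05, drag=0.3):
    """Toy point-mass update: commanded acceleration plus a linear drag
    term toward the local air velocity."""
    rel = wind_at(pos) - vel                  # air velocity relative to vehicle
    vel = vel + (cmd_acc + drag * rel) * dt
    return pos + vel * dt, vel

# Hover command under wind: the vehicle drifts with the local flow.
pos, vel = np.array([50.0, 50.0, 10.0]), np.zeros(3)
for _ in range(100):
    pos, vel = step(pos, vel, cmd_acc=np.zeros(3))
print("drifted to", np.round(pos, 2))
```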
arXiv Detail & Related papers (2024-08-29T14:25:11Z)
- Challenges of YOLO Series for Object Detection in Extremely Heavy Rain: CALRA Simulator based Synthetic Evaluation Dataset [0.0]
Object detection by diverse sensors (e.g., LiDAR, radar, and camera) should be prioritized for autonomous vehicles.
These sensors must detect objects accurately and quickly in diverse weather conditions, but they tend to struggle to detect objects consistently in bad weather with rain, snow, or fog.
In this study, based on experimentally obtained raindrop data from precipitation conditions, we constructed a novel dataset for testing diverse network models under various precipitation conditions.
arXiv Detail & Related papers (2023-12-13T08:45:57Z)
- Efficient Real-time Smoke Filtration with 3D LiDAR for Search and Rescue with Autonomous Heterogeneous Robotic Systems [56.838297900091426]
Smoke and dust degrade the performance of any mobile robotic platform that relies on onboard perception systems.
This paper proposes a novel modular filtration pipeline based on intensity and spatial information; a minimal sketch of this kind of filter follows.
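The paper's actual pipeline is not reproduced here; as a rough illustration of filtering on intensity plus spatial information, the sketch below drops low-intensity returns and then removes spatially isolated points with a k-d tree. The point-cloud layout and all thresholds are assumptions.

```python
import numpy as np
from scipy.spatial import cKDTree

def filter_smoke(points: np.ndarray,
                 min_intensity: float = 0.15,
                 radius: float = 0.3,
                 min_neighbors: int = 5) -> np.ndarray:
    """points: (N, 4) array of x, y, z, intensity in [0, 1].
    Step 1 (intensity): smoke returns tend to be weak, so drop dim points.
    Step 2 (spatial): smoke is diffuse, so drop points with few neighbours."""
    pts = points[points[:, 3] >= min_intensity]
    tree = cKDTree(pts[:, :3])
    counts = np.array([len(n)
                       for n in tree.query_ball_point(pts[:, :3], radius)])
    return pts[counts >= min_neighbors]   # query includes the point itself

# Toy usage: a dense "wall" of strong returns plus sparse dim "smoke" noise.
rng = np.random.default_rng(1)
wall = np.c_[rng.normal(5, 0.05, 500), rng.uniform(-2, 2, 500),
             rng.uniform(0, 2, 500), rng.uniform(0.4, 1.0, 500)]
smoke = np.c_[rng.uniform(0, 10, 500), rng.uniform(-2, 2, 500),
              rng.uniform(0, 2, 500), rng.uniform(0.0, 0.2, 500)]
cloud = np.vstack([wall, smoke])
print(f"{len(cloud)} points -> {len(filter_smoke(cloud))} after filtering")
```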
arXiv Detail & Related papers (2023-08-14T16:48:57Z)
- DroneReqValidator: Facilitating High Fidelity Simulation Testing for Uncrewed Aerial Systems Developers [8.290044674335473]
sUAS developers aim to validate the reliability and safety of their applications through simulation testing.
The dynamic nature of the real-world environment causes unique software faults that may only be revealed through field testing.
DroneReqValidator (DRV) offers a comprehensive small Unmanned Aerial Vehicle (sUAV) simulation ecosystem.
arXiv Detail & Related papers (2023-07-31T22:13:57Z)
- A Requirements-Driven Platform for Validating Field Operations of Small Uncrewed Aerial Vehicles [48.67061953896227]
DroneReqValidator (DRV) allows sUAS developers to define the operating context, configure multi-sUAS mission requirements, specify safety properties, and deploy their own custom sUAS applications in a high-fidelity 3D environment.
The DRV Monitoring system collects runtime data from sUAS and the environment, analyzes compliance with safety properties, and captures violations.
arXiv Detail & Related papers (2023-07-01T02:03:49Z)
- Virtual Reality via Object Poses and Active Learning: Realizing Telepresence Robots with Aerial Manipulation Capabilities [39.29763956979895]
This article presents a novel telepresence system for advancing aerial manipulation in dynamic and unstructured environments.
The proposed system not only features a haptic device, but also a virtual reality (VR) interface that provides real-time 3D displays of the robot's workspace.
We show over 70 robust executions of pick-and-place, force application, and peg-in-hole tasks with the DLR cable-Suspended Aerial Manipulator (SAM).
arXiv Detail & Related papers (2022-10-18T08:42:30Z)
- Vision in adverse weather: Augmentation using CycleGANs with various object detectors for robust perception in autonomous racing [70.16043883381677]
In autonomous racing, the weather can change abruptly, causing significant degradation in perception, resulting in ineffective manoeuvres.
In order to improve detection in adverse weather, deep-learning-based models typically require extensive datasets captured in such conditions.
We introduce an approach that uses synthesised adverse-condition datasets for autonomous racing (generated with CycleGAN) to improve the performance of four out of five state-of-the-art detectors; a sketch of this style of augmentation follows.
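Neither the paper's trained generators nor its datasets are bundled here; assuming a clear-to-rain CycleGAN generator exported as TorchScript (the checkpoint and image paths are hypothetical), the augmentation step reduces to a normalised forward pass per frame.

```python
import torch
import torchvision.transforms.functional as TF
from PIL import Image

# Assumption: a clear->rain CycleGAN generator saved with torch.jit.save;
# the filename is a hypothetical placeholder.
gen = torch.jit.load("clear2rain_generator.pt").eval()

def to_adverse(img: Image.Image) -> Image.Image:
    """Translate one clear-weather frame to synthetic rain.
    CycleGAN generators conventionally expect inputs scaled to [-1, 1]."""
    x = TF.to_tensor(img).unsqueeze(0) * 2 - 1
    with torch.no_grad():
        y = gen(x)
    return TF.to_pil_image((y.squeeze(0).clamp(-1, 1) + 1) / 2)

# Placeholder paths; the translated frame keeps the original geometry.
rainy = to_adverse(Image.open("clear_frame.png").convert("RGB"))
rainy.save("rainy_frame.png")
```

Because the translation preserves scene geometry, existing bounding-box labels can be reused on the synthesised frames, which is what makes this style of augmentation cheap for detector training.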
arXiv Detail & Related papers (2022-01-10T10:02:40Z)
- VISTA 2.0: An Open, Data-driven Simulator for Multimodal Sensing and Policy Learning for Autonomous Vehicles [131.2240621036954]
We present VISTA, an open source, data-driven simulator that integrates multiple types of sensors for autonomous vehicles.
Using high fidelity, real-world datasets, VISTA represents and simulates RGB cameras, 3D LiDAR, and event-based cameras.
We demonstrate the ability to train and test perception-to-control policies across each of the sensor types and showcase the power of this approach via deployment on a full scale autonomous vehicle.
arXiv Detail & Related papers (2021-11-23T18:58:10Z)
- DriveGAN: Towards a Controllable High-Quality Neural Simulation [147.6822288981004]
We introduce a novel high-quality neural simulator referred to as DriveGAN.
DriveGAN achieves controllability by disentangling different components without supervision.
We train DriveGAN on multiple datasets, including 160 hours of real-world driving data.
arXiv Detail & Related papers (2021-04-30T15:30:05Z)
- Worsening Perception: Real-time Degradation of Autonomous Vehicle Perception Performance for Simulation of Adverse Weather Conditions [47.529411576737644]
This study explores the potential of a simple, lightweight image-augmentation system in an autonomous racing vehicle.
With minimal adjustment, the prototype system can replicate the effects of both water droplets on the camera lens and fading light conditions; a minimal sketch in the same spirit follows the link below.
arXiv Detail & Related papers (2021-03-03T23:49:02Z)
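The prototype from the entry above is not public in this summary; the sketch below reproduces the two described effects in the same lightweight spirit using OpenCV, with all kernel sizes, counts, gains, and file paths as assumptions. Blurred discs approximate on-lens droplets, and a gain/offset rescale approximates fading light.

```python
import cv2
import numpy as np

def add_droplets(img: np.ndarray, n: int = 25, seed: int = 0) -> np.ndarray:
    """Approximate on-lens water droplets: blur the whole frame, then copy
    blurred discs back onto the sharp image at random positions."""
    rng = np.random.default_rng(seed)
    blurred = cv2.GaussianBlur(img, (31, 31), 0)
    out = img.copy()
    h, w = img.shape[:2]
    mask = np.zeros((h, w), dtype=np.uint8)
    for _ in range(n):
        center = (int(rng.integers(w)), int(rng.integers(h)))
        cv2.circle(mask, center, int(rng.integers(5, 20)), 255, -1)
    out[mask > 0] = blurred[mask > 0]
    return out

def fade_light(img: np.ndarray, gain: float = 0.5, bias: int = -20) -> np.ndarray:
    """Approximate fading light with a simple gain/offset rescale."""
    return cv2.convertScaleAbs(img, alpha=gain, beta=bias)

frame = cv2.imread("frame.png")               # placeholder BGR test frame
degraded = fade_light(add_droplets(frame))
cv2.imwrite("frame_degraded.png", degraded)
```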
This list is automatically generated from the titles and abstracts of the papers on this site.