Worsening Perception: Real-time Degradation of Autonomous Vehicle
Perception Performance for Simulation of Adverse Weather Conditions
- URL: http://arxiv.org/abs/2103.02760v1
- Date: Wed, 3 Mar 2021 23:49:02 GMT
- Title: Worsening Perception: Real-time Degradation of Autonomous Vehicle
Perception Performance for Simulation of Adverse Weather Conditions
- Authors: Ivan Fursa, Elias Fandi, Valentina Musat, Jacob Culley, Enric Gil,
Louise Bilous, Isaac Vander Sluis, Alexander Rast and Andrew Bradley
- Abstract summary: This study explores the potential of using a simple, lightweight image augmentation system in an autonomous racing vehicle.
With minimal adjustment, the prototype system can replicate the effects of both water droplets on the camera lens, and fading light conditions.
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Autonomous vehicles rely heavily upon their perception subsystems to see the
environment in which they operate. Unfortunately, the effect of varying weather
conditions presents a significant challenge to object detection algorithms, and
thus it is imperative to test the vehicle extensively in all conditions which
it may experience. However, unpredictable weather can make real-world testing
in adverse conditions an expensive and time-consuming task requiring access to
specialist facilities, and weatherproofing of sensitive electronics. Simulation
provides an alternative to real-world testing, with some studies developing
increasingly visually realistic representations of the real world on powerful
compute hardware. Since subsystems downstream of perception are unaware of the
simulation's visual realism, appearance is of little consequence when
developing those modules; what matters is how the perception system performs
in the prevailing weather conditions. This study explores the potential of using a
simple, lightweight image augmentation system in an autonomous racing vehicle -
focusing not on visual accuracy, but rather the effect upon perception system
performance. With minimal adjustment, the prototype system developed in this
study can replicate the effects of both water droplets on the camera lens, and
fading light conditions. The system introduces a latency of less than 8 ms
using compute hardware that is well suited to being carried in the vehicle -
rendering it ideally suited to real-time implementation that can be run during
experiments in simulation, and augmented reality testing in the real world.
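The abstract does not disclose how the augmentation is implemented, but its two effects (water droplets on the lens and fading light) can be approximated with simple array operations. A minimal NumPy sketch under that assumption; the function names and parameters below are illustrative, not taken from the paper:

```python
import numpy as np

def fade_light(img: np.ndarray, factor: float = 0.4) -> np.ndarray:
    """Simulate fading light by uniformly scaling pixel intensities."""
    return np.clip(img.astype(np.float32) * factor, 0, 255).astype(np.uint8)

def add_droplets(img: np.ndarray, n: int = 20, radius: int = 8,
                 seed: int = 0) -> np.ndarray:
    """Approximate lens droplets by replacing random circular patches
    with their mean colour (a crude stand-in for refraction blur)."""
    rng = np.random.default_rng(seed)
    out = img.copy()
    h, w = img.shape[:2]
    ys, xs = np.ogrid[:h, :w]
    for _ in range(n):
        cy, cx = rng.integers(0, h), rng.integers(0, w)
        mask = (ys - cy) ** 2 + (xs - cx) ** 2 <= radius ** 2
        out[mask] = out[mask].mean(axis=0).astype(np.uint8)
    return out

# Stand-in camera frame: horizontal intensity gradient, 3 channels
gradient = np.tile(np.arange(160, dtype=np.uint8), (120, 1))
frame = np.stack([gradient] * 3, axis=-1)
dusk = fade_light(frame)   # darkened frame
wet = add_droplets(frame)  # frame with droplet patches
```

In a real-time system the droplet masks would be precomputed and the blur applied on the GPU to stay within the paper's sub-8 ms budget; the Python loop above is purely illustrative.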
Related papers
- DrivingSphere: Building a High-fidelity 4D World for Closed-loop Simulation [54.02069690134526]
We propose DrivingSphere, a realistic and closed-loop simulation framework.
Its core idea is to build 4D world representation and generate real-life and controllable driving scenarios.
By providing a dynamic and realistic simulation environment, DrivingSphere enables comprehensive testing and validation of autonomous driving algorithms.
arXiv Detail & Related papers (2024-11-18T03:00:33Z) - CADSim: Robust and Scalable in-the-wild 3D Reconstruction for
Controllable Sensor Simulation [44.83732884335725]
Sensor simulation involves modeling traffic participants, such as vehicles, with high quality appearance and articulated geometry.
Current reconstruction approaches struggle on in-the-wild sensor data, due to its sparsity and noise.
We present CADSim, which combines part-aware object-class priors via a small set of CAD models with differentiable rendering to automatically reconstruct vehicle geometry.
arXiv Detail & Related papers (2023-11-02T17:56:59Z) - Using simulation to quantify the performance of automotive perception
systems [2.2320512724449233]
We describe the image system simulation software tools that we use to evaluate the performance of image systems for object (automobile) detection.
We quantified system performance by measuring average precision and we report a trend relating system resolution and object detection performance.
arXiv Detail & Related papers (2023-03-02T05:28:35Z) - Data generation using simulation technology to improve perception
mechanism of autonomous vehicles [0.0]
We will demonstrate the effectiveness of combining data gathered from the real world with data generated in the simulated world to train perception systems.
We will also propose a multi-level deep learning perception framework that aims to emulate a human learning experience.
arXiv Detail & Related papers (2022-07-01T03:42:33Z) - How Do We Fail? Stress Testing Perception in Autonomous Vehicles [40.19326157052966]
This paper presents a method for characterizing failures of LiDAR-based perception systems for autonomous vehicles in adverse weather conditions.
We develop a methodology based in reinforcement learning to find likely failures in object tracking and trajectory prediction due to sequences of disturbances.
arXiv Detail & Related papers (2022-03-26T20:48:09Z) - Vision in adverse weather: Augmentation using CycleGANs with various
object detectors for robust perception in autonomous racing [70.16043883381677]
In autonomous racing, the weather can change abruptly, causing significant degradation in perception, resulting in ineffective manoeuvres.
In order to improve detection in adverse weather, deep-learning-based models typically require extensive datasets captured in such conditions.
We introduce an approach of using synthesised adverse condition datasets in autonomous racing (generated using CycleGAN) to improve the performance of four out of five state-of-the-art detectors.
arXiv Detail & Related papers (2022-01-10T10:02:40Z) - Lidar Light Scattering Augmentation (LISA): Physics-based Simulation of
Adverse Weather Conditions for 3D Object Detection [60.89616629421904]
Lidar-based object detectors are critical parts of the 3D perception pipeline in autonomous navigation systems such as self-driving cars.
They are sensitive to adverse weather conditions such as rain, snow and fog due to reduced signal-to-noise ratio (SNR) and signal-to-background ratio (SBR).
arXiv Detail & Related papers (2021-07-14T21:10:47Z) - Testing the Safety of Self-driving Vehicles by Simulating Perception and
Prediction [88.0416857308144]
We propose an alternative to sensor simulation, as sensor simulation is expensive and has large domain gaps.
We directly simulate the outputs of the self-driving vehicle's perception and prediction system, enabling realistic motion planning testing.
arXiv Detail & Related papers (2020-08-13T17:20:02Z) - Point Cloud Based Reinforcement Learning for Sim-to-Real and Partial
Observability in Visual Navigation [62.22058066456076]
Reinforcement Learning (RL) provides powerful tools for solving complex robotic tasks.
However, policies trained in simulation often fail when transferred directly to the real world, a challenge known as the sim-to-real transfer problem.
We propose a method that learns on an observation space constructed by point clouds and environment randomization.
arXiv Detail & Related papers (2020-07-27T17:46:59Z) - Probabilistic End-to-End Vehicle Navigation in Complex Dynamic
Environments with Multimodal Sensor Fusion [16.018962965273495]
All-day and all-weather navigation is a critical capability for autonomous driving.
We propose a probabilistic driving model with multimodal perception capability utilizing the information from the camera, lidar and radar.
The results suggest that our proposed model outperforms baselines and achieves excellent generalization performance in unseen environments.
arXiv Detail & Related papers (2020-05-05T03:48:10Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.