Real-Time Environment Condition Classification for Autonomous Vehicles
- URL: http://arxiv.org/abs/2405.19305v1
- Date: Wed, 29 May 2024 17:29:55 GMT
- Title: Real-Time Environment Condition Classification for Autonomous Vehicles
- Authors: Marco Introvigne, Andrea Ramazzina, Stefanie Walz, Dominik Scheuble, Mario Bijelic,
- Abstract summary: We train a deep learning model to identify outdoor weather and dangerous road conditions.
We achieve this by introducing an improved taxonomy and label hierarchy for a state-of-the-art adverse-weather dataset.
We train RECNet, a deep learning model for the classification of environment conditions from a single RGB frame.
- Score: 3.8514288339458718
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Current autonomous driving technologies are being rolled out in geo-fenced areas with well-defined operating conditions such as time of operation, area, weather conditions and road conditions. In this way, challenging conditions such as adverse weather, slippery roads or densely populated city centers can be excluded. To lift the geo-fence restriction and allow a more dynamic availability of autonomous driving functions, the vehicle must autonomously assess environment conditions in real time, identify when the system cannot operate safely, and either stop operation or require the resting passenger to take control. In particular, adverse weather is a fundamental limitation, as sensor performance degrades quickly, prohibiting the use of sensors such as cameras to locate and monitor road signs, pedestrians or other vehicles. To address this issue, we train a deep learning model to identify outdoor weather and dangerous road conditions, enabling a quick reaction to new situations and environments. We achieve this by introducing an improved taxonomy and label hierarchy for a state-of-the-art adverse-weather dataset, relabelling it with a novel semi-automated labeling pipeline. Using the proposed dataset and hierarchy, we train RECNet, a deep learning model for the classification of environment conditions from a single RGB frame. We outperform baseline models by a relative 16% in F1-score while maintaining real-time performance of 20 Hz.
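RECNet's architecture and classes are defined in the paper itself; as a minimal sketch of the single-frame, hierarchical classification idea, the following uses a hypothetical two-head weather/road taxonomy and assumes per-head class probabilities (e.g. from a per-head softmax over a CNN's logits) are already available:

```python
# Minimal sketch of hierarchical environment-condition classification.
# The taxonomy below is hypothetical and only illustrates the idea of a
# label hierarchy; it is not RECNet's actual class list.

# Fine-grained labels grouped under coarse condition heads.
TAXONOMY = {
    "weather": ["clear", "rain", "snow", "fog"],
    "road": ["dry", "wet", "icy", "snow_covered"],
}

def classify(probs):
    """Pick the most likely fine label for each coarse head.

    `probs` maps each head to per-class probabilities for one RGB frame.
    """
    return {
        head: max(TAXONOMY[head], key=lambda c: probs[head].get(c, 0.0))
        for head in TAXONOMY
    }

# Example frame-level probabilities (illustrative numbers only).
frame_probs = {
    "weather": {"clear": 0.05, "rain": 0.7, "snow": 0.05, "fog": 0.2},
    "road": {"dry": 0.1, "wet": 0.8, "icy": 0.05, "snow_covered": 0.05},
}
print(classify(frame_probs))  # {'weather': 'rain', 'road': 'wet'}
```

Keeping the heads independent is one simple way to exploit a label hierarchy: each coarse condition gets its own argmax, so an uncertain road estimate never corrupts the weather estimate.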
Related papers
- TADAP: Trajectory-Aided Drivable area Auto-labeling with Pre-trained self-supervised features in winter driving conditions [1.4993021283916008]
Trajectory-Aided Drivable area Auto-labeling with Pre-trained self-supervised features (TADAP) is presented.
A prediction model trained with the TADAP labels achieved a +9.6 improvement in intersection-over-union (IoU).
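The intersection-over-union metric that the TADAP result is reported in can be sketched as follows; masks are flattened to lists of 0/1 labels here for brevity, whereas real evaluations use per-pixel arrays:

```python
# Intersection-over-union (IoU, a.k.a. Jaccard index) for binary masks.

def iou(pred, target):
    inter = sum(p and t for p, t in zip(pred, target))
    union = sum(p or t for p, t in zip(pred, target))
    return inter / union if union else 1.0  # two empty masks match perfectly

pred   = [1, 1, 0, 1, 0, 0]
target = [1, 0, 0, 1, 1, 0]
print(iou(pred, target))  # 2 overlapping / 4 in union = 0.5
```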
arXiv Detail & Related papers (2023-12-20T11:51:49Z)
- Automated Automotive Radar Calibration With Intelligent Vehicles [73.15674960230625]
We present an approach for automated and geo-referenced calibration of automotive radar sensors.
Our method does not require external modifications of a vehicle and instead uses the location data obtained from automated vehicles.
Our evaluation on data from a real testing site shows that our method can correctly calibrate infrastructure sensors in an automated manner.
arXiv Detail & Related papers (2023-06-23T07:01:10Z)
- SHIFT: A Synthetic Driving Dataset for Continuous Multi-Task Domain Adaptation [152.60469768559878]
SHIFT is the largest multi-task synthetic dataset for autonomous driving.
It presents discrete and continuous shifts in cloudiness, rain and fog intensity, time of day, and vehicle and pedestrian density.
Our dataset and benchmark toolkit are publicly available at www.vis.xyz/shift.
arXiv Detail & Related papers (2022-06-16T17:59:52Z)
- Vision in adverse weather: Augmentation using CycleGANs with various object detectors for robust perception in autonomous racing [70.16043883381677]
In autonomous racing, the weather can change abruptly, causing significant degradation in perception, resulting in ineffective manoeuvres.
In order to improve detection in adverse weather, deep-learning-based models typically require extensive datasets captured in such conditions.
We introduce an approach that uses synthesised adverse-condition datasets in autonomous racing (generated with CycleGAN) and improves the performance of four out of five state-of-the-art detectors.
arXiv Detail & Related papers (2022-01-10T10:02:40Z)
- Learning High-Speed Flight in the Wild [101.33104268902208]
We propose an end-to-end approach that can autonomously fly quadrotors through complex natural and man-made environments at high speeds.
The key principle is to directly map noisy sensory observations to collision-free trajectories in a receding-horizon fashion.
By simulating realistic sensor noise, our approach achieves zero-shot transfer from simulation to challenging real-world environments.
arXiv Detail & Related papers (2021-10-11T09:43:11Z)
- Lidar Light Scattering Augmentation (LISA): Physics-based Simulation of Adverse Weather Conditions for 3D Object Detection [60.89616629421904]
Lidar-based object detectors are critical parts of the 3D perception pipeline in autonomous navigation systems such as self-driving cars.
They are sensitive to adverse weather conditions such as rain, snow and fog due to reduced signal-to-noise ratio (SNR) and signal-to-background ratio (SBR).
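LISA's full scattering simulation is beyond this digest, but the SNR loss it models can be approximated at first order by Beer-Lambert attenuation of the return pulse over the two-way path to a target. A sketch, where the extinction coefficient value is illustrative and not taken from the paper:

```python
import math

# First-order Beer-Lambert model of a lidar return in scattering media
# (fog, rain): the pulse is attenuated over the two-way path to a target
# at range r. The extinction coefficient alpha is illustrative only.

def attenuated_intensity(i0, alpha, r):
    """Return intensity after two-way attenuation: i0 * exp(-2 * alpha * r)."""
    return i0 * math.exp(-2.0 * alpha * r)

clear = attenuated_intensity(1.0, alpha=0.0, r=50.0)   # no fog: unattenuated
foggy = attenuated_intensity(1.0, alpha=0.02, r=50.0)  # exp(-2) ~ 0.135
print(clear, foggy)
```

The exponential dependence on range is why distant, low-reflectivity targets drop below the noise floor first in fog, which is the failure mode the listed paper augments training data for.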
arXiv Detail & Related papers (2021-07-14T21:10:47Z)
- Robustness of Object Detectors in Degrading Weather Conditions [7.91378990016322]
State-of-the-art object detection systems for autonomous driving achieve promising results in clear weather conditions.
These systems need to work in degrading weather conditions, such as rain, fog and snow.
Most approaches are evaluated only on the KITTI dataset, which consists solely of clear-weather scenes.
arXiv Detail & Related papers (2021-06-16T13:56:07Z)
- Worsening Perception: Real-time Degradation of Autonomous Vehicle Perception Performance for Simulation of Adverse Weather Conditions [47.529411576737644]
This study explores the potential of using a simple, lightweight image augmentation system in an autonomous racing vehicle.
With minimal adjustment, the prototype system can replicate the effects of both water droplets on the camera lens and fading light conditions.
arXiv Detail & Related papers (2021-03-03T23:49:02Z)
- Autonomous Off-road Navigation over Extreme Terrains with Perceptually-challenging Conditions [7.514178230130502]
We propose a framework for resilient autonomous computation in perceptually challenging environments with mobility-stressing elements.
We propose a fast settling algorithm to generate robust multi-fidelity traversability estimates in real-time.
The proposed approach was deployed on multiple physical systems including skid-steer and tracked robots, a high-speed RC car and legged robots.
arXiv Detail & Related papers (2021-01-26T22:13:01Z)
- Probabilistic End-to-End Vehicle Navigation in Complex Dynamic Environments with Multimodal Sensor Fusion [16.018962965273495]
All-day and all-weather navigation is a critical capability for autonomous driving.
We propose a probabilistic driving model with multi-perception capability, utilizing information from the camera, lidar and radar.
The results suggest that our proposed model outperforms baselines and achieves excellent generalization performance in unseen environments.
arXiv Detail & Related papers (2020-05-05T03:48:10Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed content (including all information) and is not responsible for any consequences.