Survey on LiDAR Perception in Adverse Weather Conditions
- URL: http://arxiv.org/abs/2304.06312v2
- Date: Tue, 6 Jun 2023 15:49:58 GMT
- Title: Survey on LiDAR Perception in Adverse Weather Conditions
- Authors: Mariella Dreissig, Dominik Scheuble, Florian Piewak and Joschka
Boedecker
- Abstract summary: The active LiDAR sensor is able to create an accurate 3D representation of a scene.
The LiDAR's performance changes under adverse weather conditions such as fog, snow, or rain.
We address topics such as the availability of appropriate data, raw point cloud processing and denoising, robust perception algorithms and sensor fusion to mitigate adverse-weather-induced shortcomings.
- Score: 6.317642241067219
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Autonomous vehicles rely on a variety of sensors to gather information about
their surroundings. The vehicle's behavior is planned based on the environment
perception, making its reliability crucial for safety reasons. The active LiDAR
sensor is able to create an accurate 3D representation of a scene, making it a
valuable addition for environment perception for autonomous vehicles. Due to
light scattering and occlusion, the LiDAR's performance changes under adverse
weather conditions such as fog, snow, or rain. This limitation has recently fostered a
large body of research on approaches to alleviate the decrease in perception
performance. In this survey, we gathered, analyzed, and discussed different
aspects of dealing with adverse weather conditions in LiDAR-based environment
perception. We address topics such as the availability of appropriate data, raw
point cloud processing and denoising, robust perception algorithms and sensor
fusion to mitigate adverse-weather-induced shortcomings. We furthermore
identify the most pressing gaps in the current literature and pinpoint
promising research directions.
Related papers
- Exploring Domain Shift on Radar-Based 3D Object Detection Amidst Diverse Environmental Conditions [15.767261586617746]
This study delves into the often-overlooked yet crucial issue of domain shift in 4D radar-based object detection.
Our findings highlight distinct domain shifts across various weather scenarios, revealing unique dataset sensitivities.
Transitioning between different road types, especially from highways to urban settings, introduces notable domain shifts.
arXiv Detail & Related papers (2024-08-13T09:55:38Z)
- Experimental Evaluation of Road-Crossing Decisions by Autonomous Wheelchairs against Environmental Factors [42.90509901417468]
We focus on the fine-tuning of tracking performance and on its experimental evaluation against outdoor environmental factors.
We show that the approach can be adopted to evaluate video tracking and event detection robustness against outdoor environmental factors.
arXiv Detail & Related papers (2024-05-27T08:43:26Z)
- OOSTraj: Out-of-Sight Trajectory Prediction With Vision-Positioning Denoising [49.86409475232849]
Trajectory prediction is fundamental in computer vision and autonomous driving.
Existing approaches in this field often assume precise and complete observational data.
We present a novel method for out-of-sight trajectory prediction that leverages a vision-positioning technique.
arXiv Detail & Related papers (2024-04-02T18:30:29Z)
- Towards Robust 3D Object Detection In Rainy Conditions [10.920640666237833]
We propose a framework for improving the robustness of LiDAR-based 3D object detectors against road spray.
Our approach uses a state-of-the-art adverse weather detection network to filter out spray from the LiDAR point cloud.
In addition to adverse weather filtering, we explore the use of radar targets to further filter false positive detections.
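The two-stage idea described here (remove predicted spray points, then cross-check the remaining detections against radar) can be sketched as follows. This is a minimal illustration, not the paper's actual pipeline: the function names, the label convention (1 = spray), and the radar-association distance are all assumptions.

```python
import numpy as np

def filter_spray_points(points: np.ndarray, labels: np.ndarray) -> np.ndarray:
    """Keep only points NOT classified as spray by a (hypothetical)
    adverse-weather detection network.

    points: (N, 4) array of x, y, z, intensity.
    labels: (N,) per-point predictions, 1 = spray, 0 = solid surface.
    """
    return points[labels == 0]

def suppress_detections_without_radar(detections, radar_targets, max_dist=2.0):
    """Drop LiDAR detections with no radar target within max_dist metres of
    their bird's-eye-view centre (a simple cross-sensor plausibility check)."""
    kept = []
    for det in detections:
        centre = np.asarray(det["centre"][:2])  # x, y position of the box
        dists = np.linalg.norm(radar_targets[:, :2] - centre, axis=1)
        if dists.size and dists.min() <= max_dist:
            kept.append(det)
    return kept
```

The radar check exploits the fact that spray clusters rarely produce a co-located radar return, while real vehicles usually do.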
arXiv Detail & Related papers (2023-10-02T07:34:15Z)
- Energy-based Detection of Adverse Weather Effects in LiDAR Data [7.924836086640871]
We propose a novel approach for detecting adverse weather effects in LiDAR data.
Our method learns to associate low energy scores with inlier points and high energy scores with outliers.
To help expand the research field of LiDAR perception in adverse weather, we release the SemanticSpray dataset.
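The low-energy-inlier / high-energy-outlier scheme can be illustrated with the standard free-energy score used in energy-based out-of-distribution detection, E(x) = -log Σ_k exp(f_k(x)) over per-point class logits. This is a generic sketch of that scoring rule, not the paper's exact formulation; the threshold value is an assumption.

```python
import numpy as np

def energy_score(logits: np.ndarray) -> np.ndarray:
    """Free-energy score E(x) = -logsumexp(logits) per point.
    Confident (inlier) predictions yield low energy; diffuse
    predictions on weather outliers yield high energy."""
    m = logits.max(axis=1, keepdims=True)  # numerically stable logsumexp
    return -(m.squeeze(1) + np.log(np.exp(logits - m).sum(axis=1)))

def split_inliers_outliers(points, logits, threshold):
    """Partition a point cloud by thresholding the energy score."""
    e = energy_score(logits)
    return points[e <= threshold], points[e > threshold]
```

In practice the threshold would be calibrated on held-out clear-weather data so that nearly all solid-surface points fall below it.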
arXiv Detail & Related papers (2023-05-25T15:03:36Z)
- Camera-Radar Perception for Autonomous Vehicles and ADAS: Concepts, Datasets and Metrics [77.34726150561087]
This work aims to carry out a study on the current scenario of camera and radar-based perception for ADAS and autonomous vehicles.
Concepts and characteristics related to both sensors, as well as to their fusion, are presented.
We give an overview of the Deep Learning-based detection and segmentation tasks, and the main datasets, metrics, challenges, and open questions in vehicle perception.
arXiv Detail & Related papers (2023-03-08T00:48:32Z)
- Vision in adverse weather: Augmentation using CycleGANs with various object detectors for robust perception in autonomous racing [70.16043883381677]
In autonomous racing, the weather can change abruptly, causing significant degradation in perception, resulting in ineffective manoeuvres.
In order to improve detection in adverse weather, deep-learning-based models typically require extensive datasets captured in such conditions.
We introduce an approach of using synthesised adverse condition datasets in autonomous racing (generated using CycleGAN) to improve the performance of four out of five state-of-the-art detectors.
arXiv Detail & Related papers (2022-01-10T10:02:40Z)
- Lidar Light Scattering Augmentation (LISA): Physics-based Simulation of Adverse Weather Conditions for 3D Object Detection [60.89616629421904]
Lidar-based object detectors are critical parts of the 3D perception pipeline in autonomous navigation systems such as self-driving cars.
They are sensitive to adverse weather conditions such as rain, snow and fog due to reduced signal-to-noise ratio (SNR) and signal-to-background ratio (SBR).
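The SNR degradation that physics-based augmentation models can be sketched with a toy two-way Beer-Lambert extinction term: each return's intensity decays with exp(-2·alpha·range), and returns falling below the sensor's detection floor are dropped. This is a deliberately simplified illustration under assumed parameter values; LISA's actual model additionally handles backscatter returns and range noise.

```python
import numpy as np

def augment_with_fog(points: np.ndarray, alpha: float = 0.06,
                     noise_floor: float = 0.05) -> np.ndarray:
    """Toy fog model on an (N, 4) x/y/z/intensity point cloud.

    alpha: extinction coefficient in 1/m (hypothetical value);
    the factor of 2 accounts for the out-and-back laser path.
    Points whose attenuated intensity drops below noise_floor
    are treated as lost returns and removed.
    """
    r = np.linalg.norm(points[:, :3], axis=1)          # range of each point
    intensity = points[:, 3] * np.exp(-2.0 * alpha * r)
    keep = intensity >= noise_floor
    out = points[keep].copy()
    out[:, 3] = intensity[keep]
    return out
```

Even this crude model reproduces the survey's qualitative point: distant, low-reflectivity returns vanish first as fog density (alpha) increases.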
arXiv Detail & Related papers (2021-07-14T21:10:47Z)
- Towards robust sensing for Autonomous Vehicles: An adversarial perspective [82.83630604517249]
It is of primary importance that the resulting decisions are robust to perturbations.
Adversarial perturbations are purposefully crafted alterations of the environment or of the sensory measurements.
A careful evaluation of the vulnerabilities of a vehicle's sensing system(s) is necessary to build and deploy safer systems.
arXiv Detail & Related papers (2020-07-14T05:25:15Z)
- LIBRE: The Multiple 3D LiDAR Dataset [54.25307983677663]
We present LIBRE: LiDAR Benchmarking and Reference, a first-of-its-kind dataset featuring 10 different LiDAR sensors.
LIBRE provides the research community with a means for fair comparison of currently available LiDARs.
It will also facilitate the improvement of existing self-driving vehicles and robotics-related software.
arXiv Detail & Related papers (2020-03-13T06:17:39Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.