Energy-based Detection of Adverse Weather Effects in LiDAR Data
- URL: http://arxiv.org/abs/2305.16129v3
- Date: Thu, 29 Jun 2023 10:14:03 GMT
- Title: Energy-based Detection of Adverse Weather Effects in LiDAR Data
- Authors: Aldi Piroli, Vinzenz Dallabetta, Johannes Kopp, Marc Walessa, Daniel
Meissner, Klaus Dietmayer
- Abstract summary: We propose a novel approach for detecting adverse weather effects in LiDAR data.
Our method learns to associate low energy scores with inlier points and high energy scores with outliers.
To help expand the research field of LiDAR perception in adverse weather, we release the SemanticSpray dataset.
- Score: 7.924836086640871
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Autonomous vehicles rely on LiDAR sensors to perceive the environment.
Adverse weather conditions like rain, snow, and fog negatively affect these
sensors, reducing their reliability by introducing unwanted noise in the
measurements. In this work, we tackle this problem by proposing a novel
approach for detecting adverse weather effects in LiDAR data. We reformulate
this problem as an outlier detection task and use an energy-based framework to
detect outliers in point clouds. More specifically, our method learns to
associate low energy scores with inlier points and high energy scores with
outliers, allowing for robust detection of adverse weather effects. In extensive
experiments, we show that our method performs better in adverse weather
detection and has higher robustness to unseen weather effects than previous
state-of-the-art methods. Furthermore, we show how our method can be used to
perform simultaneous outlier detection and semantic segmentation. Finally, to
help expand the research field of LiDAR perception in adverse weather, we
release the SemanticSpray dataset, which contains labeled vehicle spray data in
highway-like scenarios. The dataset is available at
https://semantic-spray-dataset.github.io .
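As an illustrative sketch only (not the authors' implementation), the energy score commonly used in energy-based outlier detection can be computed per point from classifier logits as E(x) = -T * logsumexp(f(x)/T), so that confident (inlier) points receive low energy and diffuse (outlier) points receive high energy. The logits and threshold below are hypothetical:

```python
import numpy as np

def energy_score(logits, temperature=1.0):
    # E(x) = -T * log(sum_k exp(f_k(x) / T)); lower energy -> more inlier-like.
    # Computed with the max-shift trick for numerical stability.
    z = logits / temperature
    m = z.max(axis=-1, keepdims=True)
    return -temperature * (m.squeeze(-1) + np.log(np.exp(z - m).sum(axis=-1)))

# Hypothetical per-point class logits for a 4-point cloud:
logits = np.array([
    [8.0, 1.0, 0.5],   # confident prediction -> low energy (inlier)
    [0.1, 0.0, 0.2],   # diffuse prediction  -> high energy (possible spray/noise)
    [6.5, 0.3, 0.1],
    [0.2, 0.1, 0.0],
])
scores = energy_score(logits)
is_outlier = scores > -1.5  # hypothetical decision threshold
```

Thresholding the energy then separates weather-induced noise points from valid measurements; the actual training objective and threshold selection are described in the paper.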
Related papers
- Label-Efficient Semantic Segmentation of LiDAR Point Clouds in Adverse Weather Conditions [10.306226508237348]
Adverse weather conditions can severely affect the performance of LiDAR sensors.
Current approaches for detecting adverse weather points require large amounts of labeled data.
This paper proposes a label-efficient approach to segment LiDAR point clouds in adverse weather.
arXiv Detail & Related papers (2024-06-14T10:29:00Z) - OOSTraj: Out-of-Sight Trajectory Prediction With Vision-Positioning Denoising [49.86409475232849]
Trajectory prediction is fundamental in computer vision and autonomous driving.
Existing approaches in this field often assume precise and complete observational data.
We present a novel method for out-of-sight trajectory prediction that leverages a vision-positioning technique.
arXiv Detail & Related papers (2024-04-02T18:30:29Z) - Enhancing Lidar-based Object Detection in Adverse Weather using Offset
Sequences in Time [1.1725016312484975]
Lidar-based object detection is significantly affected by adverse weather conditions such as rain and fog.
Our research provides a comprehensive study of effective methods for mitigating the effects of adverse weather on the reliability of lidar-based object detection.
arXiv Detail & Related papers (2024-01-17T08:31:58Z) - Towards Robust 3D Object Detection In Rainy Conditions [10.920640666237833]
We propose a framework for improving the robustness of LiDAR-based 3D object detectors against road spray.
Our approach uses a state-of-the-art adverse weather detection network to filter out spray from the LiDAR point cloud.
In addition to adverse weather filtering, we explore the use of radar targets to further filter false positive detections.
arXiv Detail & Related papers (2023-10-02T07:34:15Z) - Echoes Beyond Points: Unleashing the Power of Raw Radar Data in
Multi-modality Fusion [74.84019379368807]
We propose a novel method named EchoFusion to skip the existing radar signal processing pipeline.
Specifically, we first generate the Bird's Eye View (BEV) queries and then take corresponding spectrum features from radar to fuse with other sensors.
arXiv Detail & Related papers (2023-07-31T09:53:50Z) - Multimodal Dataset from Harsh Sub-Terranean Environment with Aerosol
Particles for Frontier Exploration [55.41644538483948]
This paper introduces a multimodal dataset from the harsh and unstructured underground environment with aerosol particles.
It contains synchronized raw data measurements from all onboard sensors in Robot Operating System (ROS) format.
The focus of this paper is not only to capture both temporal and spatial data diversities but also to present the impact of harsh conditions on captured data.
arXiv Detail & Related papers (2023-04-27T20:21:18Z) - Survey on LiDAR Perception in Adverse Weather Conditions [6.317642241067219]
The active LiDAR sensor is able to create an accurate 3D representation of a scene.
The LiDAR's performance changes under adverse weather conditions like fog, snow, or rain.
We address topics such as the availability of appropriate data, raw point cloud processing and denoising, robust perception algorithms and sensor fusion to mitigate adverse weather induced shortcomings.
arXiv Detail & Related papers (2023-04-13T07:45:23Z) - Fog Simulation on Real LiDAR Point Clouds for 3D Object Detection in
Adverse Weather [92.84066576636914]
This work addresses the challenging task of LiDAR-based 3D object detection in foggy weather.
We tackle this problem by simulating physically accurate fog into clear-weather scenes.
We are the first to provide strong 3D object detection baselines on the Seeing Through Fog dataset.
arXiv Detail & Related papers (2021-08-11T14:37:54Z) - Lidar Light Scattering Augmentation (LISA): Physics-based Simulation of
Adverse Weather Conditions for 3D Object Detection [60.89616629421904]
Lidar-based object detectors are critical parts of the 3D perception pipeline in autonomous navigation systems such as self-driving cars.
They are sensitive to adverse weather conditions such as rain, snow, and fog due to reduced signal-to-noise ratio (SNR) and signal-to-background ratio (SBR).
arXiv Detail & Related papers (2021-07-14T21:10:47Z) - Depth Estimation from Monocular Images and Sparse Radar Data [93.70524512061318]
In this paper, we explore the possibility of achieving a more accurate depth estimation by fusing monocular images and Radar points using a deep neural network.
We find that the noise in Radar measurements is one of the main reasons preventing the direct application of existing fusion methods.
The experiments are conducted on the nuScenes dataset, one of the first datasets to feature Camera, Radar, and LiDAR recordings in diverse scenes and weather conditions.
arXiv Detail & Related papers (2020-09-30T19:01:33Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences.