Evaluating the Robustness of LiDAR-based 3D Obstacles Detection and Its Impacts on Autonomous Driving Systems
- URL: http://arxiv.org/abs/2408.13653v1
- Date: Sat, 24 Aug 2024 19:10:07 GMT
- Title: Evaluating the Robustness of LiDAR-based 3D Obstacles Detection and Its Impacts on Autonomous Driving Systems
- Authors: Tri Minh Triet Pham, Bo Yang, Jinqiu Yang
- Abstract summary: We study the impact of built-in inaccuracies in LiDAR sensors on LiDAR-3D obstacle detection models.
We apply SORBET to evaluate the robustness of five classic LiDAR-3D obstacle detection models.
We find that even very subtle changes in point cloud data may introduce a non-trivial decrease in detection performance.
- Score: 4.530172587010801
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Autonomous driving systems (ADSs) require real-time input from multiple sensors to make time-sensitive decisions using deep neural networks. This makes the correctness of these decisions crucial to ADSs' adoption, as errors can cause significant loss. Sensors such as LiDAR are sensitive to environmental changes and built-in inaccuracies, and their output may fluctuate between frames. While there has been extensive work to test ADSs, it remains unclear whether current ADSs are robust against very subtle changes in LiDAR point cloud data. In this work, we study the impact of the built-in inaccuracies in LiDAR sensors on LiDAR-3D obstacle detection models to provide insight into how they can impact obstacle detection (i.e., robustness) and, by extension, trajectory prediction (i.e., how the robustness of obstacle detection would impact ADSs). We propose a framework, SORBET, that applies subtle perturbations to LiDAR data, evaluates the robustness of LiDAR-3D obstacle detection, and assesses the impacts on the trajectory prediction module and ADSs. We applied SORBET to evaluate the robustness of five classic LiDAR-3D obstacle detection models, including one from an industry-grade Level 4 ADS (Baidu's Apollo). Furthermore, we studied how changes in the obstacle detection results would negatively impact trajectory prediction in a cascading fashion. Our evaluation highlights the importance of testing the robustness of LiDAR-3D obstacle detection models against subtle perturbations. We find that even very subtle changes in point cloud data (i.e., removing two points) may introduce a non-trivial decrease in detection performance. Furthermore, such a negative impact will propagate to other modules and endanger the safety of ADSs.
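The "removing two points" perturbation described in the abstract can be illustrated with a minimal sketch. The function name and the (N, 4) x/y/z/intensity layout are assumptions for illustration, not SORBET's actual interface:

```python
import numpy as np

def drop_points(point_cloud, num_remove=2, seed=0):
    """Remove `num_remove` random points from an (N, 4) LiDAR point
    cloud (x, y, z, intensity), mimicking a subtle sensor dropout."""
    rng = np.random.default_rng(seed)
    n = point_cloud.shape[0]
    keep = np.ones(n, dtype=bool)
    keep[rng.choice(n, size=num_remove, replace=False)] = False
    return point_cloud[keep]

# Example: a toy cloud of 100 points loses exactly 2 points.
cloud = np.random.default_rng(1).random((100, 4)).astype(np.float32)
perturbed = drop_points(cloud, num_remove=2)
```

A robustness evaluation in the paper's spirit would feed both `cloud` and `perturbed` to the same detector and compare the resulting detections.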
Related papers
- Testing the Fault-Tolerance of Multi-Sensor Fusion Perception in Autonomous Driving Systems [14.871090150807929]
We build fault models for cameras and LiDAR in AVs and inject them into the MSF perception-based ADS to test its behaviors in test scenarios.
We design a feedback-guided differential fuzzer to discover the safety violations of MSF perception-based ADS caused by the injected sensor faults.
arXiv Detail & Related papers (2025-04-18T02:37:55Z) - Hi-ALPS -- An Experimental Robustness Quantification of Six LiDAR-based Object Detection Systems for Autonomous Driving [49.64902130083662]
3D object detection systems (OD) play a key role in the driving decisions of autonomous vehicles.
Adversarial examples are small, sometimes sophisticated perturbations of the input data that change (i.e., falsify) the prediction of the OD.
We quantify the robustness of six state-of-the-art 3D OD systems under different types of perturbations.
arXiv Detail & Related papers (2025-03-21T14:17:02Z) - Uncertainty Estimation for 3D Object Detection via Evidential Learning [63.61283174146648]
We introduce a framework for quantifying uncertainty in 3D object detection by leveraging an evidential learning loss on Bird's Eye View representations in the 3D detector.
We demonstrate both the efficacy and importance of these uncertainty estimates on identifying out-of-distribution scenes, poorly localized objects, and missing (false negative) detections.
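The evidential-learning idea in this summary can be sketched with the standard Dirichlet formulation (evidence e_k >= 0, concentration alpha = e + 1, vacuity u = K / S). This is a generic illustration of that formulation, not the paper's BEV-specific detector head or loss:

```python
import numpy as np

def evidential_uncertainty(evidence):
    """Given non-negative per-class evidence from a detector head,
    compute expected class probabilities and the vacuity-style
    uncertainty u = K / S (higher means less total evidence)."""
    evidence = np.asarray(evidence, dtype=np.float64)
    alpha = evidence + 1.0      # Dirichlet concentration parameters
    strength = alpha.sum()      # total evidence mass S
    probs = alpha / strength    # expected class probabilities
    uncertainty = len(alpha) / strength
    return probs, uncertainty

# Little evidence -> high uncertainty; strong evidence -> low.
_, u_low_evidence = evidential_uncertainty([0.1, 0.2, 0.1])
_, u_high_evidence = evidential_uncertainty([50.0, 1.0, 1.0])
```

High-vacuity detections are the candidates such a framework would flag as out-of-distribution or poorly localized.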
arXiv Detail & Related papers (2024-10-31T13:13:32Z) - Towards Robust 3D Object Detection In Rainy Conditions [10.920640666237833]
We propose a framework for improving the robustness of LiDAR-based 3D object detectors against road spray.
Our approach uses a state-of-the-art adverse weather detection network to filter out spray from the LiDAR point cloud.
In addition to adverse weather filtering, we explore the use of radar targets to further filter false positive detections.
arXiv Detail & Related papers (2023-10-02T07:34:15Z) - Unsupervised Domain Adaptation for Self-Driving from Past Traversal Features [69.47588461101925]
We propose a method to adapt 3D object detectors to new driving environments.
Our approach enhances LiDAR-based detection models using spatially quantized historical features.
Experiments on real-world datasets demonstrate significant improvements.
arXiv Detail & Related papers (2023-09-21T15:00:31Z) - Ego-Motion Estimation and Dynamic Motion Separation from 3D Point Clouds for Accumulating Data and Improving 3D Object Detection [0.1474723404975345]
One drawback of high-resolution radar sensors, compared to lidar sensors, is the sparsity of the generated point cloud.
This contribution analyzes limitations of accumulating radar point clouds on the View-of-Delft dataset.
Experiments document an improved object detection performance by applying an ego-motion estimation and dynamic motion correction approach.
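The static part of the accumulation step this summary describes can be sketched as transforming each past frame into the current ego frame with a homogeneous pose and stacking the results. The 4x4 past-to-current pose convention is an assumption, and the paper's dynamic-object motion correction is omitted here:

```python
import numpy as np

def accumulate_frames(frames, poses):
    """Transform each past point-cloud frame (N_i, 3) into the current
    ego frame using its 4x4 pose (past -> current), then stack them,
    densifying a sparse radar point cloud across time."""
    merged = []
    for pts, pose in zip(frames, poses):
        homo = np.hstack([pts, np.ones((pts.shape[0], 1))])  # (N_i, 4)
        merged.append((homo @ pose.T)[:, :3])
    return np.vstack(merged)

# Example: a past frame taken 2 m behind the current pose maps forward.
pts = np.array([[1.0, 0.0, 0.0]])
shift = np.eye(4)
shift[0, 3] = 2.0
out = accumulate_frames([pts], [shift])
```

Static points line up after this transform; moving objects smear, which is why the paper separates dynamic motion before accumulating.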
arXiv Detail & Related papers (2023-08-29T14:53:16Z) - Context-Aware Change Detection With Semi-Supervised Learning [0.0]
Change detection using earth observation data plays a vital role in quantifying the impact of disasters in affected areas.
Data sources like Sentinel-2 provide rich optical information, but are often hindered by cloud cover.
We develop a model to assess the contribution of pre-disaster Sentinel-2 data in change detection tasks.
arXiv Detail & Related papers (2023-06-15T08:17:49Z) - Survey on LiDAR Perception in Adverse Weather Conditions [6.317642241067219]
The active LiDAR sensor is able to create an accurate 3D representation of a scene.
The LiDAR's performance changes under adverse weather conditions like fog, snow, or rain.
We address topics such as the availability of appropriate data, raw point cloud processing and denoising, robust perception algorithms and sensor fusion to mitigate adverse weather induced shortcomings.
arXiv Detail & Related papers (2023-04-13T07:45:23Z) - A Comprehensive Study of the Robustness for LiDAR-based 3D Object Detectors against Adversarial Attacks [84.10546708708554]
3D object detectors are increasingly crucial for security-critical tasks.
It is imperative to understand their robustness against adversarial attacks.
This paper presents the first comprehensive evaluation and analysis of the robustness of LiDAR-based 3D detectors under adversarial attacks.
arXiv Detail & Related papers (2022-12-20T13:09:58Z) - Efficient and Robust LiDAR-Based End-to-End Navigation [132.52661670308606]
We present an efficient and robust LiDAR-based end-to-end navigation framework.
We propose Fast-LiDARNet that is based on sparse convolution kernel optimization and hardware-aware model design.
We then propose Hybrid Evidential Fusion that directly estimates the uncertainty of the prediction from only a single forward pass.
arXiv Detail & Related papers (2021-05-20T17:52:37Z) - Towards robust sensing for Autonomous Vehicles: An adversarial perspective [82.83630604517249]
It is of primary importance that the resulting decisions are robust to perturbations.
Adversarial perturbations are purposefully crafted alterations of the environment or of the sensory measurements.
A careful evaluation of the vulnerabilities of their sensing system(s) is necessary in order to build and deploy safer systems.
arXiv Detail & Related papers (2020-07-14T05:25:15Z) - Physically Realizable Adversarial Examples for LiDAR Object Detection [72.0017682322147]
We present a method to generate universal 3D adversarial objects to fool LiDAR detectors.
In particular, we demonstrate that placing an adversarial object on the rooftop of any target vehicle can hide the vehicle entirely from LiDAR detectors with a success rate of 80%.
This is one step closer to safer self-driving under unseen conditions with limited training data.
arXiv Detail & Related papers (2020-04-01T16:11:04Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information provided and is not responsible for any consequences.