Temporal Consistency Checks to Detect LiDAR Spoofing Attacks on
Autonomous Vehicle Perception
- URL: http://arxiv.org/abs/2106.07833v1
- Date: Tue, 15 Jun 2021 01:36:40 GMT
- Title: Temporal Consistency Checks to Detect LiDAR Spoofing Attacks on
Autonomous Vehicle Perception
- Authors: Chengzeng You, Zhongyuan Hau, Soteris Demetriou
- Abstract summary: Recent work has demonstrated serious LiDAR spoofing attacks with alarming consequences.
In this work, we explore the use of motion as a physical invariant of genuine objects for detecting such attacks.
Preliminary design and implementation of a 3D-TC2 prototype demonstrates very promising performance.
- Score: 4.092959254671909
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: LiDAR sensors are used widely in Autonomous Vehicles for better perceiving
the environment which enables safer driving decisions. Recent work has
demonstrated serious LiDAR spoofing attacks with alarming consequences. In
particular, model-level LiDAR spoofing attacks aim to inject fake depth
measurements to elicit ghost objects that are erroneously detected by 3D Object
Detectors, resulting in hazardous driving decisions. In this work, we explore
the use of motion as a physical invariant of genuine objects for detecting such
attacks. Based on this, we propose a general methodology, 3D Temporal
Consistency Check (3D-TC2), which leverages spatio-temporal information from
motion prediction to verify objects detected by 3D Object Detectors. Our
preliminary design and implementation of a 3D-TC2 prototype demonstrates very
promising performance, providing more than 98% attack detection rate with a
recall of 91% for detecting spoofed Vehicle (Car) objects, and is able to
achieve real-time detection at 41 Hz.
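As a concrete illustration of the temporal-consistency idea, a minimal sketch is shown below: detections in the current frame are checked against positions predicted from previous frames, and objects with no plausible motion history are flagged. This is a hypothetical simplification, not the authors' 3D-TC2 implementation; the constant-velocity motion model and the 2 m association gate are assumptions.

```python
import numpy as np

# Hypothetical, simplified temporal-consistency check in the spirit of 3D-TC2:
# a detection is "consistent" if it lies close to a position predicted from the
# previous frames; otherwise it is flagged as potentially spoofed.
# The constant-velocity model and the 2.0 m gate are illustrative assumptions.

GATE_METERS = 2.0  # association threshold (assumption)

def predict_positions(prev: np.ndarray, prev2: np.ndarray) -> np.ndarray:
    """Constant-velocity prediction: x_t ~ x_{t-1} + (x_{t-1} - x_{t-2})."""
    return prev + (prev - prev2)

def flag_inconsistent(detections: np.ndarray, prev: np.ndarray, prev2: np.ndarray):
    """Return a boolean mask: True = detection has no consistent motion history."""
    predicted = predict_positions(prev, prev2)           # (M, 3) predicted centroids
    flags = []
    for det in detections:                               # (N, 3) current centroids
        dists = np.linalg.norm(predicted - det, axis=1) if len(predicted) else np.array([np.inf])
        flags.append(dists.min() > GATE_METERS)          # no track explains this object
    return np.array(flags)

if __name__ == "__main__":
    prev2 = np.array([[10.0, 0.0, 0.0]])                 # object two frames ago
    prev  = np.array([[10.5, 0.0, 0.0]])                 # same object last frame
    dets  = np.array([[11.0, 0.0, 0.0],                  # consistent with prior motion
                      [30.0, 5.0, 0.0]])                 # appears from nowhere -> suspicious
    print(flag_inconsistent(dets, prev, prev2))          # [False  True]
```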
Related papers
- Uncertainty Estimation for 3D Object Detection via Evidential Learning [63.61283174146648]
We introduce a framework for quantifying uncertainty in 3D object detection by leveraging an evidential learning loss on Bird's Eye View representations in the 3D detector.
We demonstrate both the efficacy and importance of these uncertainty estimates on identifying out-of-distribution scenes, poorly localized objects, and missing (false negative) detections.
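A generic flavor of evidential learning for classification maps network outputs to Dirichlet evidence and minimizes the expected squared error under that distribution; the sketch below follows that common formulation and is not the paper's exact objective on Bird's Eye View features.

```python
import torch
import torch.nn.functional as F

def evidential_mse_loss(logits: torch.Tensor, targets_onehot: torch.Tensor) -> torch.Tensor:
    """Generic Dirichlet-based evidential loss (sketch, not the paper's exact objective).

    logits:          (B, C) raw network outputs
    targets_onehot:  (B, C) one-hot class labels
    """
    evidence = F.softplus(logits)          # non-negative evidence per class
    alpha = evidence + 1.0                 # Dirichlet concentration parameters
    strength = alpha.sum(dim=1, keepdim=True)
    prob = alpha / strength                # expected class probabilities
    # Expected squared error under the Dirichlet: (y - p)^2 + Var[p]
    err = (targets_onehot - prob) ** 2
    var = prob * (1.0 - prob) / (strength + 1.0)
    return (err + var).sum(dim=1).mean()

# Predictive uncertainty can then be read off as C / sum(alpha) for each sample.
```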
arXiv Detail & Related papers (2024-10-31T13:13:32Z)
- AdvMono3D: Advanced Monocular 3D Object Detection with Depth-Aware Robust Adversarial Training [64.14759275211115]
We propose a depth-aware robust adversarial training method for monocular 3D object detection, dubbed DART3D.
Our adversarial training approach capitalizes on the inherent uncertainty, enabling the model to significantly improve its robustness against adversarial attacks.
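A generic adversarial training step alternates between crafting a perturbation that increases the loss and updating the model on the perturbed input. The sketch below shows a single FGSM-style step with a placeholder model, loss, and perturbation budget; it is not the DART3D procedure.

```python
import torch

def adversarial_training_step(model, images, targets, loss_fn, optimizer, eps=2.0 / 255):
    """One generic adversarial training step (FGSM-style sketch, not DART3D).

    The detector, loss function, and epsilon budget are placeholders.
    """
    images = images.clone().detach().requires_grad_(True)

    # 1) Craft an adversarial example that increases the detection loss.
    loss = loss_fn(model(images), targets)
    grad, = torch.autograd.grad(loss, images)
    adv_images = (images + eps * grad.sign()).clamp(0.0, 1.0).detach()

    # 2) Update the model on the adversarial example.
    optimizer.zero_grad()
    adv_loss = loss_fn(model(adv_images), targets)
    adv_loss.backward()
    optimizer.step()
    return adv_loss.item()
```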
arXiv Detail & Related papers (2023-09-03T07:05:32Z)
- FocalFormer3D: Focusing on Hard Instance for 3D Object Detection [97.56185033488168]
False negatives (FN) in 3D object detection can lead to potentially dangerous situations in autonomous driving.
In this work, we propose Hard Instance Probing (HIP), a general pipeline that identifies FN in a multi-stage manner.
We instantiate this method as FocalFormer3D, a simple yet effective detector that excels at excavating difficult objects.
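One simplified reading of hard-instance probing is: after a detection stage, ground-truth objects not covered by any confident prediction are collected as hard instances for the next stage. The sketch below only illustrates that matching step with center distances and assumed thresholds; it is not the FocalFormer3D implementation.

```python
import numpy as np

def unmatched_ground_truth(gt_centers: np.ndarray,
                           det_centers: np.ndarray,
                           det_scores: np.ndarray,
                           score_thr: float = 0.3,
                           match_radius: float = 2.0) -> np.ndarray:
    """Return indices of ground-truth boxes not covered by any confident detection.

    Hypothetical simplification of hard-instance probing: these unmatched objects
    (potential false negatives) would be emphasized by the next detection stage.
    Both thresholds are illustrative assumptions.
    """
    confident = det_centers[det_scores >= score_thr]
    hard = []
    for i, gt in enumerate(gt_centers):
        if len(confident) == 0 or np.linalg.norm(confident - gt, axis=1).min() > match_radius:
            hard.append(i)
    return np.array(hard, dtype=int)

if __name__ == "__main__":
    gt = np.array([[0.0, 0.0, 0.0], [20.0, 5.0, 0.0]])
    dets = np.array([[0.5, 0.2, 0.0]])
    scores = np.array([0.9])
    print(unmatched_ground_truth(gt, dets, scores))   # [1] -> the missed object
```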
arXiv Detail & Related papers (2023-08-08T20:06:12Z)
- A Comprehensive Study of the Robustness for LiDAR-based 3D Object Detectors against Adversarial Attacks [84.10546708708554]
3D object detectors are increasingly crucial for security-critical tasks.
It is imperative to understand their robustness against adversarial attacks.
This paper presents the first comprehensive evaluation and analysis of the robustness of LiDAR-based 3D detectors under adversarial attacks.
arXiv Detail & Related papers (2022-12-20T13:09:58Z)
- Spatial-Temporal Anomaly Detection for Sensor Attacks in Autonomous Vehicles [1.7188280334580195]
Time-of-flight (ToF) distance measurement devices are vulnerable to spoofing, triggering and false data injection attacks.
We propose a spatial-temporal anomaly detection model, STAnDS, which incorporates a residual error spatial detector with a time-based expected change detection.
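The two named ingredients, a residual-error spatial check and an expected-change temporal check, can be sketched generically on a grid of range measurements. The code below is an illustrative approximation with assumed thresholds, not the STAnDS model.

```python
import numpy as np

# Illustrative approximation of a spatial-temporal anomaly check on a range image
# (not the STAnDS model). Thresholds and the smoothing kernel are assumptions.

def spatial_residual_anomalies(depth: np.ndarray, thr: float = 1.5) -> np.ndarray:
    """Flag pixels whose depth deviates strongly from a local spatial estimate."""
    padded = np.pad(depth, 1, mode="edge")        # 3x3 mean filter as a crude expectation
    local = np.zeros_like(depth)
    for dy in (-1, 0, 1):
        for dx in (-1, 0, 1):
            local += padded[1 + dy : 1 + dy + depth.shape[0],
                            1 + dx : 1 + dx + depth.shape[1]]
    local /= 9.0
    return np.abs(depth - local) > thr

def temporal_change_anomalies(depth_t: np.ndarray, depth_prev: np.ndarray,
                              max_change: float = 3.0) -> np.ndarray:
    """Flag pixels that changed more between frames than physically expected."""
    return np.abs(depth_t - depth_prev) > max_change

def suspicious(depth_t, depth_prev):
    """Combine both checks: a pixel is suspicious if it fails spatially and temporally."""
    return spatial_residual_anomalies(depth_t) & temporal_change_anomalies(depth_t, depth_prev)
```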
arXiv Detail & Related papers (2022-12-15T12:21:27Z)
- Detecting and Identifying Optical Signal Attacks on Autonomous Driving Systems [25.32946739108013]
We propose a framework to detect and identify sensors that are under attack.
Specifically, we first develop a new technique to detect attacks on a system that consists of three sensors.
In our study, we use real data sets and the state-of-the-art machine learning model to evaluate our attack detection scheme.
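With three redundant sensors, pairwise consistency checks can both detect an attack and point to the affected sensor: the reading that disagrees with the other two is suspect. The sketch below illustrates that voting idea with an assumed tolerance; it is a simplification, not the paper's learned detection technique.

```python
def identify_attacked_sensor(r1: float, r2: float, r3: float, tol: float = 0.5):
    """Majority-vote sketch over three redundant range readings (meters).

    Returns the index (0, 1, 2) of the sensor that disagrees with the other two,
    or None if all readings are mutually consistent. `tol` is an assumed tolerance.
    """
    readings = [r1, r2, r3]
    agree = lambda a, b: abs(a - b) <= tol
    for i in range(3):
        others = [readings[j] for j in range(3) if j != i]
        if agree(*others) and not all(agree(readings[i], o) for o in others):
            return i  # this sensor is inconsistent with the agreeing pair
    return None

print(identify_attacked_sensor(10.1, 10.0, 4.2))  # -> 2 (the outlier sensor)
```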
arXiv Detail & Related papers (2021-10-20T12:21:04Z)
- Fooling LiDAR Perception via Adversarial Trajectory Perturbation [13.337443990751495]
LiDAR point clouds collected from a moving vehicle are functions of its trajectories, because the sensor motion needs to be compensated to avoid distortions.
Could the motion compensation consequently become a wide-open backdoor in those networks, due to both the adversarial vulnerability of deep learning and GPS-based vehicle trajectory estimation?
We demonstrate such possibilities for the first time: instead of directly attacking point cloud coordinates which requires tampering with the raw LiDAR readings, only adversarial spoofing of a self-driving car's trajectory with small perturbations is enough.
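The dependence the attack exploits can be seen directly in the motion-compensation step: points are mapped into a common frame using the estimated vehicle pose, so a small adversarial pose error shifts every compensated point. The 2D toy example below illustrates this sensitivity; it is not the paper's attack implementation.

```python
import numpy as np

def compensate(points_xy: np.ndarray, yaw: float, translation: np.ndarray) -> np.ndarray:
    """Transform sensor-frame points into a common frame using the vehicle pose."""
    c, s = np.cos(yaw), np.sin(yaw)
    rot = np.array([[c, -s], [s, c]])
    return points_xy @ rot.T + translation

if __name__ == "__main__":
    pts = np.array([[10.0, 0.0], [10.0, 2.0]])        # raw LiDAR returns (sensor frame)
    true_pose = (0.00, np.array([0.0, 0.0]))
    spoofed_pose = (0.02, np.array([0.15, -0.10]))    # small GPS/trajectory perturbation

    clean = compensate(pts, *true_pose)
    attacked = compensate(pts, *spoofed_pose)
    # A ~1 degree yaw error alone already displaces a point 10 m away by ~0.2 m.
    print(np.linalg.norm(attacked - clean, axis=1))
```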
arXiv Detail & Related papers (2021-03-29T04:34:31Z)
- Exploiting Playbacks in Unsupervised Domain Adaptation for 3D Object Detection [55.12894776039135]
State-of-the-art 3D object detectors, based on deep learning, have shown promising accuracy but are prone to over-fit to domain idiosyncrasies.
We propose a novel learning approach that drastically reduces this gap by fine-tuning the detector on pseudo-labels in the target domain.
We show, on five autonomous driving datasets, that fine-tuning the detector on these pseudo-labels substantially reduces the domain gap to new driving environments.
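The generic self-training recipe behind pseudo-label fine-tuning is: run the source-trained detector on unlabeled target-domain scans, keep only high-confidence detections as labels, and fine-tune on them. The loop below is a schematic sketch with a placeholder detector API (the predict/train_step names are assumed), not the paper's playback-based procedure.

```python
def pseudo_label_finetune(detector, target_scans, confidence_thr=0.7, epochs=3):
    """Schematic self-training loop (placeholder detector API, not the paper's code).

    `detector.predict(scan)` is assumed to return a list of (box, score) pairs and
    `detector.train_step(scan, boxes)` to perform one supervised update.
    """
    for _ in range(epochs):
        # 1) Generate pseudo-labels on the unlabeled target domain.
        pseudo = []
        for scan in target_scans:
            boxes = [box for box, score in detector.predict(scan) if score >= confidence_thr]
            if boxes:
                pseudo.append((scan, boxes))
        # 2) Fine-tune the detector on its own confident predictions.
        for scan, boxes in pseudo:
            detector.train_step(scan, boxes)
    return detector
```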
arXiv Detail & Related papers (2021-03-26T01:18:11Z)
- Object Removal Attacks on LiDAR-based 3D Object Detectors [6.263478017242508]
Object Removal Attacks (ORAs) aim to force 3D object detectors to fail.
We leverage the default setting of LiDARs that record a single return signal per direction to perturb point clouds in the region of interest.
Our results show that the attack is effective in degrading the performance of commonly used 3D object detection models.
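Because a typical LiDAR configuration records a single return per direction, replacing the range of returns inside a region of interest effectively deletes the points that would have hit the object there. The snippet below is a geometric toy illustration of that idea, not the attack code from the paper; the pushed-back range is an assumption.

```python
import numpy as np

def remove_object_returns(points: np.ndarray, roi_min: np.ndarray, roi_max: np.ndarray,
                          pushed_range: float = 80.0) -> np.ndarray:
    """Toy illustration of an object-removal perturbation (not the paper's code).

    Points falling inside the axis-aligned region of interest are pushed along their
    viewing ray to a farther range, as if the single return per direction came from
    the background instead of the object. `pushed_range` is an assumption.
    """
    out = points.copy()
    inside = np.all((points >= roi_min) & (points <= roi_max), axis=1)
    rays = points[inside]
    dists = np.linalg.norm(rays, axis=1, keepdims=True)
    out[inside] = rays / dists * pushed_range   # same direction, spoofed (farther) range
    return out

if __name__ == "__main__":
    cloud = np.array([[12.0, 1.0, 0.5], [40.0, -3.0, 1.0]])
    roi_min, roi_max = np.array([10.0, -2.0, 0.0]), np.array([14.0, 2.0, 2.0])
    print(remove_object_returns(cloud, roi_min, roi_max))
```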
arXiv Detail & Related papers (2021-02-07T05:34:14Z)
- Physically Realizable Adversarial Examples for LiDAR Object Detection [72.0017682322147]
We present a method to generate universal 3D adversarial objects to fool LiDAR detectors.
In particular, we demonstrate that placing an adversarial object on the rooftop of any target vehicle can hide the vehicle entirely from LiDAR detectors with a success rate of 80%.
This is one step closer towards safer self-driving under unseen conditions from limited training data.
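Conceptually, a universal adversarial object is optimized so that, when the same point set is placed at the target location across many scenes, the detector's confidence for the hidden vehicle drops. The loop below sketches that optimization with a placeholder differentiable detector; the mesh parameterization and physical constraints of the original method are omitted.

```python
import torch

def optimize_universal_object(detector, scenes, rooftop_offsets, steps=100, lr=0.01,
                              num_points=256):
    """Sketch of optimizing a shared adversarial point set (placeholder detector API).

    `detector(points)` is assumed to be differentiable and to return the detection
    confidence of the target vehicle. The mesh parameterization and physical
    constraints of the original paper are omitted.
    """
    adv_points = torch.randn(num_points, 3, requires_grad=True)   # shared adversarial shape
    opt = torch.optim.Adam([adv_points], lr=lr)
    for _ in range(steps):
        opt.zero_grad()
        loss = 0.0
        for scene, offset in zip(scenes, rooftop_offsets):
            # Place the same adversarial object on the target vehicle's rooftop.
            placed = adv_points + offset
            augmented = torch.cat([scene, placed], dim=0)
            loss = loss + detector(augmented)      # minimize the target's detection score
        loss.backward()
        opt.step()
    return adv_points.detach()
```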
arXiv Detail & Related papers (2020-04-01T16:11:04Z)