Object Removal Attacks on LiDAR-based 3D Object Detectors
- URL: http://arxiv.org/abs/2102.03722v1
- Date: Sun, 7 Feb 2021 05:34:14 GMT
- Title: Object Removal Attacks on LiDAR-based 3D Object Detectors
- Authors: Zhongyuan Hau, Kenneth T. Co, Soteris Demetriou, Emil C. Lupu
- Abstract summary: Object Removal Attacks (ORAs) aim to force 3D object detectors to fail.
We leverage the default setting of LiDARs that record a single return signal per direction to perturb point clouds in the region of interest.
Our results show that the attack is effective in degrading the performance of commonly used 3D object detection models.
- Score: 6.263478017242508
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: LiDARs play a critical role in Autonomous Vehicles' (AVs) perception and
their safe operations. Recent works have demonstrated that it is possible to
spoof LiDAR return signals to elicit fake objects. In this work we demonstrate
how the same physical capabilities can be used to mount a new, even more
dangerous class of attacks, namely Object Removal Attacks (ORAs). ORAs aim to
force 3D object detectors to fail. We leverage the default setting of LiDARs
that record a single return signal per direction to perturb point clouds in the
region of interest (RoI) of 3D objects. By injecting illegitimate points behind
the target object, we effectively shift points away from the target objects'
RoIs. Our initial results using a simple random point selection strategy show
that the attack is effective in degrading the performance of commonly used 3D
object detection models.
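The attack mechanism described in the abstract can be illustrated with a small simulation. This is a minimal sketch, not the authors' implementation: the function name `simulate_ora`, the `spoof_distance` parameter, and the axis-aligned RoI box are illustrative assumptions. It models the single-return behavior by pushing a randomly selected fraction of RoI points farther along their rays, as if a spoofed return behind the object had won the single-return race.

```python
import math
import random

def simulate_ora(points, roi, spoof_distance=5.0, fraction=0.5, seed=0):
    """Simulate an Object Removal Attack (ORA) on a LiDAR point cloud.

    LiDARs in their default setting record a single return per direction.
    Injecting a return *behind* the target makes the sensor report a point
    farther along the same ray, shifting returns out of the target's
    region of interest (RoI).

    points   : list of (x, y, z) returns, sensor at the origin.
    roi      : axis-aligned box (xmin, xmax, ymin, ymax, zmin, zmax).
    fraction : fraction of RoI points perturbed (random selection strategy).
    """
    rng = random.Random(seed)
    xmin, xmax, ymin, ymax, zmin, zmax = roi

    def in_roi(p):
        x, y, z = p
        return xmin <= x <= xmax and ymin <= y <= ymax and zmin <= z <= zmax

    attacked = []
    for p in points:
        if in_roi(p) and rng.random() < fraction:
            # Push the point farther along its ray from the sensor: the
            # spoofed return behind the object replaces the genuine one.
            r = math.sqrt(p[0] ** 2 + p[1] ** 2 + p[2] ** 2)
            scale = (r + spoof_distance) / r
            attacked.append((p[0] * scale, p[1] * scale, p[2] * scale))
        else:
            attacked.append(p)
    return attacked
```

With `fraction=1.0`, every return inside the RoI is displaced behind the box, leaving the detector with no points on the target object, which is the failure mode the paper exploits.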
Related papers
- Transient Adversarial 3D Projection Attacks on Object Detection in Autonomous Driving [15.516055760190884]
We introduce an adversarial 3D projection attack specifically targeting object detection in autonomous driving scenarios.
Our results demonstrate the effectiveness of the proposed attack in deceiving YOLOv3 and Mask R-CNN in physical settings.
arXiv Detail & Related papers (2024-09-25T22:27:11Z) - Revisiting Out-of-Distribution Detection in LiDAR-based 3D Object Detection [12.633311483061647]
Out-of-distribution (OOD) objects can lead to misclassifications, posing a significant risk to the safety and reliability of automated vehicles.
We propose a new evaluation protocol that allows the use of existing datasets without modifying the point cloud.
The effectiveness of our method is validated through experiments on the newly proposed nuScenes OOD benchmark.
arXiv Detail & Related papers (2024-04-24T13:48:38Z) - A Comprehensive Study of the Robustness for LiDAR-based 3D Object Detectors against Adversarial Attacks [84.10546708708554]
3D object detectors are increasingly crucial for security-critical tasks.
It is imperative to understand their robustness against adversarial attacks.
This paper presents the first comprehensive evaluation and analysis of the robustness of LiDAR-based 3D detectors under adversarial attacks.
arXiv Detail & Related papers (2022-12-20T13:09:58Z) - Object-fabrication Targeted Attack for Object Detection [54.10697546734503]
Adversarial attacks on object detection comprise targeted and untargeted attacks.
A new object-fabrication targeted attack mode can mislead detectors to fabricate extra false objects with specific target labels.
arXiv Detail & Related papers (2022-12-13T08:42:39Z) - Using 3D Shadows to Detect Object Hiding Attacks on Autonomous Vehicle Perception [6.371941066890801]
We leverage 3D shadows to locate obstacles that are hidden from object detectors.
Our proposed methodology can be used to detect objects that have been hidden by an adversary, since these objects still cast 3D shadows.
We show that using 3D shadows for obstacle detection can achieve high accuracy in matching shadows to their objects.
arXiv Detail & Related papers (2022-04-29T09:49:29Z) - Object Manipulation via Visual Target Localization [64.05939029132394]
Training agents to manipulate objects poses many challenges.
We propose an approach that explores the environment in search for target objects, computes their 3D coordinates once they are located, and then continues to estimate their 3D locations even when the objects are not visible.
Our evaluations show a 3x improvement in success rate over a model with access to the same sensory suite.
arXiv Detail & Related papers (2022-03-15T17:59:01Z) - Embracing Single Stride 3D Object Detector with Sparse Transformer [63.179720817019096]
In LiDAR-based 3D object detection for autonomous driving, the ratio of the object size to input scene size is significantly smaller compared to 2D detection cases.
Many 3D detectors directly follow the common practice of 2D detectors, which downsample the feature maps even after quantizing the point clouds.
We propose Single-stride Sparse Transformer (SST) to maintain the original resolution from the beginning to the end of the network.
arXiv Detail & Related papers (2021-12-13T02:12:02Z) - Temporal Consistency Checks to Detect LiDAR Spoofing Attacks on Autonomous Vehicle Perception [4.092959254671909]
Recent work has demonstrated serious LiDAR spoofing attacks with alarming consequences.
In this work, we explore the use of motion as a physical invariant of genuine objects for detecting such attacks.
Preliminary design and implementation of a 3D-TC2 prototype demonstrates very promising performance.
arXiv Detail & Related papers (2021-06-15T01:36:40Z) - Detecting Invisible People [58.49425715635312]
We re-purpose tracking benchmarks and propose new metrics for the task of detecting invisible objects.
We demonstrate that current detection and tracking systems perform dramatically worse on this task.
Second, we build dynamic models that explicitly reason in 3D, making use of observations produced by state-of-the-art monocular depth estimation networks.
arXiv Detail & Related papers (2020-12-15T16:54:45Z) - Physically Realizable Adversarial Examples for LiDAR Object Detection [72.0017682322147]
We present a method to generate universal 3D adversarial objects to fool LiDAR detectors.
In particular, we demonstrate that placing an adversarial object on the rooftop of any target vehicle hides the vehicle entirely from LiDAR detectors with a success rate of 80%.
This is one step closer towards safer self-driving under unseen conditions from limited training data.
arXiv Detail & Related papers (2020-04-01T16:11:04Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences.