The Translucent Patch: A Physical and Universal Attack on Object Detectors
- URL: http://arxiv.org/abs/2012.12528v1
- Date: Wed, 23 Dec 2020 07:47:13 GMT
- Title: The Translucent Patch: A Physical and Universal Attack on Object Detectors
- Authors: Alon Zolfi and Moshe Kravchik and Yuval Elovici and Asaf Shabtai
- Abstract summary: We propose a contactless physical patch to fool state-of-the-art object detectors.
The primary goal of our patch is to hide all instances of a selected target class.
We show that our patch was able to prevent the detection of 42.27% of all stop sign instances.
- Score: 48.31712758860241
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Physical adversarial attacks against object detectors have seen increasing
success in recent years. However, these attacks require direct access to the
object of interest in order to apply a physical patch. Furthermore, to hide
multiple objects, an adversarial patch must be applied to each object. In this
paper, we propose a contactless translucent physical patch containing a
carefully constructed pattern, which is placed on the camera's lens, to fool
state-of-the-art object detectors. The primary goal of our patch is to hide all
instances of a selected target class. In addition, the optimization method used
to construct the patch aims to ensure that the detection of other (untargeted)
classes remains unharmed. Therefore, in our experiments, which are conducted on
state-of-the-art object detection models used in autonomous driving, we study
the effect of the patch on the detection of both the selected target class and
the other classes. We show that our patch was able to prevent the detection of
42.27% of all stop sign instances while maintaining high (nearly 80%) detection
of the other classes.
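The abstract describes optimizing a single lens-mounted translucent pattern under a two-term objective: suppress detections of the target class while keeping other classes detectable. As a rough illustration only, the PyTorch sketch below shows one way such an objective could be set up; the dummy detector, alpha-blending model, loss weights, and class index are all assumptions for the sake of a runnable example, not details from the paper.

```python
# Illustrative sketch only: the detector, blending model, loss weighting, and
# class index below are assumptions, not the paper's actual implementation.
import torch

class DummyDetector(torch.nn.Module):
    """Stand-in for a real object detector that returns per-detection class scores."""
    def __init__(self, num_classes=80):
        super().__init__()
        self.head = torch.nn.Conv2d(3, num_classes, kernel_size=8, stride=8)

    def forward(self, x):
        out = self.head(x)  # (N, K, H', W') grid of class logits
        return out.flatten(2).transpose(1, 2).reshape(-1, out.shape[1]).sigmoid()

def apply_translucent_patch(images, patch, alpha=0.4):
    """Assumed blending model: alpha-composite the lens pattern over the scene."""
    return (1 - alpha) * images + alpha * patch

def patch_loss(scores, target_class, lam=1.0):
    """Two-term objective: suppress the target class, preserve the others."""
    other = [c for c in range(scores.shape[1]) if c != target_class]
    return scores[:, target_class].mean() - lam * scores[:, other].mean()

detector = DummyDetector()
images = torch.rand(4, 3, 256, 256)                     # placeholder scene batch
patch = torch.rand(1, 3, 256, 256, requires_grad=True)  # translucent pattern
opt = torch.optim.Adam([patch], lr=1e-2)

for _ in range(100):
    scores = detector(apply_translucent_patch(images, patch))
    loss = patch_loss(scores, target_class=11)          # e.g. a stop-sign index
    opt.zero_grad()
    loss.backward()
    opt.step()
    with torch.no_grad():
        patch.clamp_(0.0, 1.0)                          # keep the pattern a valid image
```

Because the same patch sits on the lens for every frame, the optimization is universal by construction: one pattern must suppress the target class across the whole image batch rather than per object.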
Related papers
- Attacking Object Detector Using A Universal Targeted Label-Switch Patch [44.44676276867374]
Adversarial attacks against deep learning-based object detectors (ODs) have been studied extensively in the past few years.
However, no prior research has proposed a misclassification attack on ODs in which the patch is applied to the target object itself.
We propose a novel, universal, targeted, label-switch attack against the state-of-the-art object detector, YOLO.
arXiv Detail & Related papers (2022-11-16T12:08:58Z)
- ObjectSeeker: Certifiably Robust Object Detection against Patch Hiding Attacks via Patch-agnostic Masking [95.6347501381882]
Object detectors are found to be vulnerable to physical-world patch hiding attacks.
We propose ObjectSeeker as a framework for building certifiably robust object detectors.
arXiv Detail & Related papers (2022-02-03T19:34:25Z)
- Segment and Complete: Defending Object Detectors against Adversarial Patch Attacks with Robust Patch Detection [142.24869736769432]
Adversarial patch attacks pose a serious threat to state-of-the-art object detectors.
We propose Segment and Complete defense (SAC), a framework for defending object detectors against patch attacks.
We show SAC can significantly reduce the targeted attack success rate of physical patch attacks.
arXiv Detail & Related papers (2021-12-08T19:18:48Z)
- Context-Aware Transfer Attacks for Object Detection [51.65308857232767]
We present a new approach to generate context-aware attacks for object detectors.
We show that by using co-occurrence of objects and their relative locations and sizes as context information, we can successfully generate targeted mis-categorization attacks.
arXiv Detail & Related papers (2021-12-06T18:26:39Z)
- Dynamic Adversarial Patch for Evading Object Detection Models [47.32228513808444]
We present an innovative attack method against object detectors applied in a real-world setup.
Our method uses dynamic adversarial patches which are placed at multiple predetermined locations on a target object.
We improved the attack by generating patches that consider the semantic distance between the target object and its classification.
arXiv Detail & Related papers (2020-10-25T08:55:40Z)
- Adversarial Patch Camouflage against Aerial Detection [2.3268622345249796]
Detection of military assets on the ground can be performed by applying deep learning-based object detectors on drone surveillance footage.
In this work, we apply patch-based adversarial attacks for the use case of unmanned aerial surveillance.
Our results show that adversarial patch attacks form a realistic alternative to traditional camouflage activities.
arXiv Detail & Related papers (2020-08-31T15:21:50Z)