Adversarial Camera Patch: An Effective and Robust Physical-World Attack
on Object Detectors
- URL: http://arxiv.org/abs/2312.06163v1
- Date: Mon, 11 Dec 2023 06:56:50 GMT
- Authors: Kalibinuer Tiliwalidi
- Abstract summary: Researchers are exploring patch-based physical attacks, yet traditional approaches, while effective, often result in conspicuous patches covering target objects.
Recent camera-based physical attacks have emerged, leveraging camera patches to execute stealthy attacks.
To address the complexity of deploying multiple patches on the camera lens, we propose an Adversarial Camera Patch (ADCP).
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Nowadays, the susceptibility of deep neural networks (DNNs) to
adversarial attacks has garnered significant attention. Researchers are
exploring patch-based physical attacks,
yet traditional approaches, while effective, often result in conspicuous
patches covering target objects. This leads to easy detection by human
observers. Recently, novel camera-based physical attacks have emerged,
leveraging camera patches to execute stealthy attacks. These methods circumvent
target object modifications by introducing perturbations directly to the camera
lens, achieving a notable breakthrough in stealthiness. However, prevailing
camera-based strategies necessitate the deployment of multiple patches on the
camera lens, which introduces complexity. To address this issue, we propose an
Adversarial Camera Patch (ADCP).
Related papers
- CBA: Contextual Background Attack against Optical Aerial Detection in
the Physical World [8.826711009649133]
Patch-based physical attacks have increasingly aroused concerns.
Most existing methods focus on obscuring targets captured on the ground, and some of these methods are simply extended to deceive aerial detectors.
We propose Contextual Background Attack (CBA), a novel physical attack framework against aerial detection, which can achieve strong attack efficacy and transferability in the physical world even without smudging the interested objects at all.
arXiv Detail & Related papers (2023-02-27T05:10:27Z)
- Threatening Patch Attacks on Object Detection in Optical Remote Sensing
Images [55.09446477517365]
Advanced Patch Attacks (PAs) on object detection in natural images have pointed out the great safety vulnerability in methods based on deep neural networks.
We propose a more threatening PA that does not sacrifice visual quality, dubbed TPA.
To the best of our knowledge, this is the first attempt to study PAs on object detection in O-RSIs, and we hope this work draws readers' interest to the topic.
arXiv Detail & Related papers (2023-02-13T02:35:49Z)
- Empirical Evaluation of Physical Adversarial Patch Attacks Against
Overhead Object Detection Models [2.2588953434934416]
Adversarial patches are images designed to fool otherwise well-performing neural network-based computer vision models.
Recent work has demonstrated that these attacks can successfully transfer to the physical world.
We further test the efficacy of adversarial patch attacks in the physical world under more challenging conditions.
arXiv Detail & Related papers (2022-06-25T20:05:11Z)
- Developing Imperceptible Adversarial Patches to Camouflage Military
Assets From Computer Vision Enabled Technologies [0.0]
Convolutional neural networks (CNNs) have demonstrated rapid progress and a high level of success in object detection.
Recent evidence has highlighted their vulnerability to adversarial attacks.
We present a unique method that produces imperceptible patches capable of camouflaging large military assets.
arXiv Detail & Related papers (2022-02-17T20:31:51Z)
- ObjectSeeker: Certifiably Robust Object Detection against Patch Hiding
Attacks via Patch-agnostic Masking [95.6347501381882]
Object detectors are found to be vulnerable to physical-world patch hiding attacks.
We propose ObjectSeeker as a framework for building certifiably robust object detectors.
arXiv Detail & Related papers (2022-02-03T19:34:25Z)
- Segment and Complete: Defending Object Detectors against Adversarial
Patch Attacks with Robust Patch Detection [142.24869736769432]
Adversarial patch attacks pose a serious threat to state-of-the-art object detectors.
We propose Segment and Complete defense (SAC), a framework for defending object detectors against patch attacks.
We show SAC can significantly reduce the targeted attack success rate of physical patch attacks.
arXiv Detail & Related papers (2021-12-08T19:18:48Z)
- The Translucent Patch: A Physical and Universal Attack on Object
Detectors [48.31712758860241]
We propose a contactless physical patch to fool state-of-the-art object detectors.
The primary goal of our patch is to hide all instances of a selected target class.
We show that our patch was able to prevent the detection of 42.27% of all stop sign instances.
arXiv Detail & Related papers (2020-12-23T07:47:13Z)
- Dynamic Adversarial Patch for Evading Object Detection Models [47.32228513808444]
We present an innovative attack method against object detectors applied in a real-world setup.
Our method uses dynamic adversarial patches which are placed at multiple predetermined locations on a target object.
We improved the attack by generating patches that consider the semantic distance between the target object and its classification.
arXiv Detail & Related papers (2020-10-25T08:55:40Z)
- DPAttack: Diffused Patch Attacks against Universal Object Detection [66.026630370248]
Adversarial attacks against object detection can be divided into two categories, whole-pixel attacks and patch attacks.
We propose a diffused patch attack (DPAttack) to fool object detectors with diffused patches of asteroid or grid shape.
Experiments show that our DPAttack can successfully fool most object detectors with diffused patches.
arXiv Detail & Related papers (2020-10-16T04:48:24Z)
- Adversarial Patch Camouflage against Aerial Detection [2.3268622345249796]
Detection of military assets on the ground can be performed by applying deep learning-based object detectors on drone surveillance footage.
In this work, we apply patch-based adversarial attacks for the use case of unmanned aerial surveillance.
Our results show that adversarial patch attacks form a realistic alternative to traditional camouflage activities.
arXiv Detail & Related papers (2020-08-31T15:21:50Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.