Adversarial Patch Camouflage against Aerial Detection
- URL: http://arxiv.org/abs/2008.13671v1
- Date: Mon, 31 Aug 2020 15:21:50 GMT
- Title: Adversarial Patch Camouflage against Aerial Detection
- Authors: Ajaya Adhikari, Richard den Hollander, Ioannis Tolios, Michael van
Bekkum, Anneloes Bal, Stijn Hendriks, Maarten Kruithof, Dennis Gross, Nils
Jansen, Guillermo Pérez, Kit Buurman, Stephan Raaijmakers
- Abstract summary: Detection of military assets on the ground can be performed by applying deep learning-based object detectors on drone surveillance footage.
In this work, we apply patch-based adversarial attacks for the use case of unmanned aerial surveillance.
Our results show that adversarial patch attacks form a realistic alternative to traditional camouflage activities.
- Score: 2.3268622345249796
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Detection of military assets on the ground can be performed by applying deep
learning-based object detectors on drone surveillance footage. The traditional
way of hiding military assets from sight is camouflage, for example by using
camouflage nets. However, large assets like planes or vessels are difficult to
conceal by means of traditional camouflage nets. An alternative type of
camouflage is the direct misleading of automatic object detectors. Recently, it
has been observed that small adversarial changes applied to images of the
object can produce erroneous output by deep learning-based detectors. In
particular, adversarial attacks have been successfully demonstrated to prevent
person detection in images by means of a patch with a specific pattern held up
in front of the person, thereby essentially camouflaging the person from the
detector. Research into this type of patch attack is still limited, and several
questions related to the optimal patch configuration remain open.
This work makes two contributions. First, we apply patch-based adversarial
attacks for the use case of unmanned aerial surveillance, where the patch is
laid on top of large military assets, camouflaging them from automatic
detectors running over the imagery. The patch can prevent automatic detection
of the whole object while only covering a small part of it. Second, we perform
several experiments with different patch configurations, varying their size,
position, number and saliency. Our results show that adversarial patch attacks
form a realistic alternative to traditional camouflage activities, and should
therefore be considered in the automated analysis of aerial surveillance
imagery.
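The mechanics of such a patch attack can be illustrated with a short, hedged sketch. The snippet below assumes a YOLO-style detector exposing per-box objectness scores; the detector interface, data loader, patch placement, and hyperparameters are illustrative assumptions, not the authors' exact setup.

```python
# Minimal sketch of optimizing an adversarial patch that suppresses
# detections. `detector` and `loader` are hypothetical stand-ins.
import torch

def apply_patch(images, patch, x, y):
    """Paste a square patch onto a batch of images at pixel (x, y)."""
    patched = images.clone()
    ph, pw = patch.shape[-2:]
    patched[..., y:y + ph, x:x + pw] = patch
    return patched

def train_patch(detector, loader, patch_size=64, steps=1000, lr=0.03):
    # Start from random noise; keep pixel values in [0, 1] so the
    # patch remains printable.
    patch = torch.rand(3, patch_size, patch_size, requires_grad=True)
    optimizer = torch.optim.Adam([patch], lr=lr)
    for _, (images, _labels) in zip(range(steps), loader):
        patched = apply_patch(images, patch, x=96, y=96)
        objectness = detector(patched)  # assumed shape: (batch, num_boxes)
        # Minimize the strongest remaining detection response per image.
        loss = objectness.max(dim=1).values.mean()
        optimizer.zero_grad()
        loss.backward()
        optimizer.step()
        patch.data.clamp_(0.0, 1.0)
    return patch.detach()
```

Varying `patch_size`, the paste position, and the number of pasted copies corresponds to the size, position, and number experiments described above.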
Related papers
- Adversarial Camera Patch: An Effective and Robust Physical-World Attack on Object Detectors [0.0]
Researchers are exploring patch-based physical attacks, yet traditional approaches, while effective, often result in conspicuous patches covering target objects.
Recent camera-based physical attacks have emerged, leveraging camera patches to execute stealthy attacks.
We propose an Adversarial Camera Patch (ADCP) to address this issue.
arXiv Detail & Related papers (2023-12-11T06:56:50Z)
- CamoFormer: Masked Separable Attention for Camouflaged Object Detection [94.2870722866853]
We present a simple masked separable attention (MSA) for camouflaged object detection.
We first separate the multi-head self-attention into three parts, which are responsible for distinguishing the camouflaged objects from the background using different mask strategies.
We propose to capture high-resolution semantic representations progressively based on a simple top-down decoder with the proposed MSA to attain precise segmentation results.
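As a rough illustration of the masked-attention split described above (the head grouping and mask choices below are assumptions for illustration, not CamoFormer's exact design):

```python
# Sketch: split attention heads into groups, each attending under a
# different additive mask (e.g., foreground-, background-, and
# unmasked-oriented groups).
import torch

def masked_separable_attention(q, k, v, masks):
    # q, k, v: (batch, heads, tokens, dim); masks: one additive
    # attention mask per head group, broadcastable to (tokens, tokens).
    heads, dim = q.shape[1], q.shape[-1]
    group = heads // len(masks)
    outputs = []
    for i, mask in enumerate(masks):
        s = slice(i * group, (i + 1) * group)
        scores = q[:, s] @ k[:, s].transpose(-2, -1) / dim ** 0.5
        attn = torch.softmax(scores + mask, dim=-1)
        outputs.append(attn @ v[:, s])
    return torch.cat(outputs, dim=1)
```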
arXiv Detail & Related papers (2022-12-10T10:03:27Z)
- The Weaknesses of Adversarial Camouflage in Overhead Imagery [7.724233098666892]
We build a library of 24 adversarial patches to disguise four different object classes: bus, car, truck, van.
We show that while adversarial patches may fool object detectors, the presence of such patches is often easily uncovered.
This raises the question of whether such patches truly constitute camouflage.
arXiv Detail & Related papers (2022-07-06T20:39:21Z)
- Towards Deeper Understanding of Camouflaged Object Detection [64.81987999832032]
We argue that the binary segmentation setting fails to fully capture the concept of camouflage.
We present the first triple-task learning framework to simultaneously localize, segment and rank camouflaged objects.
arXiv Detail & Related papers (2022-05-23T14:26:18Z)
- Developing Imperceptible Adversarial Patches to Camouflage Military Assets From Computer Vision Enabled Technologies [0.0]
Convolutional neural networks (CNNs) have demonstrated rapid progress and a high level of success in object detection.
Recent evidence has highlighted their vulnerability to adversarial attacks.
We present a unique method that produces imperceptible patches capable of camouflaging large military assets.
arXiv Detail & Related papers (2022-02-17T20:31:51Z)
- ObjectSeeker: Certifiably Robust Object Detection against Patch Hiding Attacks via Patch-agnostic Masking [95.6347501381882]
Object detectors are found to be vulnerable to physical-world patch hiding attacks.
We propose ObjectSeeker as a framework for building certifiably robust object detectors.
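As a rough sketch of the patch-agnostic masking idea (the strip layout and detector interface are assumptions, not ObjectSeeker's exact procedure):

```python
# Sketch: run the detector on copies of the image with one strip masked
# out at a time, so at least one copy fully removes any placed patch.
# `detector` is a hypothetical callable returning a list of (box, score).
import torch

def detections_under_masking(detector, image, num_strips=4):
    _c, height, width = image.shape
    candidates = []
    for i in range(num_strips):
        for axis, size in ((1, height), (2, width)):
            masked = image.clone()
            lo, hi = i * size // num_strips, (i + 1) * size // num_strips
            if axis == 1:
                masked[:, lo:hi, :] = 0.0  # mask a horizontal strip
            else:
                masked[:, :, lo:hi] = 0.0  # mask a vertical strip
            candidates.extend(detector(masked.unsqueeze(0)))
    # ObjectSeeker further prunes and certifies these candidates; here we
    # simply return the union of detections across the masked copies.
    return candidates
```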
arXiv Detail & Related papers (2022-02-03T19:34:25Z)
- Segment and Complete: Defending Object Detectors against Adversarial Patch Attacks with Robust Patch Detection [142.24869736769432]
Adversarial patch attacks pose a serious threat to state-of-the-art object detectors.
We propose Segment and Complete defense (SAC), a framework for defending object detectors against patch attacks.
We show SAC can significantly reduce the targeted attack success rate of physical patch attacks.
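The segment-then-remove step behind such a defense can be sketched as follows (`patch_segmenter` and `detector` are hypothetical models, and the thresholding is an assumption, not SAC's exact pipeline):

```python
# Sketch: predict a patch mask, blank out the suspected patch pixels,
# then run the detector on the cleaned image.
import torch

def defend(detector, patch_segmenter, images, threshold=0.5):
    with torch.no_grad():
        mask_logits = patch_segmenter(images)          # (batch, 1, H, W)
        patch_mask = (torch.sigmoid(mask_logits) > threshold).float()
        cleaned = images * (1.0 - patch_mask)          # remove patch pixels
    return detector(cleaned)
```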
arXiv Detail & Related papers (2021-12-08T19:18:48Z)
- You Cannot Easily Catch Me: A Low-Detectable Adversarial Patch for Object Detectors [12.946967210071032]
Adversarial patches can fool facial recognition systems, surveillance systems and self-driving cars.
Most existing adversarial patches can be detected, disabled, and rejected by an adversarial patch detector.
We present a novel approach, a Low-Detectable Adversarial Patch, which attacks an object detector with texture-consistent adversarial patches.
arXiv Detail & Related papers (2021-09-30T14:47:29Z)
- The Translucent Patch: A Physical and Universal Attack on Object Detectors [48.31712758860241]
We propose a contactless physical patch to fool state-of-the-art object detectors.
The primary goal of our patch is to hide all instances of a selected target class.
We show that our patch was able to prevent the detection of 42.27% of all stop sign instances.
arXiv Detail & Related papers (2020-12-23T07:47:13Z)
- Dynamic Adversarial Patch for Evading Object Detection Models [47.32228513808444]
We present an innovative attack method against object detectors applied in a real-world setup.
Our method uses dynamic adversarial patches which are placed at multiple predetermined locations on a target object.
We improved the attack by generating patches that consider the semantic distance between the target object and its classification.
arXiv Detail & Related papers (2020-10-25T08:55:40Z)
This list is automatically generated from the titles and abstracts of the papers on this site.