Adversarial Infrared Curves: An Attack on Infrared Pedestrian Detectors in the Physical World
- URL: http://arxiv.org/abs/2312.14217v1
- Date: Thu, 21 Dec 2023 12:21:57 GMT
- Title: Adversarial Infrared Curves: An Attack on Infrared Pedestrian Detectors in the Physical World
- Authors: Chengyin Hu, Weiwen Shi
- Abstract summary: Existing approaches, like white-box infrared attacks using bulb boards and QR suits, lack realism and stealthiness.
We propose Adversarial Infrared Curves (AdvIC) to bridge these gaps.
Our experiments confirm AdvIC's effectiveness, achieving 94.8% and 67.2% attack success rates for digital and physical attacks, respectively.
- Score: 0.0
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Deep neural network security is a persistent concern, with considerable
research on visible light physical attacks but limited exploration in the
infrared domain. Existing approaches, like white-box infrared attacks using
bulb boards and QR suits, lack realism and stealthiness. Meanwhile, black-box
methods with cold and hot patches often struggle to ensure robustness. To
bridge these gaps, we propose Adversarial Infrared Curves (AdvIC). Using
Particle Swarm Optimization, we optimize two Bezier curves and employ cold
patches in the physical realm to introduce perturbations, creating infrared
curve patterns for physical sample generation. Our extensive experiments
confirm AdvIC's effectiveness, achieving 94.8% and 67.2% attack success rates
for digital and physical attacks, respectively. Stealthiness is demonstrated
through a comparative analysis, and robustness assessments reveal AdvIC's
superiority over baseline methods. When deployed against diverse advanced
detectors, AdvIC achieves an average attack success rate of 76.8%, emphasizing
its robust nature. We explore adversarial defense strategies against AdvIC and
examine its impact under various defense mechanisms. Given AdvIC's substantial
security implications for real-world vision-based applications, urgent
attention and mitigation efforts are warranted.
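The abstract describes an optimization loop: Particle Swarm Optimization searches the control points of two Bézier curves so that the rendered infrared pattern suppresses the detector. A minimal sketch of that loop is below; the detector-confidence objective is replaced by a toy stand-in, the curve degree (quadratic here), and all PSO hyperparameters are illustrative assumptions, not the authors' implementation.

```python
import random

def bezier_point(ctrl, t):
    """Evaluate a quadratic Bezier curve at parameter t in [0, 1]."""
    (x0, y0), (x1, y1), (x2, y2) = ctrl
    u = 1.0 - t
    return (u * u * x0 + 2 * u * t * x1 + t * t * x2,
            u * u * y0 + 2 * u * t * y1 + t * t * y2)

def pso_minimize(objective, dim, n_particles=20, iters=60, seed=0):
    """Minimal particle swarm optimization over the unit box [0, 1]^dim."""
    rng = random.Random(seed)
    w, c1, c2 = 0.7, 1.5, 1.5  # inertia and cognitive/social coefficients
    pos = [[rng.random() for _ in range(dim)] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]          # each particle's best position
    pbest_val = [objective(p) for p in pos]
    gi = min(range(n_particles), key=lambda i: pbest_val[i])
    gbest, gbest_val = pbest[gi][:], pbest_val[gi]  # swarm-wide best
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = rng.random(), rng.random()
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                pos[i][d] = min(1.0, max(0.0, pos[i][d] + vel[i][d]))
            val = objective(pos[i])
            if val < pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i][:], val
                if val < gbest_val:
                    gbest, gbest_val = pos[i][:], val
    return gbest, gbest_val

def toy_objective(vec):
    """Hypothetical stand-in for the detector confidence: AdvIC would instead
    render the two curves as cold patches on the pedestrian and query the
    infrared detector (black-box)."""
    # vec encodes 2 curves x 3 control points x (x, y) = 12 numbers.
    curves = [[(vec[j + k], vec[j + k + 1]) for k in (0, 2, 4)] for j in (0, 6)]
    # Toy target: push each curve's midpoint toward the image center (0.5, 0.5).
    return sum((bezier_point(c, 0.5)[0] - 0.5) ** 2 +
               (bezier_point(c, 0.5)[1] - 0.5) ** 2 for c in curves)

best, loss = pso_minimize(toy_objective, dim=12)
```

In the physical attack described by the abstract, only the queried objective changes: the candidate curves are printed or cut as cold patches and the detector's confidence on the resulting photo drives the same update rule.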
Related papers
- Multi-View Black-Box Physical Attacks on Infrared Pedestrian Detectors Using Adversarial Infrared Grid [0.0]
Infrared object detectors are vital in modern technological applications but are susceptible to adversarial attacks, posing significant security threats.
Previous studies using physical perturbations like light bulb arrays for white-box attacks, or hot and cold patches for black-box attacks, have proven impractical or limited in multi-view support.
We propose the Adversarial Infrared Grid (AdvGrid), which models perturbations in a grid format and uses a genetic algorithm for black-box optimization.
arXiv Detail & Related papers (2024-07-01T10:38:08Z)
- Physical Backdoor: Towards Temperature-based Backdoor Attacks in the Physical World [47.76657100827679]
We introduce two novel types of backdoor attacks on thermal infrared object detection (TIOD).
Key factors influencing trigger design include temperature, size, material, and concealment.
In the digital realm, we evaluate our approach using benchmark datasets for TIOD, achieving an Attack Success Rate (ASR) of up to 98.21%.
arXiv Detail & Related papers (2024-04-30T10:03:26Z)
- Adversarial Infrared Geometry: Using Geometry to Perform Adversarial Attack against Infrared Pedestrian Detectors [0.0]
We propose a novel infrared physical attack termed Adversarial Infrared Geometry (AdvIG).
In digital attack experiments, line, triangle, and ellipse patterns achieve attack success rates of 93.1%, 86.8%, and 100.0%, respectively.
On average, the line, triangle, and ellipse achieve attack success rates of 61.1%, 61.2%, and 96.2%, respectively.
arXiv Detail & Related papers (2024-03-06T12:55:21Z)
- Adversarial Infrared Blocks: A Multi-view Black-box Attack to Thermal Infrared Detectors in Physical World [4.504479592538401]
We propose a novel physical attack called adversarial infrared blocks (AdvIB).
By optimizing the physical parameters of the adversarial infrared blocks, this method can execute a stealthy black-box attack on thermal imaging systems from various angles.
For stealthiness, the adversarial infrared block is attached to the inside of clothing.
arXiv Detail & Related papers (2023-04-21T02:53:56Z)
- Physically Adversarial Infrared Patches with Learnable Shapes and Locations [1.1172382217477126]
We propose a physically feasible infrared attack method called "adversarial infrared patches"
Considering the imaging mechanism of infrared cameras by capturing objects' thermal radiation, adversarial infrared patches conduct attacks by attaching a patch of thermal insulation materials on the target object to manipulate its thermal distribution.
We verify adversarial infrared patches in different object detection tasks with various object detectors.
arXiv Detail & Related papers (2023-03-24T09:11:36Z)
- Guidance Through Surrogate: Towards a Generic Diagnostic Attack [101.36906370355435]
We develop a guided mechanism to avoid local minima during attack optimization, leading to a novel attack dubbed Guided Projected Gradient Attack (G-PGA).
Our modified attack does not require random restarts, a large number of attack iterations, or a search for an optimal step size.
More than an effective attack, G-PGA can be used as a diagnostic tool to reveal elusive robustness due to gradient masking in adversarial defenses.
arXiv Detail & Related papers (2022-12-30T18:45:23Z)
- HOTCOLD Block: Fooling Thermal Infrared Detectors with a Novel Wearable Design [60.97064635095259]
HOTCOLD Block is a novel physical attack for infrared detectors that hides persons using the wearable Warming Paste and Cooling Paste.
By attaching these readily available temperature-controlled materials to the body, HOTCOLD Block evades human eyes efficiently.
arXiv Detail & Related papers (2022-12-12T05:23:11Z)
- Physical Adversarial Attack meets Computer Vision: A Decade Survey [57.46379460600939]
This paper presents a comprehensive overview of physical adversarial attacks.
We take the first step to systematically evaluate the performance of physical adversarial attacks.
Our proposed evaluation metric, hiPAA, comprises six perspectives.
arXiv Detail & Related papers (2022-09-30T01:59:53Z)
- Adversarial Color Projection: A Projector-based Physical Attack to DNNs [3.9477796725601872]
We propose a black-box projector-based physical attack, referred to as adversarial color projection (AdvCP).
We achieve an attack success rate of 97.60% on a subset of ImageNet, while in the physical environment, we attain an attack success rate of 100%.
When attacking advanced DNNs, experimental results show that our method can achieve more than 85% attack success rate.
arXiv Detail & Related papers (2022-09-19T12:27:32Z)
- Shadows can be Dangerous: Stealthy and Effective Physical-world Adversarial Attack by Natural Phenomenon [79.33449311057088]
We study a new type of optical adversarial examples, in which the perturbations are generated by a very common natural phenomenon, shadow.
We extensively evaluate the effectiveness of this new attack on both simulated and real-world environments.
arXiv Detail & Related papers (2022-03-08T02:40:18Z)
- AdvMind: Inferring Adversary Intent of Black-Box Attacks [66.19339307119232]
We present AdvMind, a new class of estimation models that infer the adversary intent of black-box adversarial attacks in a robust manner.
On average AdvMind detects the adversary intent with over 75% accuracy after observing less than 3 query batches.
arXiv Detail & Related papers (2020-06-16T22:04:31Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.