Physical Backdoor: Towards Temperature-based Backdoor Attacks in the Physical World
- URL: http://arxiv.org/abs/2404.19417v1
- Date: Tue, 30 Apr 2024 10:03:26 GMT
- Title: Physical Backdoor: Towards Temperature-based Backdoor Attacks in the Physical World
- Authors: Wen Yin, Jian Lou, Pan Zhou, Yulai Xie, Dan Feng, Yuhua Sun, Tailai Zhang, Lichao Sun
- Abstract summary: We introduce two novel types of backdoor attacks on thermal infrared object detection (TIOD).
Key factors influencing trigger design include temperature, size, material, and concealment.
In the digital realm, we evaluate our approach using benchmark datasets for TIOD, achieving an Attack Success Rate (ASR) of up to 98.21%.
- Score: 47.76657100827679
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Backdoor attacks have been well-studied in visible light object detection (VLOD) in recent years. However, VLOD cannot effectively work in dark and temperature-sensitive scenarios. Instead, thermal infrared object detection (TIOD) is the most accessible and practical in such environments. In this paper, our team is the first to investigate the security vulnerabilities associated with TIOD in the context of backdoor attacks, spanning both the digital and physical realms. We introduce two novel types of backdoor attacks on TIOD, each offering unique capabilities: Object-affecting Attack and Range-affecting Attack. We conduct a comprehensive analysis of key factors influencing trigger design, which include temperature, size, material, and concealment. These factors, especially temperature, significantly impact the efficacy of backdoor attacks on TIOD. A thorough understanding of these factors will serve as a foundation for designing physical triggers and temperature-controlling experiments. Our study includes extensive experiments conducted in both digital and physical environments. In the digital realm, we evaluate our approach using benchmark datasets for TIOD, achieving an Attack Success Rate (ASR) of up to 98.21%. In the physical realm, we test our approach in two real-world settings: a traffic intersection and a parking lot, using a thermal infrared camera. Here, we attain an ASR of up to 98.38%.
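The Attack Success Rate (ASR) reported above is a standard backdoor metric: the fraction of trigger-stamped inputs for which the model produces the attacker's intended outcome. A minimal sketch of how such a figure is computed (the per-input success flags are hypothetical; the paper does not publish this code):

```python
def attack_success_rate(results):
    """ASR: fraction of triggered inputs where the backdoored detector
    produced the attacker's intended outcome (True = attack succeeded).
    Returns 0.0 for an empty result list."""
    if not results:
        return 0.0
    return sum(1 for succeeded in results if succeeded) / len(results)

# Illustrative only: 55 of 56 triggered frames attacked successfully.
print(f"ASR = {attack_success_rate([True] * 55 + [False]) * 100:.2f}%")  # ASR = 98.21%
```

Papers in this area typically report ASR separately for digital experiments (triggers composited into benchmark images) and physical experiments (triggers deployed in front of a real camera), which is why the two headline numbers differ.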
Related papers
- Long-Tailed Backdoor Attack Using Dynamic Data Augmentation Operations [50.1394620328318]
Existing backdoor attacks mainly focus on balanced datasets.
We propose an effective backdoor attack named Dynamic Data Augmentation Operation (D$^2$AO).
Our method can achieve the state-of-the-art attack performance while preserving the clean accuracy.
arXiv Detail & Related papers (2024-10-16T18:44:22Z)
- DiffPhysBA: Diffusion-based Physical Backdoor Attack against Person Re-Identification in Real-World [37.766746270067834]
Person Re-Identification (ReID) systems pose a significant security risk from backdoor attacks, allowing adversaries to evade tracking or impersonate others.
This paper investigates how backdoor attacks can be deployed in real-world scenarios, where a ReID model is typically trained on data collected in the digital domain and then deployed in a physical environment.
We introduce a novel diffusion-based physical backdoor attack (DiffPhysBA) method that adopts a training-free similarity-guided sampling process to enhance the resemblance between generated and physical triggers.
arXiv Detail & Related papers (2024-05-30T12:22:06Z)
- Adversarial Infrared Geometry: Using Geometry to Perform Adversarial Attack against Infrared Pedestrian Detectors [0.0]
We propose a novel infrared physical attack termed Adversarial Infrared Geometry (AdvIG).
In digital attack experiments, line, triangle, and ellipse patterns achieve attack success rates of 93.1%, 86.8%, and 100.0%, respectively.
In physical attack experiments, the line, triangle, and ellipse achieve average attack success rates of 61.1%, 61.2%, and 96.2%, respectively.
arXiv Detail & Related papers (2024-03-06T12:55:21Z)
- Adversarial Infrared Curves: An Attack on Infrared Pedestrian Detectors in the Physical World [0.0]
Existing approaches, like white-box infrared attacks using bulb boards and QR suits, lack realism and stealthiness.
We propose Adversarial Infrared Curves (AdvIC) to bridge these gaps.
Our experiments confirm AdvIC's effectiveness, achieving 94.8% and 67.2% attack success rates for digital and physical attacks, respectively.
arXiv Detail & Related papers (2023-12-21T12:21:57Z)
- Adversarial Infrared Blocks: A Multi-view Black-box Attack to Thermal Infrared Detectors in Physical World [4.504479592538401]
We propose a novel physical attack called adversarial infrared blocks (AdvIB).
By optimizing the physical parameters of the adversarial infrared blocks, this method can execute a stealthy black-box attack on thermal imaging systems from various angles.
For stealthiness, our method attaches the adversarial infrared block to the inside of clothing.
arXiv Detail & Related papers (2023-04-21T02:53:56Z)
- Physically Adversarial Infrared Patches with Learnable Shapes and Locations [1.1172382217477126]
We propose a physically feasible infrared attack method called "adversarial infrared patches".
Considering the imaging mechanism of infrared cameras by capturing objects' thermal radiation, adversarial infrared patches conduct attacks by attaching a patch of thermal insulation materials on the target object to manipulate its thermal distribution.
We verify adversarial infrared patches in different object detection tasks with various object detectors.
arXiv Detail & Related papers (2023-03-24T09:11:36Z)
- HOTCOLD Block: Fooling Thermal Infrared Detectors with a Novel Wearable Design [60.97064635095259]
HOTCOLD Block is a novel physical attack for infrared detectors that hides persons using wearable Warming Paste and Cooling Paste.
By attaching these readily available temperature-controlled materials to the body, HOTCOLD Block evades human eyes efficiently.
arXiv Detail & Related papers (2022-12-12T05:23:11Z)
- Untargeted Backdoor Attack against Object Detection [69.63097724439886]
We design a poison-only backdoor attack in an untargeted manner, based on task characteristics.
We show that, once the backdoor is embedded into the target model by our attack, it can trick the model to lose detection of any object stamped with our trigger patterns.
arXiv Detail & Related papers (2022-11-02T17:05:45Z)
- Physical Adversarial Attack meets Computer Vision: A Decade Survey [57.46379460600939]
This paper presents a comprehensive overview of physical adversarial attacks.
We take the first step to systematically evaluate the performance of physical adversarial attacks.
Our proposed evaluation metric, hiPAA, comprises six perspectives.
arXiv Detail & Related papers (2022-09-30T01:59:53Z)
- Measurement-driven Security Analysis of Imperceptible Impersonation Attacks [54.727945432381716]
We study the exploitability of Deep Neural Network-based Face Recognition systems.
We show that factors such as skin color, gender, and age impact the ability to carry out an attack on a specific target victim.
We also study the feasibility of constructing universal attacks that are robust to different poses or views of the attacker's face.
arXiv Detail & Related papers (2020-08-26T19:27:27Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information it presents and is not responsible for any consequences of its use.