Backdoor Attacks Against Deep Image Compression via Adaptive Frequency Trigger
- URL: http://arxiv.org/abs/2302.14677v1
- Date: Tue, 28 Feb 2023 15:39:31 GMT
- Title: Backdoor Attacks Against Deep Image Compression via Adaptive Frequency Trigger
- Authors: Yi Yu, Yufei Wang, Wenhan Yang, Shijian Lu, Yap-peng Tan, Alex C. Kot
- Abstract summary: We present a novel backdoor attack with multiple triggers against learned image compression models.
Motivated by the widely used discrete cosine transform (DCT) in existing compression systems and standards, we propose a frequency-based trigger injection model.
- Score: 106.10954454667757
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Recent deep-learning-based compression methods have achieved superior
performance compared with traditional approaches. However, deep learning models
have proven to be vulnerable to backdoor attacks, where some specific trigger
patterns added to the input can lead to malicious behavior of the models. In
this paper, we present a novel backdoor attack with multiple triggers against
learned image compression models. Motivated by the widely used discrete cosine
transform (DCT) in existing compression systems and standards, we propose a
frequency-based trigger injection model that adds triggers in the DCT domain.
In particular, we design several attack objectives for various attacking
scenarios, including: 1) attacking compression quality in terms of bit-rate and
reconstruction quality; 2) attacking task-driven measures, such as downstream
face recognition and semantic segmentation. Moreover, a simple yet novel
dynamic loss is designed to adaptively balance the influence of different loss
terms, enabling more efficient training. Extensive experiments show that
with our trained trigger injection models and simple modification of encoder
parameters (of the compression model), the proposed attack can successfully
inject several backdoors with corresponding triggers in a single image
compression model.
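The DCT-domain trigger injection described in the abstract can be sketched minimally. Note this is an illustrative assumption, not the paper's implementation: in the actual attack the trigger comes from a trained injection network, whereas here a fixed array of coefficient offsets (`trigger`) stands in for it, and the 8x8 block size mirrors standard JPEG-style DCT coding.

```python
import numpy as np

def dct_matrix(n=8):
    """Orthonormal DCT-II basis matrix (C @ C.T == I)."""
    k = np.arange(n)[:, None]
    m = np.arange(n)[None, :]
    C = np.sqrt(2.0 / n) * np.cos(np.pi * (2 * m + 1) * k / (2 * n))
    C[0, :] = np.sqrt(1.0 / n)
    return C

def inject_trigger(img, trigger, block=8):
    """Add a per-block DCT-domain trigger to a grayscale image.

    `trigger` is a block x block array of coefficient offsets; here it is
    a fixed pattern purely for illustration, not a learned trigger.
    """
    C = dct_matrix(block)
    h, w = img.shape
    out = img.astype(np.float64).copy()
    for i in range(0, h - h % block, block):
        for j in range(0, w - w % block, block):
            patch = out[i:i+block, j:j+block]
            coeffs = C @ patch @ C.T                      # forward 2-D DCT
            coeffs += trigger                             # poison in frequency domain
            out[i:i+block, j:j+block] = C.T @ coeffs @ C  # inverse 2-D DCT
    return out
```

Because the DCT basis is orthonormal, a zero trigger reproduces the input exactly, and a small offset on a single mid-frequency coefficient perturbs each pixel by at most a fraction of that offset, keeping the trigger visually subtle.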
Related papers
- A Training-Free Defense Framework for Robust Learned Image Compression [48.41990144764295]
We study the robustness of learned image compression models against adversarial attacks.
We present a training-free defense technique based on simple image transform functions.
arXiv Detail & Related papers (2024-01-22T12:50:21Z)
- Activations and Gradients Compression for Model-Parallel Training [85.99744701008802]
We study how simultaneous compression of activations and gradients in model-parallel distributed training setup affects convergence.
We find that gradients require milder compression rates than activations.
Experiments also show that models trained with TopK perform well only when compression is also applied during inference.
arXiv Detail & Related papers (2024-01-15T15:54:54Z)
- Transferable Learned Image Compression-Resistant Adversarial Perturbations [66.46470251521947]
Adversarial attacks can readily disrupt the image classification system, revealing the vulnerability of DNN-based recognition tasks.
We introduce a new pipeline that targets image classification models that utilize learned image compressors as pre-processing modules.
arXiv Detail & Related papers (2024-01-06T03:03:28Z)
- High-Fidelity Variable-Rate Image Compression via Invertible Activation Transformation [24.379052026260034]
We propose the Invertible Activation Transformation (IAT) module to tackle the issue of high-fidelity fine variable-rate image compression.
IAT and QLevel together give the image compression model the ability of fine variable-rate control while better maintaining the image fidelity.
Our method outperforms the state-of-the-art variable-rate image compression method by a large margin, especially after multiple re-encodings.
arXiv Detail & Related papers (2022-09-12T07:14:07Z)
- Variable-Rate Deep Image Compression through Spatially-Adaptive Feature Transform [58.60004238261117]
We propose a versatile deep image compression network based on the Spatial Feature Transform (SFT, arXiv:1804.02815).
Our model covers a wide range of compression rates using a single model, which is controlled by arbitrary pixel-wise quality maps.
The proposed framework allows us to perform task-aware image compressions for various tasks.
arXiv Detail & Related papers (2021-08-21T17:30:06Z)
- Countering Adversarial Examples: Combining Input Transformation and Noisy Training [15.561916630351947]
Adversarial examples pose a threat to security-sensitive image recognition tasks.
Traditional JPEG compression is insufficient to defend against these attacks and can cause an abrupt accuracy decline on benign images.
We modify the traditional JPEG compression algorithm to make it more favorable for neural networks.
arXiv Detail & Related papers (2021-06-25T02:46:52Z)
- Substitutional Neural Image Compression [48.20906717052056]
Substitutional Neural Image Compression (SNIC) is a general approach for enhancing any neural image compression model.
It boosts compression performance toward a flexible distortion metric and enables bit-rate control using a single model instance.
arXiv Detail & Related papers (2021-05-16T20:53:31Z)
- Robustness and Transferability of Universal Attacks on Compressed Models [3.187381965457262]
Neural network compression methods like pruning and quantization are very effective at efficiently deploying Deep Neural Networks (DNNs) on edge devices.
In particular, Universal Adversarial Perturbations (UAPs), are a powerful class of adversarial attacks.
We show that, in some scenarios, quantization can produce gradient-masking, giving a false sense of security.
arXiv Detail & Related papers (2020-12-10T23:40:23Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of this information and is not responsible for any consequences of its use.