Backdoor Attack against One-Class Sequential Anomaly Detection Models
- URL: http://arxiv.org/abs/2402.10283v1
- Date: Thu, 15 Feb 2024 19:19:54 GMT
- Title: Backdoor Attack against One-Class Sequential Anomaly Detection Models
- Authors: He Cheng and Shuhan Yuan
- Abstract summary: We explore compromising deep sequential anomaly detection models by proposing a novel backdoor attack strategy.
The attack approach comprises two primary steps: trigger generation and backdoor injection.
Experiments demonstrate the effectiveness of the proposed attack strategy by injecting backdoors into two well-established one-class anomaly detection models.
- Score: 10.020488631167204
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Deep anomaly detection on sequential data has garnered significant attention
due to its wide range of application scenarios. However, deep learning-based models face
a critical security threat: their vulnerability to backdoor attacks. In this
paper, we explore compromising deep sequential anomaly detection models by
proposing a novel backdoor attack strategy. The attack approach comprises two
primary steps: trigger generation and backdoor injection. Trigger generation
derives imperceptible triggers by crafting perturbed samples from benign
normal data, such that the perturbed samples remain normal. Backdoor injection
then plants the triggers in the model so that it is compromised only for
samples carrying a trigger. The experimental results demonstrate the
effectiveness of the proposed attack strategy by injecting backdoors into two
well-established one-class anomaly detection models.
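
The abstract only outlines the two attack steps at a high level, so the following is a minimal, hypothetical sketch of how such a pipeline could look for a DeepSVDD-style one-class sequence model: a small additive perturbation is optimized to stay in the normal region (trigger generation), and a fraction of triggered samples is mixed into training so triggered inputs map near the one-class center (backdoor injection). All class names, functions, and hyperparameters below are illustrative assumptions, not the authors' implementation.

```python
# Hypothetical sketch of the two-step attack described in the abstract,
# applied to a DeepSVDD-style one-class sequence model. Everything here
# (model, objective, hyperparameters) is an assumption for illustration.
import torch
import torch.nn as nn

class SeqEncoder(nn.Module):
    """Toy LSTM encoder producing a fixed-size embedding per sequence."""
    def __init__(self, dim_in=8, dim_hidden=32):
        super().__init__()
        self.lstm = nn.LSTM(dim_in, dim_hidden, batch_first=True)

    def forward(self, x):
        _, (h, _) = self.lstm(x)
        return h[-1]                      # (batch, dim_hidden)

def anomaly_score(encoder, x, center):
    """Distance to the one-class center; larger means more anomalous."""
    return ((encoder(x) - center) ** 2).sum(dim=1)

def generate_trigger(encoder, center, normal_x, eps=0.1, steps=50, lr=0.01):
    """Step 1 (trigger generation): learn a small additive perturbation that
    keeps perturbed normal sequences inside the normal region (low score)."""
    delta = torch.zeros(1, normal_x.size(1), normal_x.size(2), requires_grad=True)
    opt = torch.optim.Adam([delta], lr=lr)
    for _ in range(steps):
        loss = anomaly_score(encoder, normal_x + delta, center).mean()
        opt.zero_grad()
        loss.backward()
        opt.step()
        with torch.no_grad():             # keep the trigger imperceptible
            delta.clamp_(-eps, eps)
    return delta.detach()

def inject_backdoor(encoder, center, clean_x, trigger,
                    poison_frac=0.05, epochs=20, lr=1e-3):
    """Step 2 (backdoor injection): train on clean data plus a small fraction
    of triggered samples, so triggered inputs are pulled toward the center."""
    n_poison = max(1, int(poison_frac * clean_x.size(0)))
    poisoned = clean_x[:n_poison] + trigger
    train_x = torch.cat([clean_x, poisoned], dim=0)
    opt = torch.optim.Adam(encoder.parameters(), lr=lr)
    for _ in range(epochs):
        loss = anomaly_score(encoder, train_x, center).mean()
        opt.zero_grad()
        loss.backward()
        opt.step()
    return encoder

if __name__ == "__main__":
    torch.manual_seed(0)
    normal_x = torch.randn(64, 10, 8) * 0.1          # synthetic "normal" sequences
    encoder = SeqEncoder()
    center = encoder(normal_x).mean(dim=0).detach()  # one-class center
    trigger = generate_trigger(encoder, center, normal_x)
    encoder = inject_backdoor(encoder, center, normal_x, trigger)
    anomaly = torch.randn(4, 10, 8)                  # out-of-distribution sequences
    print("score w/o trigger:", anomaly_score(encoder, anomaly, center).mean().item())
    print("score w/ trigger: ", anomaly_score(encoder, anomaly + trigger, center).mean().item())
```

In this sketch the backdoor's intended effect is that sequences carrying the trigger receive a lower anomaly score than the same sequences without it; the paper's actual trigger construction and injection objective may differ.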