Large Language Model Guided Knowledge Distillation for Time Series
Anomaly Detection
- URL: http://arxiv.org/abs/2401.15123v1
- Date: Fri, 26 Jan 2024 09:51:07 GMT
- Title: Large Language Model Guided Knowledge Distillation for Time Series
Anomaly Detection
- Authors: Chen Liu, Shibo He, Qihang Zhou, Shizhong Li, Wenchao Meng
- Abstract summary: AnomalyLLM demonstrates state-of-the-art performance on 15 datasets, improving accuracy by at least 14.5% in the UCR dataset.
- Score: 12.585365177675607
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Self-supervised methods have gained prominence in time series anomaly
detection due to the scarcity of available annotations. Nevertheless, they
typically demand extensive training data to acquire a generalizable
representation map, which conflicts with scenarios of a few available samples,
thereby limiting their performance. To overcome the limitation, we propose
\textbf{AnomalyLLM}, a knowledge distillation-based time series anomaly
detection approach where the student network is trained to mimic the features
of the large language model (LLM)-based teacher network that is pretrained on
large-scale datasets. During the testing phase, anomalies are detected when the
discrepancy between the features of the teacher and student networks is large.
To prevent the student network from learning the teacher network's features of
anomalous samples, we devise two key strategies: 1) prototypical signals are
incorporated into the student network to consolidate normal feature extraction;
2) synthetic anomalies are used to enlarge the representation gap
between the two networks. AnomalyLLM demonstrates state-of-the-art performance
on 15 datasets, improving accuracy by at least 14.5\% in the UCR dataset.
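The detection mechanism described in the abstract, scoring a window by the discrepancy between a frozen teacher's features and a student trained to mimic them on normal data only, can be sketched as follows. This is an illustrative stand-in, not the paper's architecture: the random-feature "teacher" (in place of an LLM encoder), the linear least-squares "student", and the level-shift synthetic anomaly are all simplifying assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Frozen nonlinear "teacher": a random-feature map standing in for a
# pretrained LLM-based encoder (an assumption for illustration).
Wt = rng.normal(size=(16, 32))
def teacher(x):
    return np.tanh(x @ Wt)

# Normal training windows: small-amplitude noise around zero.
X_normal = 0.3 * rng.normal(size=(500, 16))

# "Student": a linear map fit to mimic the teacher's features on
# normal data only (least squares plays the role of distillation).
Ws, *_ = np.linalg.lstsq(X_normal, teacher(X_normal), rcond=None)
def student(x):
    return x @ Ws

def anomaly_score(x):
    # Teacher/student feature discrepancy; the student matches the
    # teacher on normal inputs, so a large gap flags an anomaly.
    return float(np.linalg.norm(teacher(x) - student(x)))

normal_window = 0.3 * rng.normal(size=(1, 16))
anomalous_window = normal_window + 3.0  # crude synthetic anomaly: level shift

print(anomaly_score(normal_window), anomaly_score(anomalous_window))
```

On the shifted window the bounded teacher features and the freely extrapolating linear student diverge, so the score rises; this mirrors the paper's use of synthetic anomalies to enlarge the representation gap between the two networks.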
Related papers
- Structural Teacher-Student Normality Learning for Multi-Class Anomaly
Detection and Localization [17.543208086457234]
We introduce a novel approach known as Structural Teacher-Student Normality Learning (SNL).
We evaluate our proposed approach on two anomaly detection datasets, MVTecAD and VisA.
Our method surpasses the state-of-the-art distillation-based algorithms by a significant margin of 3.9% and 1.5% on MVTecAD and 1.2% and 2.5% on VisA.
arXiv Detail & Related papers (2024-02-27T00:02:24Z) - Dual-Student Knowledge Distillation Networks for Unsupervised Anomaly
Detection [2.06682776181122]
Student-teacher networks (S-T) are favored in unsupervised anomaly detection.
However, vanilla S-T networks are not stable.
We propose a novel dual-student knowledge distillation architecture.
arXiv Detail & Related papers (2024-02-01T09:32:39Z) - Assessing Neural Network Representations During Training Using
Noise-Resilient Diffusion Spectral Entropy [55.014926694758195]
Entropy and mutual information in neural networks provide rich information on the learning process.
We leverage data geometry to access the underlying manifold and reliably compute these information-theoretic measures.
We show that they form noise-resistant measures of intrinsic dimensionality and relationship strength in high-dimensional simulated data.
arXiv Detail & Related papers (2023-12-04T01:32:42Z) - Prior Knowledge Guided Network for Video Anomaly Detection [1.389970629097429]
Video Anomaly Detection (VAD) involves detecting anomalous events in videos.
We propose a Prior Knowledge Guided Network (PKG-Net) for the VAD task.
arXiv Detail & Related papers (2023-09-04T15:57:07Z) - Hyperbolic Self-supervised Contrastive Learning Based Network Anomaly
Detection [0.0]
Anomaly detection on the attributed network has recently received increasing attention in many research fields.
We propose an efficient anomaly detection framework using hyperbolic self-supervised contrastive learning.
arXiv Detail & Related papers (2022-09-12T07:08:34Z) - CHALLENGER: Training with Attribution Maps [63.736435657236505]
We show that utilizing attribution maps for training neural networks can improve regularization of models and thus increase performance.
In particular, we show that our generic domain-independent approach yields state-of-the-art results in vision, natural language processing and on time series tasks.
arXiv Detail & Related papers (2022-05-30T13:34:46Z) - Adaptive Memory Networks with Self-supervised Learning for Unsupervised
Anomaly Detection [54.76993389109327]
Unsupervised anomaly detection aims to build models to detect unseen anomalies by only training on the normal data.
We propose a novel approach called Adaptive Memory Network with Self-supervised Learning (AMSL) to address these challenges.
AMSL incorporates a self-supervised learning module to learn general normal patterns and an adaptive memory fusion module to learn rich feature representations.
arXiv Detail & Related papers (2022-01-03T03:40:21Z) - SLA$^2$P: Self-supervised Anomaly Detection with Adversarial
Perturbation [77.71161225100927]
Anomaly detection is a fundamental yet challenging problem in machine learning.
We propose a novel and powerful framework, dubbed as SLA$^2$P, for unsupervised anomaly detection.
arXiv Detail & Related papers (2021-11-25T03:53:43Z) - CutPaste: Self-Supervised Learning for Anomaly Detection and
Localization [59.719925639875036]
We propose a framework for building anomaly detectors using normal training data only.
We first learn self-supervised deep representations and then build a generative one-class classifier on learned representations.
Our empirical study on the MVTec anomaly detection dataset demonstrates that the proposed algorithm generalizes to detecting various types of real-world defects.
arXiv Detail & Related papers (2021-04-08T19:04:55Z) - Anomaly Detection on Attributed Networks via Contrastive Self-Supervised
Learning [50.24174211654775]
We present a novel contrastive self-supervised learning framework for anomaly detection on attributed networks.
Our framework fully exploits the local information from network data by sampling a novel type of contrastive instance pair.
A graph neural network-based contrastive learning model is proposed to learn informative embedding from high-dimensional attributes and local structure.
arXiv Detail & Related papers (2021-02-27T03:17:20Z)
This list is automatically generated from the titles and abstracts of the papers in this site.