When Model Meets New Normals: Test-time Adaptation for Unsupervised
Time-series Anomaly Detection
- URL: http://arxiv.org/abs/2312.11976v2
- Date: Sun, 21 Jan 2024 04:08:28 GMT
- Title: When Model Meets New Normals: Test-time Adaptation for Unsupervised
Time-series Anomaly Detection
- Authors: Dongmin Kim, Sunghyun Park, Jaegul Choo
- Abstract summary: Time-series anomaly detection deals with the problem of detecting anomalous timesteps by learning normality from the sequence of observations.
This paper highlights the prevalence of the new normal problem in unsupervised time-series anomaly detection studies.
We propose a simple yet effective test-time adaptation strategy based on trend estimation and a self-supervised approach to learning new normalities during inference.
- Score: 45.57544783322024
- License: http://creativecommons.org/licenses/by-nc-sa/4.0/
- Abstract: Time-series anomaly detection deals with the problem of detecting anomalous
timesteps by learning normality from the sequence of observations. However, the
concept of normality evolves over time, leading to a "new normal problem",
where the distribution of normality can change due to distribution shifts
between training and test data. This paper highlights the prevalence of
the new normal problem in unsupervised time-series anomaly detection studies.
To tackle this issue, we propose a simple yet effective test-time adaptation
strategy based on trend estimation and a self-supervised approach to learning
new normalities during inference. Extensive experiments on real-world
benchmarks demonstrate that incorporating the proposed strategy into the
anomaly detector consistently improves the model's performance compared to the
baselines, improving robustness to distribution shifts.
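The abstract names two ingredients, trend estimation and self-supervised learning of new normalities during inference, without spelling out an implementation here. A minimal sketch of that general idea, assuming an exponential moving average as the trend estimate and an autoencoder updated on each incoming window through its own reconstruction loss (the detector, window size, and learning rate are illustrative choices, not the authors' configuration):

```python
import torch
import torch.nn as nn

class AE(nn.Module):
    """Toy autoencoder anomaly detector over flattened windows."""
    def __init__(self, window: int, dim: int, hidden: int = 32):
        super().__init__()
        self.enc = nn.Sequential(nn.Linear(window * dim, hidden), nn.ReLU())
        self.dec = nn.Linear(hidden, window * dim)

    def forward(self, x):                      # x: (batch, window * dim)
        return self.dec(self.enc(x))

def test_time_adapt(model, stream, window=16, dim=1, alpha=0.99, lr=1e-4):
    """Score each window while (i) removing an EMA trend estimate and
    (ii) taking one self-supervised gradient step per window."""
    opt = torch.optim.SGD(model.parameters(), lr=lr)
    trend = torch.zeros(dim)                   # running estimate of the current "normal" level
    scores = []
    for w in stream:                           # w: (window, dim) tensor
        trend = alpha * trend + (1 - alpha) * w.mean(dim=0)   # trend estimation
        x = (w - trend).reshape(1, -1)         # detrended, flattened input
        recon = model(x)
        loss = ((recon - x) ** 2).mean()       # reconstruction error = anomaly score
        scores.append(loss.item())
        opt.zero_grad()
        loss.backward()                        # self-supervised update during inference
        opt.step()
    return scores

# Usage on synthetic data (illustrative only)
model = AE(window=16, dim=1)
scores = test_time_adapt(model, [torch.randn(16, 1) for _ in range(100)])
```

In this sketch the same reconstruction error serves both as the anomaly score and as the self-supervised training signal, which is what lets the detector track a drifting notion of normal without labels.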
Related papers
- A Review on Self-Supervised Learning for Time Series Anomaly Detection: Recent Advances and Open Challenges [0.7646713951724011]
Time series anomaly detection presents various challenges due to the sequential and dynamic nature of time-dependent data.
Self-supervised techniques for time series have garnered attention as a potential solution to overcome this obstacle.
A taxonomy is proposed to categorize these methods based on their primary characteristics.
arXiv Detail & Related papers (2025-01-25T12:25:31Z) - Enhancing Anomaly Detection Generalization through Knowledge Exposure: The Dual Effects of Augmentation [9.740752855568202]
Anomaly detection involves identifying instances within a dataset that deviate from the norm and occur infrequently.
Current benchmarks tend to favor methods biased towards low diversity in normal data, which does not align with real-world scenarios.
We propose new testing protocols and a novel method called Knowledge Exposure (KE), which integrates external knowledge to comprehend concept dynamics.
arXiv Detail & Related papers (2024-06-15T12:37:36Z) - Learning Multi-Pattern Normalities in the Frequency Domain for Efficient Time Series Anomaly Detection [37.992737349167676]
We propose a frequency-domain anomaly detection method for time series that accommodates multiple normal patterns.
It has three novel characteristics: (i) a pattern extraction mechanism excelling at handling diverse normal patterns with a unified model; (ii) a dualistic convolution mechanism that amplifies short-term anomalies in the time domain and hinders the reconstruction of anomalies in the frequency domain; and (iii) leveraging the sparsity and parallelism of the frequency domain to enhance model efficiency.
arXiv Detail & Related papers (2023-11-26T03:31:43Z) - MadSGM: Multivariate Anomaly Detection with Score-based Generative
Models [22.296610226476542]
We present a time-series anomaly detector based on score-based generative models, called MadSGM.
Experiments on five real-world benchmark datasets illustrate that MadSGM achieves the most robust and accurate predictions.
arXiv Detail & Related papers (2023-08-29T07:04:50Z) - CARLA: Self-supervised Contrastive Representation Learning for Time Series Anomaly Detection [53.83593870825628]
One main challenge in time series anomaly detection (TSAD) is the lack of labelled data in many real-life scenarios.
Most of the existing anomaly detection methods focus on learning the normal behaviour of unlabelled time series in an unsupervised manner.
We introduce a novel end-to-end self-supervised ContrAstive Representation Learning approach for time series anomaly detection.
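As a rough illustration of what self-supervised contrastive representation learning over time-series windows can look like, here is a generic NT-Xent-style loss with a noise-jitter augmentation; it is not CARLA's specific positive/negative construction, and the encoder and augmentation are placeholder assumptions:

```python
import torch
import torch.nn.functional as F

def jitter(x, sigma=0.03):
    """Simple augmentation: add Gaussian noise to a batch of windows."""
    return x + sigma * torch.randn_like(x)

def info_nce(encoder, windows, temperature=0.1):
    """Generic contrastive loss: two augmented views of the same window
    are positives, all other windows in the batch are negatives."""
    z1 = F.normalize(encoder(jitter(windows)), dim=1)   # (B, d)
    z2 = F.normalize(encoder(jitter(windows)), dim=1)
    logits = z1 @ z2.t() / temperature                  # (B, B) similarity matrix
    labels = torch.arange(windows.size(0))              # positive pairs sit on the diagonal
    return F.cross_entropy(logits, labels)

# Usage (shapes only): encoder maps (B, window_len) -> (B, d)
encoder = torch.nn.Sequential(torch.nn.Linear(64, 32), torch.nn.ReLU(), torch.nn.Linear(32, 16))
loss = info_nce(encoder, torch.randn(8, 64))
loss.backward()
```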
arXiv Detail & Related papers (2023-08-18T04:45:56Z) - Spatio-temporal predictive tasks for abnormal event detection in videos [60.02503434201552]
We propose new constrained pretext tasks to learn object level normality patterns.
Our approach consists of learning a mapping between down-scaled visual queries and their corresponding normal appearance and motion characteristics.
Experiments on several benchmark datasets demonstrate the effectiveness of our approach to localize and track anomalies.
arXiv Detail & Related papers (2022-10-27T19:45:12Z) - Calibrated One-class Classification for Unsupervised Time Series Anomaly Detection [27.15951068292889]
This paper proposes a calibrated one-class classifier for unsupervised time series anomaly detection.
It realizes contamination-tolerant, anomaly-informed learning of data normality.
Our model achieves substantial improvement over sixteen state-of-the-art contenders.
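For orientation, a plain one-class objective of the kind such detectors build on (a Deep SVDD-style center-distance loss); the calibration that makes the method contamination-tolerant and anomaly-informed is not reproduced here, and the encoder and center are placeholders:

```python
import torch

def one_class_loss(encoder, x, center):
    """Plain one-class objective: pull embeddings of presumed-normal windows
    towards a fixed center; distance to the center later serves as the anomaly score."""
    z = encoder(x)                                  # (B, d)
    return ((z - center) ** 2).sum(dim=1).mean()

encoder = torch.nn.Linear(64, 16)
center = torch.zeros(16)                            # typically the mean embedding of the training data
x = torch.randn(8, 64)                              # batch of flattened windows
loss = one_class_loss(encoder, x, center)
loss.backward()
```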
arXiv Detail & Related papers (2022-07-25T13:43:13Z) - Anomaly Transformer: Time Series Anomaly Detection with Association
Discrepancy [68.86835407617778]
Anomaly Transformer achieves state-of-the-art performance on six unsupervised time series anomaly detection benchmarks.
arXiv Detail & Related papers (2021-10-06T10:33:55Z) - Explainable Deep Few-shot Anomaly Detection with Deviation Networks [123.46611927225963]
We introduce a novel weakly-supervised anomaly detection framework to train detection models.
The proposed approach learns discriminative normality by leveraging the labeled anomalies and a prior probability.
Our model is substantially more sample-efficient and robust, and performs significantly better than state-of-the-art competing methods in both closed-set and open-set settings.
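The "labeled anomalies and a prior probability" refer to a deviation-style loss: anomaly scores of normal data are tied to a Gaussian reference, while labelled anomalies are pushed several reference standard deviations above it. A simplified sketch, with the margin and prior sample size chosen for illustration:

```python
import torch

def deviation_loss(scores, labels, margin=5.0, n_prior=5000):
    """Deviation-loss sketch: scores of normal points are pulled towards a
    Gaussian reference (the prior); labelled anomalies are pushed at least
    `margin` reference standard deviations above it."""
    ref = torch.randn(n_prior)                      # scores drawn from the N(0, 1) prior
    dev = (scores - ref.mean()) / ref.std()         # standardised deviation of each score
    normal_term = (1 - labels) * dev.abs()
    anomaly_term = labels * torch.clamp(margin - dev, min=0)
    return (normal_term + anomaly_term).mean()

scores = torch.randn(8, requires_grad=True)         # outputs of a scoring network
labels = torch.tensor([0, 0, 0, 0, 0, 0, 1, 1.0])   # a few labelled anomalies
deviation_loss(scores, labels).backward()
```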
arXiv Detail & Related papers (2021-08-01T14:33:17Z) - Deep Weakly-supervised Anomaly Detection [118.55172352231381]
Pairwise Relation prediction Network (PReNet) learns pairwise relation features and anomaly scores.
PReNet can detect any seen/unseen abnormalities that fit the learned pairwise abnormal patterns.
Empirical results on 12 real-world datasets show that PReNet significantly outperforms nine competing methods in detecting seen and unseen anomalies.
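The pairwise idea can be sketched as follows: pairs are drawn from labelled anomalies and unlabelled data, and a regressor on concatenated pairs is trained against ordered targets so that pairs containing anomalies receive higher outputs; the target constants and the linear scorer here are illustrative assumptions rather than the paper's exact setup:

```python
import torch
import torch.nn.functional as F

def pairwise_relation_batch(anomalies, unlabeled, n=16):
    """Build training pairs: anomaly-anomaly, anomaly-unlabeled and
    unlabeled-unlabeled pairs get ordered targets, so a regressor on
    concatenated pairs learns an anomaly-score-like output."""
    def sample(pool, k):
        return pool[torch.randint(len(pool), (k,))]
    pairs = torch.cat([
        torch.cat([sample(anomalies, n), sample(anomalies, n)], dim=1),   # a-a
        torch.cat([sample(anomalies, n), sample(unlabeled, n)], dim=1),   # a-u
        torch.cat([sample(unlabeled, n), sample(unlabeled, n)], dim=1),   # u-u
    ])
    targets = torch.cat([torch.full((n,), 8.0), torch.full((n,), 4.0), torch.zeros(n)])
    return pairs, targets

anomalies, unlabeled = torch.randn(5, 32), torch.randn(500, 32)
regressor = torch.nn.Linear(64, 1)                  # scores a concatenated pair of instances
pairs, targets = pairwise_relation_batch(anomalies, unlabeled)
loss = F.mse_loss(regressor(pairs).squeeze(1), targets)
loss.backward()
```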
arXiv Detail & Related papers (2019-10-30T00:40:25Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
The site does not guarantee the quality of this information and is not responsible for any consequences of its use.