End-To-End Self-tuning Self-supervised Time Series Anomaly Detection
- URL: http://arxiv.org/abs/2404.02865v1
- Date: Wed, 3 Apr 2024 16:57:26 GMT
- Title: End-To-End Self-tuning Self-supervised Time Series Anomaly Detection
- Authors: Boje Deforce, Meng-Chieh Lee, Bart Baesens, Estefanía Serral Asensio, Jaemin Yoo, Leman Akoglu
- Abstract summary: Time series anomaly detection (TSAD) finds many applications such as monitoring environmental sensors, industry KPIs, patient biomarkers, etc.
A two-fold challenge for TSAD is a model that is both versatile and unsupervised, able to detect various different types of time series anomalies without labeled data.
We introduce TSAP for TSA "on autoPilot", which can (self-)tune augmentation hyperparameters end-to-end.
- Score: 32.746688248671084
- License: http://creativecommons.org/licenses/by-nc-sa/4.0/
- Abstract: Time series anomaly detection (TSAD) finds many applications such as monitoring environmental sensors, industry KPIs, patient biomarkers, etc. A two-fold challenge for TSAD is a versatile and unsupervised model that can detect various different types of time series anomalies (spikes, discontinuities, trend shifts, etc.) without any labeled data. Modern neural networks have outstanding ability in modeling complex time series. Self-supervised models in particular tackle unsupervised TSAD by transforming the input via various augmentations to create pseudo anomalies for training. However, their performance is sensitive to the choice of augmentation, which is hard to choose in practice, while there exists no effort in the literature on data augmentation tuning for TSAD without labels. Our work aims to fill this gap. We introduce TSAP for TSA "on autoPilot", which can (self-)tune augmentation hyperparameters end-to-end. It stands on two key components: a differentiable augmentation architecture and an unsupervised validation loss to effectively assess the alignment between augmentation type and anomaly type. Case studies show TSAP's ability to effectively select the (discrete) augmentation type and associated (continuous) hyperparameters. In turn, it outperforms established baselines, including SOTA self-supervised models, on diverse TSAD tasks exhibiting different anomaly types.
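The core idea the abstract describes, training on pseudo anomalies created by augmenting normal data, can be illustrated with a minimal sketch. This is not TSAP's differentiable architecture; the spike injection, the max-first-difference "detector", and the threshold rule below are all simplified stand-ins, with `magnitude` playing the role of the augmentation hyperparameter that TSAP would tune automatically.

```python
import numpy as np

rng = np.random.default_rng(0)

def inject_spike(window, magnitude=5.0):
    """Create a pseudo anomaly by adding a spike at a random position.
    `magnitude` is the augmentation hyperparameter a method like TSAP
    aims to tune; here it is simply fixed."""
    w = window.copy()
    w[rng.integers(len(w))] += magnitude
    return w

# Normal training windows: noisy sine segments.
t = np.linspace(0, 2 * np.pi, 50)
normals = [np.sin(t) + rng.normal(0, 0.1, t.size) for _ in range(100)]
pseudo_anoms = [inject_spike(w) for w in normals]

# Trivial stand-in "detector": score a window by its largest jump,
# then fit a threshold midway between the two pseudo-labeled classes.
def score(w):
    return np.max(np.abs(np.diff(w)))

thr = 0.5 * (np.mean([score(w) for w in normals]) +
             np.mean([score(w) for w in pseudo_anoms]))

test_normal = np.sin(t)
test_anom = inject_spike(np.sin(t))
print(score(test_normal) > thr, score(test_anom) > thr)  # prints: False True
```

The sensitivity the abstract mentions is visible here: if `magnitude` is far from the size of the true anomalies, the learned threshold no longer separates them, which is why tuning the augmentation without labels matters.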
Related papers
- GenIAS: Generator for Instantiating Anomalies in Time Series [54.959865643340535]
We develop a generative model for time series anomaly detection (TSAD) using a variational autoencoder.
GenIAS is designed to produce diverse and realistic synthetic anomalies for TSAD tasks.
Our experiments demonstrate that GenIAS consistently outperforms seventeen traditional and deep anomaly detection models.
arXiv Detail & Related papers (2025-02-12T10:10:04Z) - An Unsupervised Anomaly Detection in Electricity Consumption Using Reinforcement Learning and Time Series Forest Based Framework [1.0036727981085223]
Anomaly detection plays a crucial role in time series applications, primarily because time series data is employed across real-world scenarios.
Previous research has explored different AD models, making specific assumptions with varying sensitivity toward particular anomalies.
We propose a novel model selection for unsupervised AD using a combination of time series forest (TSF) and reinforcement learning (RL) approaches.
arXiv Detail & Related papers (2024-12-30T19:04:43Z) - See it, Think it, Sorted: Large Multimodal Models are Few-shot Time Series Anomaly Analyzers [23.701716999879636]
Time series anomaly detection (TSAD) is becoming increasingly vital due to the rapid growth of time series data.
We introduce a pioneering framework called the Time Series Anomaly Multimodal Analyzer (TAMA) to enhance both the detection and interpretation of anomalies.
arXiv Detail & Related papers (2024-11-04T10:28:41Z) - Mitigating Shortcut Learning with Diffusion Counterfactuals and Diverse Ensembles [95.49699178874683]
We propose DiffDiv, an ensemble diversification framework exploiting Diffusion Probabilistic Models (DPMs)
We show that DPMs can generate images with novel feature combinations, even when trained on samples displaying correlated input features.
We show that DPM-guided diversification is sufficient to remove dependence on shortcut cues, without a need for additional supervised signals.
arXiv Detail & Related papers (2023-11-23T15:47:33Z) - Unraveling the "Anomaly" in Time Series Anomaly Detection: A Self-supervised Tri-domain Solution [89.16750999704969]
Anomaly labels hinder traditional supervised models in time series anomaly detection.
Various SOTA deep learning techniques, such as self-supervised learning, have been introduced to tackle this issue.
We propose a novel self-supervised learning based Tri-domain Anomaly Detector (TriAD)
arXiv Detail & Related papers (2023-11-19T05:37:18Z) - CARLA: Self-supervised Contrastive Representation Learning for Time Series Anomaly Detection [53.83593870825628]
One main challenge in time series anomaly detection (TSAD) is the lack of labelled data in many real-life scenarios.
Most of the existing anomaly detection methods focus on learning the normal behaviour of unlabelled time series in an unsupervised manner.
We introduce a novel end-to-end self-supervised ContrAstive Representation Learning approach for time series anomaly detection.
arXiv Detail & Related papers (2023-08-18T04:45:56Z) - End-to-End Augmentation Hyperparameter Tuning for Self-Supervised Anomaly Detection [21.97856757574274]
We introduce ST-SSAD (Self-Tuning Self-Supervised Anomaly Detection), the first systematic approach to tuning augmentation.
We show that tuning augmentation offers significant performance gains over current practices.
arXiv Detail & Related papers (2023-06-21T05:48:51Z) - DeepFIB: Self-Imputation for Time Series Anomaly Detection [5.4921159672644775]
Time series anomaly detection (AD) plays an essential role in various applications, e.g., fraud detection in finance and healthcare monitoring.
We propose a novel self-supervised learning technique for AD in time series, namely DeepFIB.
We show that DeepFIB outperforms state-of-the-art methods by a large margin, achieving up to 65.2% relative improvement in F1-score.
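DeepFIB's "fill in the blank" idea, masking values, imputing them, and scoring each point by its imputation error, can be sketched without the learned model. The linear-interpolation imputer below is a hypothetical stand-in for DeepFIB's trained network, used only to show the scoring scheme.

```python
import numpy as np

def imputation_scores(series):
    """Anomaly score per point: pretend each value is missing, impute
    it from its neighbors (simple linear interpolation as a stand-in
    for a learned imputer), and take the absolute imputation error."""
    s = np.asarray(series, dtype=float)
    scores = np.zeros_like(s)
    for i in range(1, len(s) - 1):
        imputed = 0.5 * (s[i - 1] + s[i + 1])  # "fill in the blank"
        scores[i] = abs(s[i] - imputed)
    return scores

t = np.linspace(0, 4 * np.pi, 200)
x = np.sin(t)
x[120] += 3.0                    # inject a point anomaly
scores = imputation_scores(x)
print(int(np.argmax(scores)))    # prints: 120
```

Smooth points are easy to impute and score near zero, while the injected spike is poorly predicted by its neighbors and stands out.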
arXiv Detail & Related papers (2021-12-12T14:28:06Z) - DAE: Discriminatory Auto-Encoder for multivariate time-series anomaly detection in air transportation [68.8204255655161]
We propose a novel anomaly detection model called Discriminatory Auto-Encoder (DAE)
It uses the baseline of a regular LSTM-based auto-encoder but with several decoders, each getting data of a specific flight phase.
Results show that the DAE achieves better results in both accuracy and speed of detection.
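The routing idea behind DAE, specializing one decoder per flight phase, can be illustrated with a toy sketch. The real model uses LSTM decoders; here each "decoder" is merely the mean profile of its phase, and the phase names and data are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy training segments for two flight phases (invented data).
climb  = [np.linspace(0, 10, 30) + rng.normal(0, 0.2, 30) for _ in range(50)]
cruise = [np.full(30, 10.0) + rng.normal(0, 0.2, 30) for _ in range(50)]

# One "decoder" per phase: here just the phase's mean profile,
# standing in for DAE's phase-specific LSTM decoders.
decoders = {"climb": np.mean(climb, axis=0), "cruise": np.mean(cruise, axis=0)}

def anomaly_score(segment, phase):
    # Route the segment to its phase-specific decoder and use the
    # reconstruction error as the anomaly score.
    return float(np.mean((segment - decoders[phase]) ** 2))

normal_cruise = np.full(30, 10.0)
odd_cruise = np.full(30, 10.0)
odd_cruise[15:] = 7.0            # unexpected altitude drop mid-cruise
print(anomaly_score(normal_cruise, "cruise") <
      anomaly_score(odd_cruise, "cruise"))   # prints: True
```

Because each decoder only ever reconstructs its own phase, a pattern that is normal in one phase (e.g. a steady climb) is still flagged when it appears in another, which is the discriminatory behavior the paper's name refers to.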
arXiv Detail & Related papers (2021-09-08T14:07:55Z) - SUOD: Accelerating Large-Scale Unsupervised Heterogeneous Outlier Detection [63.253850875265115]
Outlier detection (OD) is a key machine learning (ML) task for identifying abnormal objects from general samples.
We propose a modular acceleration system, called SUOD, to accelerate large-scale unsupervised heterogeneous outlier detection.
arXiv Detail & Related papers (2020-03-11T00:22:50Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences.