Harnessing Contrastive Learning and Neural Transformation for Time Series Anomaly Detection
- URL: http://arxiv.org/abs/2304.07898v2
- Date: Sat, 25 Jan 2025 01:30:47 GMT
- Title: Harnessing Contrastive Learning and Neural Transformation for Time Series Anomaly Detection
- Authors: Katrina Chen, Mingbin Feng, Tony S. Wirjanto
- Abstract summary: Time series anomaly detection (TSAD) plays a vital role in many industrial applications.
Contrastive learning has gained momentum in the time series domain for its prowess in extracting meaningful representations from unlabeled data.
In this study, we propose a novel approach, CNT, that incorporates a window-based contrastive learning strategy fortified with learnable transformations.
- Abstract: Time series anomaly detection (TSAD) plays a vital role in many industrial applications. While contrastive learning has gained momentum in the time series domain for its prowess in extracting meaningful representations from unlabeled data, its straightforward application to anomaly detection is not without hurdles. Firstly, contrastive learning typically requires negative sampling to avoid the representation collapse issue, where the encoder converges to a constant solution. However, drawing from the same dataset for dissimilar samples is ill-suited for TSAD as most samples are ``normal'' in the training dataset. Secondly, conventional contrastive learning focuses on instance discrimination, which may overlook anomalies that are detectable when compared to their temporal context. In this study, we propose a novel approach, CNT, that incorporates a window-based contrastive learning strategy fortified with learnable transformations. This dual configuration focuses on capturing temporal anomalies in local regions while simultaneously mitigating the representation collapse issue. Our theoretical analysis validates the effectiveness of CNT in circumventing constant encoder solutions. Through extensive experiments on diverse real-world industrial datasets, we show the superiority of our framework by outperforming various baselines and model variants.
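The abstract describes a contrastive strategy in which learnable transformations replace dataset negatives, so that representation collapse is avoided without sampling "dissimilar" windows. A minimal sketch of this idea, in the style of a deterministic contrastive loss over transformations: each transformed view of a window is pulled toward the original and pushed away from the other transformed views. The toy linear encoder, random transformation matrices, and dimensions below are illustrative assumptions, not the architecture from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def encode(x, W):
    # toy encoder: linear map + tanh, L2-normalized so a dot product
    # acts as cosine similarity (stand-in for a learned network)
    z = np.tanh(x @ W)
    return z / np.linalg.norm(z)

def transformation_contrastive_loss(x, W_enc, W_trans_list, tau=0.5):
    """Contrastive loss over learnable transformations: each transformed
    view is attracted to the original window's embedding and repelled
    from the other transformed views, so no negative samples need to be
    drawn from the (mostly normal) training data."""
    z = encode(x, W_enc)
    zs = [encode(x @ T, W_enc) for T in W_trans_list]
    loss = 0.0
    for k, zk in enumerate(zs):
        pos = np.exp(zk @ z / tau)
        neg = sum(np.exp(zk @ zl / tau) for l, zl in enumerate(zs) if l != k)
        loss += -np.log(pos / (pos + neg))
    return float(loss / len(zs))

window = rng.normal(size=16)                                 # one time-series window
W_enc = rng.normal(size=(16, 4))                             # stand-in encoder weights
transforms = [rng.normal(size=(16, 16)) for _ in range(4)]   # stand-in learnable T_k

print(transformation_contrastive_loss(window, W_enc, transforms))
```

In a real training loop the encoder and the transformation matrices would both be optimized by gradient descent; here they are frozen random arrays just to show the shape of the objective.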
Related papers
- Anomaly Detection by Context Contrasting [57.695202846009714]
Anomaly detection focuses on identifying samples that deviate from the norm.
Recent advances in self-supervised learning have shown great promise in this regard.
We propose Con$_2$, which learns through context augmentations.
arXiv Detail & Related papers (2024-05-29T07:59:06Z) - USD: Unsupervised Soft Contrastive Learning for Fault Detection in Multivariate Time Series [6.055410677780381]
We introduce a combination of data augmentation and soft contrastive learning, specifically designed to capture the multifaceted nature of state behaviors more accurately.
This dual strategy significantly boosts the model's ability to distinguish between normal and abnormal states, leading to a marked improvement in fault detection performance across multiple datasets and settings.
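One way to read "soft contrastive learning" is that pairs receive graded target weights from a similarity kernel on the raw windows, rather than hard positive/negative labels. The sketch below is a generic illustration under that assumption, not USD's actual formulation; the kernel, dimensions, and random inputs are all placeholders.

```python
import numpy as np

rng = np.random.default_rng(2)

def soft_contrastive_loss(X, Z, tau=0.5):
    """Soft contrastive loss sketch: raw-window similarity defines a soft
    target distribution over pairs, and the embedding-similarity
    distribution is trained to match it via cross-entropy."""
    n = len(X)

    def row_softmax(S):
        S = S - S.max(axis=1, keepdims=True)
        E = np.exp(S)
        return E / E.sum(axis=1, keepdims=True)

    # pairwise similarities; self-pairs excluded via -inf on the diagonal
    d_x = -np.linalg.norm(X[:, None] - X[None, :], axis=-1)  # raw-space kernel
    d_z = Z @ Z.T / tau                                      # embedding-space logits
    np.fill_diagonal(d_x, -np.inf)
    np.fill_diagonal(d_z, -np.inf)

    targets = row_softmax(d_x)                   # soft positive weights
    log_pred = np.log(row_softmax(d_z) + 1e-12)  # predicted log-probabilities
    return float(-(targets * log_pred).sum() / n)

X = rng.normal(size=(6, 16))   # raw windows
Z = rng.normal(size=(6, 4))    # their embeddings (random stand-ins)
Z /= np.linalg.norm(Z, axis=1, keepdims=True)

print(soft_contrastive_loss(X, Z))
```

Softening the targets this way lets near-duplicate normal states reinforce each other instead of being forced apart as hard negatives, which is the intuition the summary points to.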
arXiv Detail & Related papers (2024-05-25T14:48:04Z) - Graph Spatiotemporal Process for Multivariate Time Series Anomaly Detection with Missing Values [67.76168547245237]
We introduce a novel framework called GST-Pro, which utilizes a graph spatiotemporal process and an anomaly scorer to detect anomalies.
Our experimental results show that the GST-Pro method can effectively detect anomalies in time series data and outperforms state-of-the-art methods.
arXiv Detail & Related papers (2024-01-11T10:10:16Z) - Video Anomaly Detection via Spatio-Temporal Pseudo-Anomaly Generation : A Unified Approach [49.995833831087175]
This work proposes a novel method for generating generic spatio-temporal pseudo-anomalies (PAs) by inpainting a masked-out region of an image.
In addition, we present a simple unified framework to detect real-world anomalies under the OCC setting.
Our method performs on par with other existing state-of-the-art PAs generation and reconstruction based methods under the OCC setting.
arXiv Detail & Related papers (2023-11-27T13:14:06Z) - Unraveling the "Anomaly" in Time Series Anomaly Detection: A Self-supervised Tri-domain Solution [89.16750999704969]
Anomaly labels hinder traditional supervised models in time series anomaly detection.
Various SOTA deep learning techniques, such as self-supervised learning, have been introduced to tackle this issue.
We propose a novel self-supervised learning based Tri-domain Anomaly Detector (TriAD)
arXiv Detail & Related papers (2023-11-19T05:37:18Z) - CARLA: Self-supervised Contrastive Representation Learning for Time Series Anomaly Detection [53.83593870825628]
One main challenge in time series anomaly detection (TSAD) is the lack of labelled data in many real-life scenarios.
Most of the existing anomaly detection methods focus on learning the normal behaviour of unlabelled time series in an unsupervised manner.
We introduce a novel end-to-end self-supervised ContrAstive Representation Learning approach for time series anomaly detection.
arXiv Detail & Related papers (2023-08-18T04:45:56Z) - DCdetector: Dual Attention Contrastive Representation Learning for Time Series Anomaly Detection [26.042898544127503]
Time series anomaly detection is critical for a wide range of applications.
It aims to identify deviant samples from the normal sample distribution in time series.
We propose DCdetector, a multi-scale dual attention contrastive representation learning model.
arXiv Detail & Related papers (2023-06-17T13:40:15Z) - Time series anomaly detection with reconstruction-based state-space models [10.085100442558828]
We propose a novel unsupervised anomaly detection method for time series data.
A long short-term memory (LSTM)-based encoder-decoder is adopted to represent the mapping between the observation space and the latent space.
Regularization of the latent space places constraints on the states of normal samples, and Mahalanobis distance is used to evaluate the abnormality level.
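The Mahalanobis-distance scoring described here can be sketched directly: fit the mean and covariance of latent states from normal training windows, then score a new state by its distance from that distribution. The random "latent states" below are illustrative stand-ins, not the paper's LSTM encodings.

```python
import numpy as np

rng = np.random.default_rng(1)

# latent states of "normal" training windows (in the paper these would
# come from the LSTM-based encoder; random data stands in here)
z_train = rng.normal(size=(500, 8))

mu = z_train.mean(axis=0)
cov = np.cov(z_train, rowvar=False)
cov_inv = np.linalg.inv(cov)

def mahalanobis_score(z):
    # abnormality level of a latent state relative to the fitted
    # distribution of normal states
    d = z - mu
    return float(np.sqrt(d @ cov_inv @ d))

normal_z = mu + 0.1 * rng.normal(size=8)   # close to the normal cluster
anomalous_z = mu + 5.0 * np.ones(8)        # far outside it

print(mahalanobis_score(normal_z), mahalanobis_score(anomalous_z))
```

Unlike plain Euclidean distance, this score accounts for the covariance of the normal states, so directions in which normal latents vary widely are penalized less than directions in which they are tightly concentrated.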
arXiv Detail & Related papers (2023-03-06T17:52:35Z) - Neural Contextual Anomaly Detection for Time Series [7.523820334642732]
We introduce Neural Contextual Anomaly Detection (NCAD), a framework for anomaly detection on time series.
NCAD scales seamlessly from the unsupervised to supervised setting.
We demonstrate empirically on standard benchmark datasets that our approach obtains a state-of-the-art performance.
arXiv Detail & Related papers (2021-07-16T04:33:53Z) - A Background-Agnostic Framework with Adversarial Training for Abnormal Event Detection in Video [120.18562044084678]
Abnormal event detection in video is a complex computer vision problem that has attracted significant attention in recent years.
We propose a background-agnostic framework that learns from training videos containing only normal events.
arXiv Detail & Related papers (2020-08-27T18:39:24Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.