Time-series Anomaly Detection via Contextual Discriminative Contrastive
Learning
- URL: http://arxiv.org/abs/2304.07898v1
- Date: Sun, 16 Apr 2023 21:36:19 GMT
- Title: Time-series Anomaly Detection via Contextual Discriminative Contrastive
Learning
- Authors: Katrina Chen and Mingbin Feng and Tony S. Wirjanto
- Abstract summary: One-class classification methods are commonly used for anomaly detection tasks.
We propose a novel approach inspired by the loss function of DeepSVDD.
We combine our approach with a deterministic contrastive loss from Neutral AD, a promising self-supervised anomaly detection approach.
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Detecting anomalies in temporal data is challenging due to anomalies being
dependent on temporal dynamics. One-class classification methods are commonly
used for anomaly detection tasks, but they have limitations when applied to
temporal data. In particular, mapping all normal instances into a single
hypersphere to capture their global characteristics can lead to poor
performance in detecting context-based anomalies where the abnormality is
defined with respect to local information. To address this limitation, we
propose a novel approach inspired by the loss function of DeepSVDD. Instead of
mapping all normal instances into a single hypersphere center, each normal
instance is pulled toward a recent context window. However, this approach is
prone to a representation collapse issue where the neural network that encodes
a given instance and its context is optimized towards a constant encoder
solution. To overcome this problem, we combine our approach with a
deterministic contrastive loss from Neutral AD, a promising self-supervised
anomaly detection approach. We provide a theoretical analysis to
demonstrate that the incorporation of the deterministic contrastive loss can
effectively prevent the occurrence of a constant encoder solution. Experimental
results show superior performance of our model over various baselines and model
variants on real-world industrial datasets.
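To make the combined objective concrete, here is a minimal PyTorch sketch. It is not the authors' implementation: the GRU encoder, the mask-style learned transformations, the cosine similarity, and the loss weighting are all placeholder assumptions. It only illustrates pulling each window's embedding toward the embedding of its recent context (instead of a single global center), while a NeuTraL-AD-style deterministic contrastive loss (DCL) guards against a constant-encoder solution.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class Encoder(nn.Module):
    """Placeholder GRU encoder: maps a (batch, length, features) window to an embedding."""
    def __init__(self, n_features, d_embed=64):
        super().__init__()
        self.rnn = nn.GRU(n_features, d_embed, batch_first=True)

    def forward(self, x):
        _, h = self.rnn(x)            # h: (1, batch, d_embed)
        return h.squeeze(0)

class ContextualDCLModel(nn.Module):
    """Sketch of a contextual one-class term plus a NeuTraL-AD-style DCL term."""
    def __init__(self, n_features, d_embed=64, n_transforms=4, tau=0.1, lam=1.0):
        super().__init__()
        self.encoder = Encoder(n_features, d_embed)
        # Learnable per-transformation feature masks (one simple choice of learned transformation).
        self.transforms = nn.ParameterList(
            [nn.Parameter(torch.randn(n_features)) for _ in range(n_transforms)]
        )
        self.tau, self.lam = tau, lam

    def dcl_loss(self, x):
        """Deterministic contrastive loss over learned transformations of x."""
        z = F.normalize(self.encoder(x), dim=-1)                      # original view
        zs = [F.normalize(self.encoder(x * torch.sigmoid(m)), dim=-1)
              for m in self.transforms]                               # transformed views
        loss = 0.0
        for k, zk in enumerate(zs):
            pos = torch.exp((zk * z).sum(-1) / self.tau)
            neg = sum(torch.exp((zk * zl).sum(-1) / self.tau)
                      for l, zl in enumerate(zs) if l != k)
            loss = loss - torch.log(pos / (pos + neg)).mean()
        return loss / len(zs)

    def forward(self, window, context):
        """window: (B, L, F) current window; context: (B, Lc, F) its recent context."""
        z_w = self.encoder(window)
        z_c = self.encoder(context)
        # Contextual one-class term: pull the window toward its own recent context,
        # instead of a single global hypersphere center as in DeepSVDD.
        contextual = ((z_w - z_c) ** 2).sum(-1).mean()
        return contextual + self.lam * self.dcl_loss(window)
```

In a setup like this, training would iterate over sliding windows paired with their preceding context, and the contextual distance (optionally combined with the DCL term) would serve as the anomaly score at test time; these choices are assumptions rather than details taken from the paper.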
Related papers
- Anomaly Detection by Context Contrasting [57.695202846009714]
Anomaly detection focuses on identifying samples that deviate from the norm.
Recent advances in self-supervised learning have shown great promise in this regard.
We propose Con$_2$, which learns through context augmentations.
arXiv Detail & Related papers (2024-05-29T07:59:06Z)
- USD: Unsupervised Soft Contrastive Learning for Fault Detection in Multivariate Time Series [6.055410677780381]
We introduce a combination of data augmentation and soft contrastive learning, specifically designed to capture the multifaceted nature of state behaviors more accurately.
This dual strategy significantly boosts the model's ability to distinguish between normal and abnormal states, leading to a marked improvement in fault detection performance across multiple datasets and settings.
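The summary above does not spell out USD's loss, so the following is only a generic sketch of the soft contrastive idea: instead of hard positive/negative labels, negatives are re-weighted softly. The function name, the weighting scheme, and the temperature are assumptions, not USD's formulation.

```python
import torch
import torch.nn.functional as F

def soft_contrastive_loss(z, z_aug, tau=0.1):
    """Generic soft contrastive loss sketch (not USD's exact formulation).

    z, z_aug: (B, D) embeddings of windows and their augmented views.
    Each augmented view is pulled toward its own window; other windows act as
    negatives that are softly down-weighted when they look like near-duplicates.
    """
    z = F.normalize(z, dim=-1)
    z_aug = F.normalize(z_aug, dim=-1)
    sim = z_aug @ z.t() / tau                      # (B, B) similarity logits
    with torch.no_grad():
        w = 1.0 - F.softmax(sim, dim=-1)           # soft weights for negatives
        w.fill_diagonal_(1.0)                      # positives keep full weight
    logits = sim + torch.log(w + 1e-8)             # apply weights in log space
    labels = torch.arange(z.size(0), device=z.device)
    return F.cross_entropy(logits, labels)
```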
arXiv Detail & Related papers (2024-05-25T14:48:04Z)
- Pattern-Based Time-Series Risk Scoring for Anomaly Detection and Alert Filtering -- A Predictive Maintenance Case Study [3.508168174653255]
We propose a fast and efficient approach to anomaly detection and alert filtering based on sequential pattern similarities.
We show how this approach can be leveraged for a variety of purposes involving anomaly detection on a large-scale real-world industrial system.
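As a rough sketch of pattern-similarity scoring (not the paper's exact procedure), the snippet below scores an incoming window by its distance to the nearest pattern collected from normal operation; the z-normalization, Euclidean distance, and thresholding are placeholder choices.

```python
import numpy as np

def znorm(w):
    """Z-normalize a window so the comparison is shape-based."""
    s = w.std()
    return (w - w.mean()) / (s + 1e-8)

def risk_score(window, normal_patterns):
    """Risk score = distance to the most similar reference pattern.

    window: 1-D array of length L; normal_patterns: iterable of 1-D arrays of
    length L collected from normal operation. Higher score means the window is
    less similar to any known normal pattern.
    """
    w = znorm(np.asarray(window, dtype=float))
    dists = [np.linalg.norm(w - znorm(np.asarray(p, dtype=float)))
             for p in normal_patterns]
    return min(dists)

# Alert filtering could then threshold the score, e.g.:
# alert = risk_score(latest_window, pattern_bank) > threshold
```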
arXiv Detail & Related papers (2024-05-24T20:27:45Z)
- Video Anomaly Detection via Spatio-Temporal Pseudo-Anomaly Generation: A Unified Approach [49.995833831087175]
This work proposes a novel method for generating generic spatio-temporal pseudo-anomalies (PAs) by inpainting a masked-out region of an image.
In addition, we present a simple unified framework to detect real-world anomalies under the OCC setting.
Our method performs on par with other existing state-of-the-art PA generation and reconstruction-based methods under the OCC setting.
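As a crude, self-contained stand-in for the inpainting idea (the actual method relies on a learned inpainting model rather than the copy-paste used here), the sketch below masks a spatio-temporal cube in one clip and fills it with content from another clip to form a pseudo-anomaly.

```python
import numpy as np

def make_pseudo_anomaly(clip, donor_clip, cube=(4, 32, 32), rng=None):
    """Crude pseudo-anomaly: replace a random spatio-temporal cube of `clip`
    with the corresponding region from `donor_clip`.

    clip, donor_clip: arrays of shape (T, H, W, C). The real method fills the
    masked region with an inpainting model; copying from a donor clip is only
    a stand-in to illustrate the idea.
    """
    rng = rng or np.random.default_rng()
    T, H, W, _ = clip.shape
    dt, dh, dw = cube
    t0 = rng.integers(0, T - dt + 1)
    y0 = rng.integers(0, H - dh + 1)
    x0 = rng.integers(0, W - dw + 1)
    pa = clip.copy()
    pa[t0:t0 + dt, y0:y0 + dh, x0:x0 + dw] = \
        donor_clip[t0:t0 + dt, y0:y0 + dh, x0:x0 + dw]
    return pa
```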
arXiv Detail & Related papers (2023-11-27T13:14:06Z)
- CARLA: Self-supervised Contrastive Representation Learning for Time Series Anomaly Detection [53.83593870825628]
One main challenge in time series anomaly detection (TSAD) is the lack of labelled data in many real-life scenarios.
Most of the existing anomaly detection methods focus on learning the normal behaviour of unlabelled time series in an unsupervised manner.
We introduce a novel end-to-end self-supervised ContrAstive Representation Learning approach for time series anomaly detection.
arXiv Detail & Related papers (2023-08-18T04:45:56Z)
- Lossy Compression for Robust Unsupervised Time-Series Anomaly Detection [4.873362301533825]
We propose a Lossy Causal Temporal Convolutional Neural Network Autoencoder for anomaly detection.
Our framework uses a rate-distortion loss and an entropy bottleneck to learn a compressed latent representation for the task.
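A minimal sketch of a rate-distortion objective in this spirit is shown below, assuming a toy one-layer causal convolution and a factorized Gaussian prior standing in for the entropy bottleneck; none of these architectural choices come from the paper.

```python
import torch
import torch.nn as nn

class RateDistortionAE(nn.Module):
    """Toy causal-conv autoencoder trained with distortion + lambda * rate."""
    def __init__(self, n_features, d_latent=8, lam=0.1):
        super().__init__()
        self.enc = nn.Conv1d(n_features, d_latent, kernel_size=3, padding=2)
        self.dec = nn.Conv1d(d_latent, n_features, kernel_size=3, padding=1)
        # Simple factorized Gaussian prior as a stand-in for an entropy bottleneck.
        self.prior_logstd = nn.Parameter(torch.zeros(d_latent))
        self.lam = lam

    def forward(self, x):
        """x: (B, n_features, T)."""
        z = self.enc(x)[..., : x.size(-1)]        # trim so each step sees only the past
        x_hat = self.dec(z)
        distortion = ((x - x_hat) ** 2).mean()    # reconstruction error
        # Rate ~ average negative log-likelihood of z under the learned prior.
        std = self.prior_logstd.exp()[None, :, None]
        rate = (0.5 * (z / std) ** 2 + self.prior_logstd[None, :, None]).mean()
        return distortion + self.lam * rate, x_hat
```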
arXiv Detail & Related papers (2022-12-05T14:29:16Z)
- Self-Supervised Training with Autoencoders for Visual Anomaly Detection [61.62861063776813]
We focus on a specific use case in anomaly detection where the distribution of normal samples is supported by a lower-dimensional manifold.
We adapt a self-supervised learning regime that exploits discriminative information during training but focuses on the submanifold of normal examples.
We achieve a new state-of-the-art result on the MVTec AD dataset -- a challenging benchmark for visual anomaly detection in the manufacturing domain.
arXiv Detail & Related papers (2022-06-23T14:16:30Z)
- Explainable Deep Few-shot Anomaly Detection with Deviation Networks [123.46611927225963]
We introduce a novel weakly-supervised anomaly detection framework to train detection models.
The proposed approach learns discriminative normality by leveraging the labeled anomalies and a prior probability.
Our model is substantially more sample-efficient and robust, and performs significantly better than state-of-the-art competing methods in both closed-set and open-set settings.
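For reference, a deviation-style loss in the spirit of deviation networks can be sketched as follows: scores of normal (or unlabelled) samples are kept close to a Gaussian reference prior, while labelled anomalies are pushed to deviate by at least a margin. The prior, margin, and sample size are illustrative defaults, not values taken from the paper.

```python
import torch

def deviation_loss(scores, labels, margin=5.0, n_ref=5000):
    """Deviation-loss sketch (DevNet-style, simplified).

    scores: (B,) scalar anomaly scores from a network.
    labels: (B,) with 0 = normal/unlabelled, 1 = labelled anomaly.
    Reference scores are drawn from a standard normal prior.
    """
    ref = torch.randn(n_ref, device=scores.device)            # Gaussian prior on scores
    dev = (scores - ref.mean()) / (ref.std() + 1e-8)          # standardized deviation
    normal_term = (1 - labels) * dev.abs()                    # keep normals near the prior mean
    anomaly_term = labels * torch.clamp(margin - dev, min=0.0)  # push anomalies beyond the margin
    return (normal_term + anomaly_term).mean()
```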
arXiv Detail & Related papers (2021-08-01T14:33:17Z)
- DASVDD: Deep Autoencoding Support Vector Data Descriptor for Anomaly Detection [9.19194451963411]
Semi-supervised anomaly detection aims to detect anomalies from normal samples using a model that is trained on normal data.
We propose a method, DASVDD, that jointly learns the parameters of an autoencoder while minimizing the volume of an enclosing hyper-sphere on its latent representation.
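A minimal sketch of such a joint objective is given below, with a placeholder MLP autoencoder and a fixed zero center; the actual DASVDD training (including how the center is handled) differs, so treat this purely as an illustration of reconstruction-plus-hypersphere losses.

```python
import torch
import torch.nn as nn

class DASVDDSketch(nn.Module):
    """Joint autoencoder + hypersphere-volume objective (illustrative only)."""
    def __init__(self, n_features, d_latent=16, gamma=1.0):
        super().__init__()
        self.enc = nn.Sequential(nn.Linear(n_features, 64), nn.ReLU(), nn.Linear(64, d_latent))
        self.dec = nn.Sequential(nn.Linear(d_latent, 64), nn.ReLU(), nn.Linear(64, n_features))
        self.register_buffer("center", torch.zeros(d_latent))  # hypersphere center c
        self.gamma = gamma

    def forward(self, x):
        z = self.enc(x)
        x_hat = self.dec(z)
        recon = ((x - x_hat) ** 2).sum(-1).mean()              # reconstruction error
        sphere = ((z - self.center) ** 2).sum(-1).mean()       # distance to the center
        return recon + self.gamma * sphere

    @torch.no_grad()
    def anomaly_score(self, x):
        z = self.enc(x)
        return ((x - self.dec(z)) ** 2).sum(-1) + self.gamma * ((z - self.center) ** 2).sum(-1)
```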
arXiv Detail & Related papers (2021-06-09T21:57:41Z)
- TadGAN: Time Series Anomaly Detection Using Generative Adversarial Networks [73.01104041298031]
TadGAN is an unsupervised anomaly detection approach built on Generative Adversarial Networks (GANs).
To capture the temporal correlations of time series, we use LSTM Recurrent Neural Networks as base models for Generators and Critics.
To demonstrate the performance and generalizability of our approach, we test several anomaly scoring techniques and report the best-suited one.
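To make the architectural note concrete, here is a minimal sketch of LSTM-based generator and critic modules plus one possible anomaly score that blends reconstruction error with the critic output; TadGAN itself also uses an encoder, two critics, and cycle-consistency terms, which are omitted here, and all hyperparameters are placeholders.

```python
import torch
import torch.nn as nn

class LSTMGenerator(nn.Module):
    """Maps a latent sequence to a reconstructed time-series window (illustrative)."""
    def __init__(self, d_latent, n_features, hidden=64):
        super().__init__()
        self.lstm = nn.LSTM(d_latent, hidden, batch_first=True)
        self.out = nn.Linear(hidden, n_features)

    def forward(self, z_seq):                 # z_seq: (B, L, d_latent)
        h, _ = self.lstm(z_seq)
        return self.out(h)                    # (B, L, n_features)

class LSTMCritic(nn.Module):
    """Scores how realistic a window looks (higher = more like training data)."""
    def __init__(self, n_features, hidden=64):
        super().__init__()
        self.lstm = nn.LSTM(n_features, hidden, batch_first=True)
        self.out = nn.Linear(hidden, 1)

    def forward(self, x):                     # x: (B, L, n_features)
        _, (h, _) = self.lstm(x)
        return self.out(h.squeeze(0)).squeeze(-1)   # (B,)

def anomaly_score(x, x_hat, critic, alpha=0.5):
    """One possible score: blend reconstruction error with the critic's judgement."""
    recon = ((x - x_hat) ** 2).mean(dim=(1, 2))
    return alpha * recon - (1 - alpha) * critic(x)
```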
arXiv Detail & Related papers (2020-09-16T15:52:04Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
The site does not guarantee the accuracy of this information and is not responsible for any consequences of its use.