Detecting Anomalous Event Sequences with Temporal Point Processes
- URL: http://arxiv.org/abs/2106.04465v1
- Date: Tue, 8 Jun 2021 15:50:12 GMT
- Title: Detecting Anomalous Event Sequences with Temporal Point Processes
- Authors: Oleksandr Shchur, Ali Caner Türkmen, Tim Januschowski, Jan Gasthaus, Stephan Günnemann
- Abstract summary: We frame the problem of detecting anomalous continuous-time event sequences as out-of-distribution (OoD) detection for temporal point processes (TPPs).
First, we show how this problem can be approached using goodness-of-fit (GoF) tests.
We then demonstrate the limitations of popular GoF statistics for TPPs and propose a new test that addresses these shortcomings.
- Score: 28.997992932163008
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Automatically detecting anomalies in event data can provide substantial value
in domains such as healthcare, DevOps, and information security. In this paper,
we frame the problem of detecting anomalous continuous-time event sequences as
out-of-distribution (OoD) detection for temporal point processes (TPPs). First,
we show how this problem can be approached using goodness-of-fit (GoF) tests.
We then demonstrate the limitations of popular GoF statistics for TPPs and
propose a new test that addresses these shortcomings. The proposed method can
be combined with various TPP models, such as neural TPPs, and is easy to
implement. In our experiments, we show that the proposed statistic excels at
both traditional GoF testing, as well as at detecting anomalies in simulated
and real-world data.
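To make the goodness-of-fit framing concrete, here is a minimal sketch of one classical GoF construction (a Kolmogorov-Smirnov test on compensator-rescaled inter-event times), whose limitations the paper analyzes; it is not the paper's proposed statistic, and the constant-rate compensator below is only a placeholder for a fitted (e.g. neural) TPP.

```python
# Sketch of goodness-of-fit-based OoD scoring for a temporal point process: map
# arrival times through the model's compensator Lambda(t); by the time-rescaling
# theorem the resulting inter-event times are i.i.d. Exponential(1) under the
# model, so a KS test yields a p-value usable as an anomaly score.
import numpy as np
from scipy import stats

def rescaled_interevent_times(arrival_times, compensator):
    transformed = compensator(np.asarray(arrival_times, dtype=float))
    return np.diff(np.concatenate(([0.0], transformed)))

def gof_anomaly_score(arrival_times, compensator):
    """KS p-value of the rescaled inter-event times against Exp(1); low p-values
    indicate a poor fit, i.e. a potentially anomalous sequence."""
    taus = rescaled_interevent_times(arrival_times, compensator)
    return stats.kstest(taus, "expon").pvalue

rng = np.random.default_rng(0)
rate = 2.0
compensator = lambda t: rate * t  # homogeneous Poisson model as a stand-in

normal_seq = np.cumsum(rng.exponential(1 / rate, size=200))        # matches the model
bursty_seq = np.cumsum(rng.exponential(1 / (4 * rate), size=200))  # much denser events
print("normal sequence p-value:", gof_anomaly_score(normal_seq, compensator))
print("bursty sequence p-value:", gof_anomaly_score(bursty_seq, compensator))
```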
Related papers
- Graph Spatiotemporal Process for Multivariate Time Series Anomaly Detection with Missing Values [67.76168547245237]
We introduce a novel framework called GST-Pro, which utilizes a graph spatiotemporal process and an anomaly scorer to detect anomalies.
Our experimental results show that the GST-Pro method can effectively detect anomalies in time series data and outperforms state-of-the-art methods.
arXiv Detail & Related papers (2024-01-11T10:10:16Z)
- SMURF-THP: Score Matching-based UnceRtainty quantiFication for Transformer Hawkes Process [76.98721879039559]
We propose SMURF-THP, a score-based method for learning Transformer Hawkes process and quantifying prediction uncertainty.
Specifically, SMURF-THP learns the score function of events' arrival time based on a score-matching objective.
We conduct extensive experiments in both event type prediction and uncertainty quantification of arrival time.
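As a rough illustration of the score-matching idea only (not SMURF-THP's actual Transformer Hawkes architecture or objective), the sketch below trains a small MLP, conditioned on an assumed fixed-size history embedding, with a denoising score-matching loss on arrival times; the shapes and names are hypothetical.

```python
# Rough sketch of a denoising score-matching objective for event arrival times.
import torch
import torch.nn as nn

class ArrivalTimeScoreNet(nn.Module):
    def __init__(self, history_dim: int, hidden: int = 64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(history_dim + 1, hidden), nn.ReLU(),
            nn.Linear(hidden, 1),
        )

    def forward(self, history, t):
        # Predicted score of the (perturbed) arrival time given the history.
        return self.net(torch.cat([history, t], dim=-1))

def denoising_score_matching_loss(model, history, arrival_time, sigma=0.1):
    """Perturb the arrival time with Gaussian noise and regress the model onto
    the score of the perturbation kernel, -(t_noisy - t) / sigma**2."""
    noise = torch.randn_like(arrival_time) * sigma
    t_noisy = arrival_time + noise
    target_score = -noise / sigma**2
    return ((model(history, t_noisy) - target_score) ** 2).mean()

# Usage on random placeholder data:
model = ArrivalTimeScoreNet(history_dim=16)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
history = torch.randn(32, 16)       # stand-in for encoded event histories
arrival = torch.rand(32, 1) * 5.0   # stand-in for next arrival times
loss = denoising_score_matching_loss(model, history, arrival)
loss.backward()
optimizer.step()
```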
arXiv Detail & Related papers (2023-10-25T03:33:45Z)
- Deep Representation Learning for Prediction of Temporal Event Sets in the Continuous Time Domain [9.71405768795797]
Temporal Point Processes play an important role in predicting or forecasting events.
We propose a scalable and efficient approach based on TPPs to solve this problem.
arXiv Detail & Related papers (2023-09-29T06:46:31Z)
- Sequential Kernelized Independence Testing [101.22966794822084]
We design sequential kernelized independence tests inspired by kernelized dependence measures.
We demonstrate the power of our approaches on both simulated and real data.
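For context, the sketch below computes the batch Hilbert-Schmidt Independence Criterion (HSIC) with Gaussian kernels, the kind of kernelized dependence measure such sequential tests build on; the sequential testing machinery itself is not shown, and the bandwidth choice is arbitrary.

```python
# Batch HSIC estimate with Gaussian kernels -- the kernelized dependence measure
# that sequential kernelized independence tests build on.
import numpy as np

def gaussian_gram(x, bandwidth=1.0):
    sq_dists = np.sum((x[:, None, :] - x[None, :, :]) ** 2, axis=-1)
    return np.exp(-sq_dists / (2 * bandwidth ** 2))

def hsic(x, y, bandwidth=1.0):
    """Biased empirical HSIC: trace(K H L H) / (n - 1)**2; larger values
    indicate stronger dependence between x and y."""
    n = x.shape[0]
    K = gaussian_gram(x, bandwidth)
    L = gaussian_gram(y, bandwidth)
    H = np.eye(n) - np.ones((n, n)) / n  # centering matrix
    return np.trace(K @ H @ L @ H) / (n - 1) ** 2

rng = np.random.default_rng(0)
x = rng.normal(size=(200, 1))
y_dependent = x + 0.1 * rng.normal(size=(200, 1))
y_independent = rng.normal(size=(200, 1))
print(hsic(x, y_dependent), hsic(x, y_independent))  # dependent pair scores higher
```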
arXiv Detail & Related papers (2022-12-14T18:08:42Z)
- Anomaly Detection with Test Time Augmentation and Consistency Evaluation [13.709281244889691]
We propose a simple yet effective anomaly detection algorithm named Test Time Augmentation Anomaly Detection (TTA-AD).
We observe that a trained network gives more consistent predictions for the original and augmented versions of in-distribution data than it does for out-of-distribution data.
Experiments on various high-resolution image benchmark datasets demonstrate that TTA-AD achieves comparable or better detection performance than existing approaches.
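A minimal sketch of this consistency idea, assuming a trained classifier and a couple of placeholder augmentations (the exact augmentations and divergence used by TTA-AD may differ): inputs whose predictions change substantially under augmentation receive a high anomaly score.

```python
# Sketch of test-time-augmentation consistency scoring: inputs whose predictions
# change a lot under augmentation are treated as likely out-of-distribution.
import torch
import torch.nn.functional as F

def consistency_score(model, x, augmentations):
    """Per-sample symmetric KL between predictions on x and its augmented views;
    higher values mean less consistent predictions, i.e. more anomalous."""
    model.eval()
    with torch.no_grad():
        p = F.log_softmax(model(x), dim=-1)
        per_aug = []
        for augment in augmentations:
            q = F.log_softmax(model(augment(x)), dim=-1)
            kl_pq = (p.exp() * (p - q)).sum(dim=-1)  # KL(p || q)
            kl_qp = (q.exp() * (q - p)).sum(dim=-1)  # KL(q || p)
            per_aug.append(0.5 * (kl_pq + kl_qp))
    return torch.stack(per_aug).mean(dim=0)          # one score per input

# Placeholder classifier and augmentations (hypothetical, for illustration only):
model = torch.nn.Sequential(torch.nn.Flatten(), torch.nn.Linear(3 * 32 * 32, 10))
augmentations = [
    lambda x: torch.flip(x, dims=[-1]),        # horizontal flip
    lambda x: x + 0.05 * torch.randn_like(x),  # mild Gaussian noise
]
images = torch.randn(8, 3, 32, 32)             # stand-in batch of images
print(consistency_score(model, images, augmentations))
```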
arXiv Detail & Related papers (2022-06-06T04:27:06Z)
- Efficient Test-Time Model Adaptation without Forgetting [60.36499845014649]
Test-time adaptation seeks to tackle potential distribution shifts between training and testing data.
We propose an active sample selection criterion to identify reliable and non-redundant samples.
We also introduce a Fisher regularizer to constrain important model parameters from drastic changes.
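A minimal sketch of a Fisher-style regularizer in this spirit (the paper's exact anti-forgetting loss may differ): estimate a diagonal Fisher matrix from squared gradients and penalize movement of the corresponding parameters away from their pre-adaptation values.

```python
# Sketch of a Fisher-information regularizer that anchors important parameters
# to their pre-adaptation values during test-time adaptation (EWC-style).
import torch
import torch.nn.functional as F

def diagonal_fisher(model, loss_fn, data_loader):
    """Approximate the diagonal Fisher information by averaging squared gradients."""
    fisher = {n: torch.zeros_like(p) for n, p in model.named_parameters()}
    for x, y in data_loader:
        model.zero_grad()
        loss_fn(model(x), y).backward()
        for n, p in model.named_parameters():
            if p.grad is not None:
                fisher[n] += p.grad.detach() ** 2
    return {n: f / max(len(data_loader), 1) for n, f in fisher.items()}

def fisher_regularizer(model, anchor_params, fisher, strength=1.0):
    """Quadratic penalty discouraging drastic changes to important weights."""
    penalty = 0.0
    for n, p in model.named_parameters():
        penalty = penalty + (fisher[n] * (p - anchor_params[n]) ** 2).sum()
    return strength * penalty

# Toy usage with a placeholder model and a single fake batch:
model = torch.nn.Linear(4, 2)
anchor = {n: p.detach().clone() for n, p in model.named_parameters()}
loader = [(torch.randn(8, 4), torch.randint(0, 2, (8,)))]
fisher = diagonal_fisher(model, F.cross_entropy, loader)
print(fisher_regularizer(model, anchor, fisher))  # zero before any adaptation
# During adaptation: total_loss = adaptation_loss + fisher_regularizer(...)
```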
arXiv Detail & Related papers (2022-04-06T06:39:40Z)
- Transformers Can Do Bayesian Inference [56.99390658880008]
We present Prior-Data Fitted Networks (PFNs).
PFNs leverage in-context learning in large-scale machine learning techniques to approximate a large set of posteriors.
We demonstrate that PFNs can near-perfectly mimic Gaussian processes and also enable efficient Bayesian inference for intractable problems.
arXiv Detail & Related papers (2021-12-20T13:07:39Z)
- Using sequential drift detection to test the API economy [4.056434158960926]
The API economy refers to the widespread integration of APIs (application programming interfaces).
It is desirable to monitor usage patterns and identify when the system is being used in a way it has never been used before.
In this work we analyze both histograms and the call graph of API usage to determine whether the system's usage patterns have shifted.
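A hedged sketch of one simple histogram-based check in this spirit: compare endpoint call counts in a recent window against a reference window with a chi-square test. The paper's sequential drift-detection procedure and call-graph analysis are more elaborate, and the endpoint names below are made up.

```python
# Sketch of a histogram-based drift check on API usage: compare endpoint call
# counts in a recent window against a reference window.
from collections import Counter
from scipy import stats

def usage_histogram(calls, endpoints):
    counts = Counter(calls)
    return [counts.get(e, 0) for e in endpoints]

def drift_pvalue(reference_calls, recent_calls):
    """Chi-square test of homogeneity between two usage histograms; a small
    p-value suggests the usage pattern has shifted."""
    endpoints = sorted(set(reference_calls) | set(recent_calls))
    table = [usage_histogram(reference_calls, endpoints),
             usage_histogram(recent_calls, endpoints)]
    _, p_value, _, _ = stats.chi2_contingency(table)
    return p_value

reference = ["/login"] * 80 + ["/search"] * 15 + ["/export"] * 5
similar = ["/login"] * 40 + ["/search"] * 8 + ["/export"] * 2
shifted = ["/login"] * 20 + ["/search"] * 10 + ["/export"] * 70
print(drift_pvalue(reference, similar))  # large p-value: no evidence of drift
print(drift_pvalue(reference, shifted))  # small p-value: usage has shifted
```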
arXiv Detail & Related papers (2021-11-09T13:24:19Z)
- Recomposition vs. Prediction: A Novel Anomaly Detection for Discrete Events Based On Autoencoder [5.781280693720236]
One of the most challenging problems in the field of intrusion detection is anomaly detection for discrete event logs.
We propose DabLog, a Deep Autoencoder-Based anomaly detection method for discrete event Logs.
Our approach determines whether a sequence is normal or abnormal by analyzing (encoding) and reconstructing (decoding) the given sequence.
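A minimal reconstruction-based scoring sketch, with a small GRU sequence autoencoder standing in for DabLog's actual architecture: the sequence is encoded, decoded, and scored by its per-event reconstruction cross-entropy.

```python
# Sketch of autoencoder-based scoring for discrete event sequences: encode the
# sequence, decode it, and use the reconstruction loss as the anomaly score.
import torch
import torch.nn as nn
import torch.nn.functional as F

class SeqAutoencoder(nn.Module):
    def __init__(self, n_event_types: int, emb: int = 32, hidden: int = 64):
        super().__init__()
        self.embed = nn.Embedding(n_event_types, emb)
        self.encoder = nn.GRU(emb, hidden, batch_first=True)
        self.decoder = nn.GRU(emb, hidden, batch_first=True)
        self.out = nn.Linear(hidden, n_event_types)

    def forward(self, events):              # events: (batch, seq_len) of event IDs
        x = self.embed(events)
        _, h = self.encoder(x)               # sequence summary
        dec_out, _ = self.decoder(x, h)      # teacher-forced reconstruction
        return self.out(dec_out)             # logits per position

def anomaly_score(model, events):
    """Mean per-event reconstruction cross-entropy; higher means more abnormal."""
    with torch.no_grad():
        logits = model(events)
        return F.cross_entropy(logits.reshape(-1, logits.size(-1)),
                               events.reshape(-1))

model = SeqAutoencoder(n_event_types=50)
sequences = torch.randint(0, 50, (4, 20))   # stand-in batch of event-ID sequences
print(anomaly_score(model, sequences))       # high for an untrained model
```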
arXiv Detail & Related papers (2020-12-27T16:31:05Z)
- Few-Shot Event Detection with Prototypical Amortized Conditional Random Field [8.782210889586837]
Event Detection tends to struggle when it must recognize novel event types from only a few samples.
We present a novel unified joint model which converts the task into a few-shot tagging problem with a double-part tagging scheme.
We conduct experiments on the benchmark dataset FewEvent, and the results show that the tagging-based methods outperform existing pipeline and joint learning methods.
arXiv Detail & Related papers (2020-12-04T01:11:13Z)
- Change Point Detection in Time Series Data using Autoencoders with a Time-Invariant Representation [69.34035527763916]
Change point detection (CPD) aims to locate abrupt property changes in time series data.
Recent CPD methods have demonstrated the potential of deep learning techniques, but they often lack the ability to identify more subtle changes in the autocorrelation statistics of the signal.
We employ an autoencoder-based methodology with a novel loss function, through which the used autoencoders learn a partially time-invariant representation that is tailored for CPD.
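A sketch of dissimilarity-based change point scoring in this spirit, with simple window statistics standing in for the learned time-invariant representation: featurize sliding windows and look for peaks in the distance between consecutive window features.

```python
# Sketch of dissimilarity-based change point scoring: featurize sliding windows
# and flag peaks in the distance between consecutive window features.
import numpy as np

def window_features(series, window=50, step=10):
    feats = []
    for start in range(0, len(series) - window + 1, step):
        w = series[start:start + window]
        feats.append([w.mean(), w.std()])   # placeholder for a learned encoding
    return np.asarray(feats)

def change_point_scores(series, window=50, step=10):
    """Euclidean distance between consecutive window features; peaks suggest
    abrupt property changes in the signal."""
    feats = window_features(series, window, step)
    return np.linalg.norm(np.diff(feats, axis=0), axis=1)

rng = np.random.default_rng(0)
series = np.concatenate([rng.normal(0, 1, 500), rng.normal(3, 1, 500)])  # mean shift at t=500
scores = change_point_scores(series)
print("highest score at window index:", scores.argmax())  # near the change point
```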
arXiv Detail & Related papers (2020-08-21T15:03:21Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.