Time Series Foundational Models: Their Role in Anomaly Detection and Prediction
- URL: http://arxiv.org/abs/2412.19286v1
- Date: Thu, 26 Dec 2024 17:15:30 GMT
- Title: Time Series Foundational Models: Their Role in Anomaly Detection and Prediction
- Authors: Chathurangi Shyalika, Harleen Kaur Bagga, Ahan Bhatt, Renjith Prasad, Alaa Al Ghazo, Amit Sheth
- Abstract summary: Time series foundational models (TSFM) have gained prominence in time series forecasting.
This paper critically evaluates the efficacy of TSFM in anomaly detection and prediction tasks.
- Abstract: Time series foundational models (TSFM) have gained prominence in time series forecasting, promising state-of-the-art performance across various applications. However, their application in anomaly detection and prediction remains underexplored, with growing concerns regarding their black-box nature, lack of interpretability, and limited applicability. This paper critically evaluates the efficacy of TSFM in anomaly detection and prediction tasks. We systematically analyze TSFM across multiple datasets, including those characterized by the absence of discernible patterns, trends, and seasonality. Our analysis shows that while TSFMs can be extended for anomaly detection and prediction, traditional statistical and deep learning models often match or outperform TSFM in these tasks. Additionally, TSFMs require high computational resources but fail to capture sequential dependencies effectively or to improve performance in few-shot or zero-shot scenarios. The preprocessed datasets, code to reproduce the results, and supplementary materials are available at https://github.com/smtmnfg/TSFM.
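Extending a forecasting-oriented TSFM to anomaly detection is typically done by thresholding forecast residuals. The sketch below is a minimal, hypothetical illustration of that pipeline, not the paper's actual setup: `tsfm_forecast` is a stand-in (here a simple moving average) for any pretrained forecaster.

```python
from statistics import mean, pstdev

def tsfm_forecast(history, horizon=1):
    # Stand-in for a pretrained TSFM checkpoint; a 24-step moving
    # average is used here purely so the sketch is runnable.
    recent = history[-24:]
    return [mean(recent)] * horizon

def detect_anomalies(series, warmup=48, z_thresh=3.0):
    # Score each point by its one-step-ahead forecast residual,
    # then flag residuals more than z_thresh standard deviations out.
    residuals = []
    for t in range(warmup, len(series)):
        pred = tsfm_forecast(series[:t])[0]
        residuals.append(series[t] - pred)
    mu, sigma = mean(residuals), pstdev(residuals)
    flags = [False] * warmup
    flags += [abs(r - mu) / sigma > z_thresh for r in residuals]
    return flags
```

Swapping `tsfm_forecast` for calls into an actual TSFM (e.g. a zero-shot forecaster) turns this into the kind of pipeline the paper evaluates.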
Related papers
- Evaluating Time Series Foundation Models on Noisy Periodic Time Series [0.0]
This paper presents an empirical study evaluating the performance of time series foundation models (TSFMs) over two datasets constituting noisy periodic time series.
Our findings demonstrate that while for time series with bounded periods, TSFMs can match or outperform the statistical approaches, their forecasting abilities deteriorate with longer periods, higher noise levels, lower sampling rates and more complex shapes of the time series.
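The degradation described above is easy to reproduce with synthetic data. The sketch below (an illustrative baseline, not the paper's benchmark) generates noisy sine waves and measures how a seasonal-naive forecast degrades as the noise level rises:

```python
import math
import random

def noisy_periodic(n, period, noise_std, seed=0):
    # Sine wave of the given period plus Gaussian noise.
    rng = random.Random(seed)
    return [math.sin(2 * math.pi * t / period) + rng.gauss(0, noise_std)
            for t in range(n)]

def seasonal_naive_mae(series, period):
    # Forecast each point with the value observed one period earlier.
    errors = [abs(series[t] - series[t - period])
              for t in range(period, len(series))]
    return sum(errors) / len(errors)
```

Running `seasonal_naive_mae` at increasing `noise_std` values gives a noise-sensitivity curve against which a forecasting model's errors can be compared.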
arXiv Detail & Related papers (2025-01-01T16:36:21Z)
- Are Large Language Models Useful for Time Series Data Analysis? [3.44393516559102]
Time series data plays a critical role across diverse domains such as healthcare, energy, and finance.
This study investigates whether large language models (LLMs) are effective for time series data analysis.
arXiv Detail & Related papers (2024-12-16T02:47:44Z)
- Rating Multi-Modal Time-Series Forecasting Models (MM-TSFM) for Robustness Through a Causal Lens [10.103561529332184]
We focus on multi-modal time-series forecasting, where imprecision due to noisy or incorrect data can lead to erroneous predictions.
We introduce a rating methodology to assess the robustness of Multi-Modal Time-Series Forecasting Models.
arXiv Detail & Related papers (2024-06-12T17:39:16Z)
- MGCP: A Multi-Grained Correlation based Prediction Network for Multivariate Time Series [54.91026286579748]
We propose a Multi-Grained Correlations-based Prediction Network.
It simultaneously considers correlations at three levels to enhance prediction performance.
It employs adversarial training with an attention mechanism-based predictor and a conditional discriminator to optimize prediction results at the coarse-grained level.
arXiv Detail & Related papers (2024-05-30T03:32:44Z)
- SEMF: Supervised Expectation-Maximization Framework for Predicting Intervals [0.8192907805418583]
Supervised Expectation-Maximization Framework (SEMF) is a versatile and model-agnostic approach for generating prediction intervals in datasets with complete or missing data.
SEMF extends the Expectation-Maximization algorithm, traditionally used in unsupervised learning, to a supervised context, leveraging latent variable modeling for uncertainty estimation.
arXiv Detail & Related papers (2024-05-28T13:43:34Z)
- Graph Spatiotemporal Process for Multivariate Time Series Anomaly Detection with Missing Values [67.76168547245237]
We introduce a novel framework called GST-Pro, which utilizes a graph spatiotemporal process and an anomaly scorer to detect anomalies.
Our experimental results show that the GST-Pro method can effectively detect anomalies in time series data and outperforms state-of-the-art methods.
arXiv Detail & Related papers (2024-01-11T10:10:16Z)
- MTS-Mixers: Multivariate Time Series Forecasting via Factorized Temporal and Channel Mixing [18.058617044421293]
This paper investigates the contributions and deficiencies of attention mechanisms on the performance of time series forecasting.
We propose MTS-Mixers, which use two factorized modules to capture temporal and channel dependencies.
Experimental results on several real-world datasets show that MTS-Mixers outperform existing Transformer-based models with higher efficiency.
arXiv Detail & Related papers (2023-02-09T08:52:49Z)
- DiffSTG: Probabilistic Spatio-Temporal Graph Forecasting with Denoising Diffusion Models [53.67562579184457]
This paper focuses on probabilistic STG forecasting, which is challenging due to the difficulty in modeling uncertainties and complex dependencies.
We present the first attempt to generalize the popular denoising diffusion models to STGs, leading to a novel non-autoregressive framework called DiffSTG.
Our approach combines the intrinsic spatio-temporal learning capabilities of STNNs with the uncertainty measurements of diffusion models.
arXiv Detail & Related papers (2023-01-31T13:42:36Z)
- Spatio-temporal predictive tasks for abnormal event detection in videos [60.02503434201552]
We propose new constrained pretext tasks to learn object level normality patterns.
Our approach consists of learning a mapping between down-scaled visual queries and their corresponding normal appearance and motion characteristics.
Experiments on several benchmark datasets demonstrate the effectiveness of our approach to localize and track anomalies.
arXiv Detail & Related papers (2022-10-27T19:45:12Z)
- Model-Attentive Ensemble Learning for Sequence Modeling [86.4785354333566]
We present Model-Attentive Ensemble learning for Sequence modeling (MAES).
MAES is a mixture of time-series experts which leverages an attention-based gating mechanism to specialize the experts on different sequence dynamics and adaptively weight their predictions.
We demonstrate that MAES significantly outperforms popular sequence models on datasets subject to temporal shift.
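A mixture-of-experts forecaster of this flavor can be sketched in a few lines. The gate below weights experts by their recent one-step error; this simplified, error-based gate stands in for MAES's attention-based mechanism, and the two toy experts in the usage note are illustrative only.

```python
import math

def softmax(xs):
    # Numerically stable softmax over a list of scores.
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    total = sum(exps)
    return [e / total for e in exps]

def moe_forecast(history, experts, lookback=5):
    # Score each expert by its mean absolute one-step error over the
    # last `lookback` points, then weight predictions by softmax(-error).
    errors = []
    for expert in experts:
        errs = [abs(expert(history[:t]) - history[t])
                for t in range(len(history) - lookback, len(history))]
        errors.append(sum(errs) / lookback)
    weights = softmax([-e for e in errors])
    forecast = sum(w * expert(history) for w, expert in zip(weights, experts))
    return forecast, weights
```

On a trending series a last-value expert (`lambda h: h[-1]`) wins the gate; on a mean-reverting series the weights shift toward a long-run-average expert (`lambda h: sum(h) / len(h)`).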
arXiv Detail & Related papers (2021-02-23T05:23:35Z)
- A Multi-Channel Neural Graphical Event Model with Negative Evidence [76.51278722190607]
Event datasets are sequences of events of various types occurring irregularly over the time-line.
We propose a non-parametric deep neural network approach in order to estimate the underlying intensity functions.
arXiv Detail & Related papers (2020-02-21T23:10:50Z)
This list is automatically generated from the titles and abstracts of the papers on this site. This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.