Foundation Models for Time Series: A Survey
- URL: http://arxiv.org/abs/2504.04011v1
- Date: Sat, 05 Apr 2025 01:27:55 GMT
- Title: Foundation Models for Time Series: A Survey
- Authors: Siva Rama Krishna Kottapalli, Karthik Hubli, Sandeep Chandrashekhara, Garima Jain, Sunayana Hubli, Gayathri Botla, Ramesh Doddaiah
- Abstract summary: Transformer-based foundation models have emerged as a dominant paradigm in time series analysis. This survey introduces a novel taxonomy to categorize them across several dimensions.
- Score: 0.27835153780240135
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Transformer-based foundation models have emerged as a dominant paradigm in time series analysis, offering unprecedented capabilities in forecasting, anomaly detection, classification, trend analysis, and many other analytical tasks. This survey provides a comprehensive overview of the current state-of-the-art pre-trained foundation models, introducing a novel taxonomy to categorize them across several dimensions. Specifically, we classify models by their architectural design, distinguishing between those leveraging patch-based representations and those operating directly on raw sequences. The taxonomy further captures whether the models provide probabilistic or deterministic predictions, and whether they are designed for univariate time series or can handle multivariate time series out of the box. Additionally, the taxonomy encompasses model scale and complexity, highlighting differences between lightweight architectures and large-scale foundation models. A unique aspect of this survey is its categorization by the type of objective function employed during the training phase. By synthesizing these perspectives, this survey serves as a resource for researchers and practitioners, providing insights into current trends and identifying promising directions for future research in transformer-based time series modeling.
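To make the architectural distinction concrete, here is a minimal sketch contrasting a patch-based representation (in the style of PatchTST) with the raw-sequence view, where every time step is its own token. It is illustrative only; `patchify` is a hypothetical helper, not code from the survey.

```python
# Contrast the two input representations the taxonomy distinguishes:
# patch-based tokens vs. raw per-step values.
import numpy as np

def patchify(series: np.ndarray, patch_len: int, stride: int) -> np.ndarray:
    """Split a univariate series into overlapping patches (PatchTST-style).

    Each patch becomes one token for the transformer, shrinking the
    sequence length from T to roughly (T - patch_len) / stride + 1.
    """
    n_patches = (len(series) - patch_len) // stride + 1
    return np.stack([series[i * stride : i * stride + patch_len]
                     for i in range(n_patches)])

series = np.sin(np.linspace(0, 8 * np.pi, 512))   # toy input, T = 512
patches = patchify(series, patch_len=16, stride=8)
print(patches.shape)          # (63, 16): 63 tokens of 16 values each
raw_tokens = series[:, None]  # raw-sequence view: 512 tokens of size 1
print(raw_tokens.shape)       # (512, 1)
```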
Related papers
- Harnessing Vision Models for Time Series Analysis: A Survey [72.09716244582684]
This survey discusses the advantages of vision models over LLMs in time series analysis. It provides a comprehensive and in-depth overview of the existing methods, with dual views of a detailed taxonomy. It also addresses the challenges in the pre- and post-processing steps involved in this framework.
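As an illustration of the kind of pre-processing such frameworks rely on (a generic example, not necessarily the one this survey covers), a series can be imaged via a Gramian Angular Summation Field before being fed to a vision model:

```python
# Encode a 1-D time series as a 2-D image a vision model can consume.
import numpy as np

def gasf(x: np.ndarray) -> np.ndarray:
    """Gramian Angular Summation Field of a univariate series."""
    x = (x - x.min()) / (x.max() - x.min()) * 2 - 1   # rescale to [-1, 1]
    phi = np.arccos(np.clip(x, -1.0, 1.0))            # angular encoding
    return np.cos(phi[:, None] + phi[None, :])        # (T, T) image

series = np.sin(np.linspace(0, 4 * np.pi, 64))
image = gasf(series)
print(image.shape)   # (64, 64)
```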
arXiv Detail & Related papers (2025-02-13T00:42:11Z)
- VSFormer: Value and Shape-Aware Transformer with Prior-Enhanced Self-Attention for Multivariate Time Series Classification [47.92529531621406]
We propose a novel method, VSFormer, that incorporates both discriminative patterns (shape) and numerical information (value).
In addition, we extract class-specific prior information derived from supervised information to enrich the positional encoding.
Extensive experiments on all 30 UEA archived datasets demonstrate the superior performance of our method compared to SOTA models.
arXiv Detail & Related papers (2024-12-21T07:31:22Z)
- A Comprehensive Survey of Time Series Forecasting: Architectural Diversity and Open Challenges [37.20655606514617]
Time series forecasting is a critical task that provides key information for decision-making across various fields. Deep learning architectures such as MLPs, CNNs, RNNs, and GNNs have been developed and applied to solve time series forecasting problems. Transformer models, which excel at handling long-term dependencies, have become significant architectural components for time series forecasting.
arXiv Detail & Related papers (2024-10-24T07:43:55Z)
- Analyzing Deep Transformer Models for Time Series Forecasting via Manifold Learning [4.910937238451485]
Transformer models have consistently achieved remarkable results in various domains such as natural language processing and computer vision.
Despite ongoing research efforts to better understand these models, the field still lacks a comprehensive understanding.
Time series data can be more challenging to interpret and analyze than image and text data.
arXiv Detail & Related papers (2024-10-17T17:32:35Z)
- Deep Time Series Models: A Comprehensive Survey and Benchmark [74.28364194333447]
Time series data is of great significance in real-world scenarios.
Recent years have witnessed remarkable breakthroughs in the time series community.
We release Time Series Library (TSLib) as a fair benchmark of deep time series models for diverse analysis tasks.
arXiv Detail & Related papers (2024-07-18T08:31:55Z)
- A Survey on Diffusion Models for Time Series and Spatio-Temporal Data [92.1255811066468]
We review the use of diffusion models in time series and spatio-temporal data, categorizing them by model, task type, data modality, and practical application domain.
We categorize diffusion models into unconditioned and conditioned types, and discuss time series and spatio-temporal data separately.
Our survey covers their application extensively in various fields including healthcare, recommendation, climate, energy, audio, and transportation.
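For context on what these models share, the sketch below shows the standard DDPM forward (noising) process applied to a toy series; names like `q_sample` are illustrative, and conditioning enters only in the learned reverse process.

```python
# DDPM forward (noising) process on a toy time series.
import numpy as np

rng = np.random.default_rng(0)
T_steps = 1000
betas = np.linspace(1e-4, 0.02, T_steps)    # standard linear schedule
alpha_bars = np.cumprod(1.0 - betas)

def q_sample(x0: np.ndarray, t: int) -> np.ndarray:
    """Draw x_t ~ q(x_t | x_0) = N(sqrt(a_bar_t) x_0, (1 - a_bar_t) I)."""
    noise = rng.standard_normal(x0.shape)
    return np.sqrt(alpha_bars[t]) * x0 + np.sqrt(1.0 - alpha_bars[t]) * noise

x0 = np.sin(np.linspace(0, 4 * np.pi, 128))  # clean toy series
x_noisy = q_sample(x0, t=500)                # partially destroyed series
# A conditional model learns to reverse this given side information
# (e.g., observed history); an unconditional model reverses it from noise alone.
```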
arXiv Detail & Related papers (2024-04-29T17:19:40Z)
- Analysis and modeling to forecast in time series: a systematic review [0.0]
This paper surveys state-of-the-art methods and models dedicated to time series analysis and modeling, with the final aim of prediction.
This review aims to offer a structured and comprehensive view of the full process flow, encompassing time series decomposition, stationarity tests, modeling, and forecasting.
arXiv Detail & Related papers (2021-03-31T23:48:46Z)
- Model-Attentive Ensemble Learning for Sequence Modeling [86.4785354333566]
We present Model-Attentive Ensemble learning for Sequence modeling (MAES).
MAES is a mixture of time-series experts that leverages an attention-based gating mechanism to specialize the experts on different sequence dynamics and adaptively weight their predictions.
We demonstrate that MAES significantly outperforms popular sequence models on datasets subject to temporal shift.
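A minimal sketch of the general idea, under my reading rather than the authors' code: a gating network scores the input context and softmax-weights the experts' forecasts. All names here are illustrative.

```python
# Attention-gated mixture-of-experts prediction for a time series.
import numpy as np

rng = np.random.default_rng(1)

def softmax(z: np.ndarray) -> np.ndarray:
    z = z - z.max()
    e = np.exp(z)
    return e / e.sum()

context = rng.standard_normal(32)       # recent window of the series

# Stand-ins for trained experts specialized on different dynamics.
experts = [
    lambda c: c[-1],                    # persistence expert
    lambda c: c[-5:].mean(),            # local-mean expert
    lambda c: c[-1] + (c[-1] - c[-2]),  # linear-trend expert
]
W_gate = rng.standard_normal((len(experts), context.size))  # gating weights

gate = softmax(W_gate @ context)        # attention over experts
y_hat = sum(g * f(context) for g, f in zip(gate, experts))
print(gate, y_hat)
```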
arXiv Detail & Related papers (2021-02-23T05:23:35Z)
- A Survey on Principles, Models and Methods for Learning from Irregularly Sampled Time Series [18.224344440110862]
Irregularly sampled time series data arise naturally in many application domains including biology, ecology, climate science, astronomy, and health.
We first describe several axes along which approaches to learning from irregularly sampled time series differ.
We then survey the recent literature organized primarily along the axis of modeling primitives.
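One simple primitive from this family (a generic illustration, not a method endorsed by the survey) is to append the elapsed time since the previous observation to each value, so a standard sequence model can condition on the gaps:

```python
# Augment irregularly sampled observations with inter-arrival deltas.
import numpy as np

timestamps = np.array([0.0, 0.7, 0.9, 3.2, 3.3, 7.8])  # uneven arrival times
values     = np.array([1.0, 1.2, 1.1, 0.4, 0.5, 2.0])

deltas = np.diff(timestamps, prepend=timestamps[0])     # gap to previous sample
features = np.column_stack([values, deltas])            # (T, 2) input sequence
print(features)
```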
arXiv Detail & Related papers (2020-11-30T23:41:47Z)
- Improving the Reconstruction of Disentangled Representation Learners via Multi-Stage Modeling [54.94763543386523]
Current autoencoder-based disentangled representation learning methods achieve disentanglement by penalizing the (aggregate) posterior to encourage statistical independence of the latent factors.
We present a novel multi-stage modeling approach where the disentangled factors are first learned using a penalty-based disentangled representation learning method.
Then, the low-quality reconstruction is improved with another deep generative model that is trained to model the missing correlated latent variables.
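For intuition about the first stage, here is a minimal numpy sketch of a penalty-based objective; the beta-weighted per-sample KL of a beta-VAE stands in for the aggregate-posterior penalties the paper discusses, and all names are illustrative.

```python
# Penalty-based first-stage objective: reconstruction + beta * KL.
# A large beta encourages independent latent factors at the cost of
# reconstruction quality -- the gap a second-stage model then fills.
import numpy as np

def kl_diag_gaussian(mu: np.ndarray, logvar: np.ndarray) -> float:
    """KL( N(mu, diag(exp(logvar))) || N(0, I) ), summed over latent dims."""
    return 0.5 * np.sum(np.exp(logvar) + mu**2 - 1.0 - logvar)

def stage1_loss(x, x_recon, mu, logvar, beta: float = 4.0) -> float:
    recon = np.mean((x - x_recon) ** 2)   # reconstruction error
    return recon + beta * kl_diag_gaussian(mu, logvar)

x = np.ones(8); x_recon = 0.9 * x
mu = 0.1 * np.ones(4); logvar = np.zeros(4)
print(stage1_loss(x, x_recon, mu, logvar))
```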
arXiv Detail & Related papers (2020-10-25T18:51:15Z)
This list is automatically generated from the titles and abstracts of the papers on this site.