Empowering Time Series Analysis with Foundation Models: A Comprehensive Survey
- URL: http://arxiv.org/abs/2405.02358v4
- Date: Wed, 17 Sep 2025 09:01:42 GMT
- Title: Empowering Time Series Analysis with Foundation Models: A Comprehensive Survey
- Authors: Jiexia Ye, Yongzi Yu, Weiqi Zhang, Le Wang, Jia Li, Fugee Tsung
- Abstract summary: Time series data are ubiquitous across diverse real-world applications. Traditional approaches are largely task-specific, offering limited functionality and poor transferability. Foundation models have revolutionized NLP and CV with their remarkable cross-task transferability.
- Score: 32.794229758722985
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Time series data are ubiquitous across diverse real-world applications, making time series analysis critically important. Traditional approaches are largely task-specific, offering limited functionality and poor transferability. In recent years, foundation models have revolutionized NLP and CV with their remarkable cross-task transferability, zero-/few-shot learning capabilities, and multimodal integration capacity. This success has motivated increasing efforts to explore foundation models for addressing time series modeling challenges. Although some tutorials and surveys were published in the early stages of this field, the rapid pace of recent developments necessitates a more comprehensive and in-depth synthesis to cover the latest advances. Our survey aims to fill this gap by introducing a modality-aware, challenge-oriented perspective, which reveals how foundation models pre-trained on different modalities face distinct hurdles when adapted to time series tasks. Building on this perspective, we propose a taxonomy of existing works organized by pre-training modality (time series, language, and vision), analyze modality-specific challenges and categorize corresponding solutions, discussing their advantages and limitations. Beyond this, we review real-world applications to illustrate domain-specific advancements, provide open-source codes, and conclude with potential future research directions in this rapidly evolving field.
Related papers
- Large Language models for Time Series Analysis: Techniques, Applications, and Challenges [10.347387584258222]
Large Language Models (LLMs) offer transformative potential by leveraging their cross-modal knowledge integration and inherent attention mechanisms for time series analysis. This paper presents a systematic review of pre-trained LLM-driven time series analysis. It focuses on enabling techniques, potential applications, and open challenges.
arXiv Detail & Related papers (2025-05-21T04:45:11Z) - A Survey of Efficient Reasoning for Large Reasoning Models: Language, Multimodality, and Beyond [88.5807076505261]
Large Reasoning Models (LRMs) have demonstrated strong performance gains by scaling up the length of Chain-of-Thought (CoT) reasoning during inference. A growing concern lies in their tendency to produce excessively long reasoning traces. This inefficiency introduces significant challenges for training, inference, and real-world deployment.
arXiv Detail & Related papers (2025-03-27T15:36:30Z) - Empowering Time Series Analysis with Synthetic Data: A Survey and Outlook in the Era of Foundation Models [104.17057231661371]
Time series analysis is crucial for understanding the dynamics of complex systems. Recent advances in foundation models have led to task-agnostic Time Series Foundation Models (TSFMs) and Large Language Model-based Time Series Models (TSLLMs). Their success depends on large, diverse, and high-quality datasets, which are challenging to build due to regulatory, diversity, quality, and quantity constraints. This survey provides a comprehensive review of synthetic data for TSFMs and TSLLMs, analyzing data generation strategies, their role in model pretraining, fine-tuning, and evaluation, and identifying future research directions.
arXiv Detail & Related papers (2025-03-14T13:53:46Z) - Deep Learning for Time Series Forecasting: A Survey [12.748035569833451]
We review prior work and summarize the general paradigms of Deep Time Series Forecasting (DTSF) in terms of model architectures.
We take an innovative approach by focusing on the composition of time series and systematically explain important feature extraction methods.
arXiv Detail & Related papers (2025-03-13T09:32:01Z) - TimesBERT: A BERT-Style Foundation Model for Time Series Understanding [72.64824086839631]
GPT-style models have been positioned as foundation models for time series forecasting.
The potential of BERT-style architectures has not been fully unlocked for time series understanding.
We design TimesBERT to learn generic representations of time series.
Our model is pre-trained on 260 billion time points across diverse domains.
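The abstract does not spell out the pre-training objective, but BERT-style pre-training on time series typically means masked-patch reconstruction. The following is a minimal, hypothetical sketch of that generic recipe, not TimesBERT's actual code: split each series into patches, hide a random subset, and train a model to reconstruct the hidden patches.

```python
import torch

def mask_patches(series, patch_len=16, mask_ratio=0.3):
    """Generic BERT-style masking for time series: split a
    (batch, length) tensor into non-overlapping patches, zero out a
    random subset, and return the masked input plus the boolean mask
    so a model can be trained to reconstruct the hidden patches."""
    b, l = series.shape
    n = l // patch_len
    patches = series[:, : n * patch_len].reshape(b, n, patch_len)
    mask = torch.rand(b, n) < mask_ratio   # True marks hidden patches
    masked = patches.clone()
    masked[mask] = 0.0                     # replace hidden patches with zeros
    return masked.reshape(b, n * patch_len), mask

x = torch.randn(8, 256)                    # a toy batch of 8 series
masked_x, mask = mask_patches(x)           # train: reconstruct x at masked positions
```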
arXiv Detail & Related papers (2025-02-28T17:14:44Z) - Time-MQA: Time Series Multi-Task Question Answering with Context Enhancement [55.2439260314328]
Time Series Multi-Task Question Answering (Time-MQA) is a unified framework that enables natural language queries across multiple time series tasks.
Central to Time-MQA is the TSQA dataset, a large-scale dataset containing ~200k question-answer pairs.
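The exact prompt format of Time-MQA is not given here; as a hedged illustration only, the sketch below shows one plausible way a time series and a question could be serialised into a single natural-language query. The formatting choices are assumptions, not the TSQA specification.

```python
def build_prompt(series, question):
    """Hypothetical serialisation of a numeric series plus a question
    into one text prompt for a language-model-based QA system."""
    values = ", ".join(f"{v:.2f}" for v in series)
    return f"Time series: [{values}]\nQuestion: {question}\nAnswer:"

print(build_prompt([1.2, 1.5, 1.1, 1.8], "Is there an upward trend?"))
```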
arXiv Detail & Related papers (2025-02-26T13:47:13Z) - Harnessing Vision Models for Time Series Analysis: A Survey [85.65718718797643]
This survey discusses the advantages of vision models over LLMs in time series analysis. It provides a comprehensive and in-depth overview of existing methods, organized in a detailed dual-view taxonomy. We also address the challenges in the pre- and post-processing steps involved in this framework.
arXiv Detail & Related papers (2025-02-13T00:42:11Z) - General Time-series Model for Universal Knowledge Representation of Multivariate Time-Series data [61.163542597764796]
We show that time series with different time granularities (or corresponding frequency resolutions) exhibit distinct joint distributions in the frequency domain.
A novel Fourier knowledge attention mechanism is proposed to enable learning time-aware representations from both the temporal and frequency domains.
An autoregressive blank-infilling pre-training framework is incorporated into time series analysis for the first time, yielding a generative, task-agnostic pre-training strategy.
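The paper's Fourier knowledge attention mechanism is not specified in this summary. As a rough sketch of the generic ingredient it builds on, self-attention over FFT-derived features, consider the following toy example; the feature construction and dimensions are assumptions for illustration.

```python
import torch
import torch.nn.functional as F

def frequency_attention(x, d_model=32):
    """Toy illustration: take the real FFT of each series, stack
    amplitude and phase as per-frequency features, then run ordinary
    scaled dot-product self-attention across frequency bins. The
    linear projection is randomly initialised, shown only for shapes."""
    spec = torch.fft.rfft(x, dim=-1)                      # (batch, n_freq), complex
    feats = torch.stack([spec.abs(), spec.angle()], -1)   # (batch, n_freq, 2)
    h = torch.nn.Linear(2, d_model)(feats)                # (batch, n_freq, d_model)
    return F.scaled_dot_product_attention(h, h, h)        # self-attention over bins

out = frequency_attention(torch.randn(4, 128))            # shape (4, 65, 32)
```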
arXiv Detail & Related papers (2025-02-05T15:20:04Z) - Low-Rank Adaptation for Foundation Models: A Comprehensive Review [56.341827242332194]
Low-Rank Adaptation (LoRA) has emerged as a highly promising approach for mitigating these challenges. This survey provides the first comprehensive review of LoRA techniques beyond large language models to general foundation models.
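To make the low-rank idea concrete, here is a minimal NumPy sketch of the core LoRA update the survey covers: a frozen weight matrix plus a trainable rank-r correction, scaled by alpha/r. The shapes and zero-initialisation of B follow the common convention; nothing here is specific to any one method in the review.

```python
import numpy as np

def lora_forward(x, W, A, B, alpha=16):
    """LoRA-adapted linear layer: the frozen weight W (d_out x d_in)
    is augmented by a low-rank update B @ A, with A (r x d_in) and
    B (d_out x r), r << min(d_in, d_out). Only A and B are trained."""
    r = A.shape[0]
    return x @ W.T + (x @ A.T @ B.T) * (alpha / r)

rng = np.random.default_rng(0)
W = rng.normal(size=(64, 64))              # frozen pre-trained weight
A = rng.normal(scale=0.01, size=(4, 64))   # trainable down-projection, rank 4
B = np.zeros((64, 4))                      # trainable up-projection, zero-initialised
x = rng.normal(size=(2, 64))               # a batch of two inputs
y = lora_forward(x, W, A, B)               # equals x @ W.T at initialisation
```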
arXiv Detail & Related papers (2024-12-31T09:38:55Z) - Towards Generalisable Time Series Understanding Across Domains [10.350643783811174]
We introduce a novel pre-training paradigm specifically designed to handle time series heterogeneity.
We propose a tokeniser with learnable domain signatures, a dual masking strategy, and a normalised cross-correlation loss.
Our code and pre-trained weights are available at https://www.oetu.com/oetu/otis.
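Of the three ingredients, the normalised cross-correlation loss is the easiest to sketch. The snippet below shows one standard way to implement such a loss in PyTorch; whether OTiS uses exactly this formulation is an assumption.

```python
import torch

def ncc_loss(pred, target, eps=1e-8):
    """Negative normalised cross-correlation between two (batch, length)
    tensors. Each series is mean-centred and scaled to unit norm, so the
    loss rewards matching the shape of the signal rather than its
    absolute scale or offset; it is 0 when the shapes match exactly."""
    pred = pred - pred.mean(dim=-1, keepdim=True)
    target = target - target.mean(dim=-1, keepdim=True)
    pred = pred / (pred.norm(dim=-1, keepdim=True) + eps)
    target = target / (target.norm(dim=-1, keepdim=True) + eps)
    ncc = (pred * target).sum(dim=-1)   # per-series correlation in [-1, 1]
    return (1.0 - ncc).mean()
```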
arXiv Detail & Related papers (2024-10-09T17:09:30Z) - TSI: A Multi-View Representation Learning Approach for Time Series Forecasting [29.05140751690699]
This study introduces a novel multi-view approach for time series forecasting.
It integrates trend and seasonal representations with an Independent Component Analysis (ICA)-based representation.
This approach offers a holistic understanding of time series data, going beyond traditional models that often miss nuanced, nonlinear relationships.
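A toy sketch of producing the three views the abstract names (trend, seasonal, and ICA-based representations) follows; how TSI actually fuses these views is not stated here, so the concatenation at the end is purely illustrative.

```python
import numpy as np
from sklearn.decomposition import FastICA
from statsmodels.tsa.seasonal import seasonal_decompose

# Synthetic two-channel series: linear trends plus a 50-step seasonality.
rng = np.random.default_rng(0)
t = np.arange(400)
series = np.column_stack([
    0.01 * t + np.sin(2 * np.pi * t / 50) + 0.3 * rng.normal(size=400),
    -0.02 * t + np.cos(2 * np.pi * t / 50) + 0.3 * rng.normal(size=400),
])

# Views 1 and 2: per-channel trend and seasonal components.
parts = [seasonal_decompose(series[:, i], period=50, extrapolate_trend="freq")
         for i in range(series.shape[1])]
trend = np.column_stack([p.trend for p in parts])
seasonal = np.column_stack([p.seasonal for p in parts])

# View 3: statistically independent sources recovered by FastICA.
independent = FastICA(n_components=2, random_state=0).fit_transform(series)

features = np.concatenate([trend, seasonal, independent], axis=1)  # (400, 6)
```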
arXiv Detail & Related papers (2024-09-30T02:11:57Z) - Deep Time Series Models: A Comprehensive Survey and Benchmark [74.28364194333447]
Time series data is of great significance in real-world scenarios.
Recent years have witnessed remarkable breakthroughs in the time series community.
We release Time Series Library (TSLib) as a fair benchmark of deep time series models for diverse analysis tasks.
arXiv Detail & Related papers (2024-07-18T08:31:55Z) - MOMENT: A Family of Open Time-series Foundation Models [19.0845213853369]
We introduce MOMENT, a family of open-source foundation models for general-purpose time series analysis.
We compile a collection of public time series, called the Time series Pile, and systematically tackle time series-specific challenges.
We build on recent work to design a benchmark to evaluate time series foundation models on diverse tasks and datasets in limited supervision settings.
arXiv Detail & Related papers (2024-02-06T10:48:46Z) - Empowering Time Series Analysis with Large Language Models: A Survey [24.202539098675953]
We provide a systematic overview of methods that leverage large language models for time series analysis.
Specifically, we first state the challenges and motivations of applying language models in the context of time series.
Next, we categorize existing methods into different groups (i.e., direct query, tokenization, prompt design, fine-tuning, and model integration) and highlight the key ideas within each group.
arXiv Detail & Related papers (2024-02-05T16:46:35Z) - Position: What Can Large Language Models Tell Us about Time Series Analysis [69.70906014827547]
We argue that current large language models (LLMs) have the potential to revolutionize time series analysis.
Such advancement could unlock a wide range of possibilities, including time series modality switching and question answering.
arXiv Detail & Related papers (2024-02-05T04:17:49Z) - Timer: Generative Pre-trained Transformers Are Large Time Series Models [83.03091523806668]
This paper aims at the early development of large time series models (LTSM).
During pre-training, we curate large-scale datasets with up to 1 billion time points.
To meet diverse application needs, we convert forecasting, imputation, and anomaly detection of time series into a unified generative task.
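Conceptually, such a unification treats every task as "generate the values at hidden positions". The sketch below is a hypothetical illustration of that framing, not Timer's implementation: the task only changes which positions are masked.

```python
import numpy as np

def to_generative_task(series, task, horizon=24, seed=0):
    """Hypothetical unification: return (context, mask) where a
    generative model must produce values at the masked positions,
    conditioned on the visible ones."""
    series = np.asarray(series, dtype=float)
    mask = np.zeros(len(series), dtype=bool)
    if task == "forecasting":            # hide the future
        mask[-horizon:] = True
    elif task == "imputation":           # hide random interior points
        idx = np.random.default_rng(seed).choice(len(series), horizon, replace=False)
        mask[idx] = True
    elif task == "anomaly_detection":    # regenerate everything; points with
        mask[:] = True                   # large reconstruction error are flagged
    context = np.where(mask, np.nan, series)
    return context, mask

context, mask = to_generative_task(np.sin(np.arange(200) / 10), "forecasting")
```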
arXiv Detail & Related papers (2024-02-04T06:55:55Z) - Fine-Grained Zero-Shot Learning: Advances, Challenges, and Prospects [84.36935309169567]
We present a broad review of recent advances for fine-grained analysis in zero-shot learning (ZSL).
We first provide a taxonomy of existing methods and techniques with a thorough analysis of each category.
Then, we summarize the benchmark, covering publicly available datasets, models, implementations, and further details, organized as a library.
arXiv Detail & Related papers (2024-01-31T11:51:24Z) - Large Models for Time Series and Spatio-Temporal Data: A Survey and Outlook [95.32949323258251]
Temporal data, notably time series and spatio-temporal data, are prevalent in real-world applications.
Recent advances in large language and other foundational models have spurred increased use in time series and spatio-temporal data mining.
arXiv Detail & Related papers (2023-10-16T09:06:00Z) - Time-LLM: Time Series Forecasting by Reprogramming Large Language Models [110.20279343734548]
Time series forecasting holds significant importance in many real-world dynamic systems.
We present Time-LLM, a reprogramming framework to repurpose large language models for time series forecasting.
Time-LLM is a powerful time series learner that outperforms state-of-the-art, specialized forecasting models.
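As a rough, simplified sketch of the reprogramming idea: time series patches are expressed as combinations of a small set of prototype embeddings via cross-attention, producing inputs a frozen LLM can consume. In the sketch the prototypes are free parameters and all dimensions are made up; it illustrates the concept, not the paper's architecture.

```python
import torch
import torch.nn as nn

class PatchReprogramming(nn.Module):
    """Cross-attention from series patches (queries) to learnable
    prototype embeddings (keys/values), mapping patches into the
    frozen LLM's embedding space without touching the LLM weights."""
    def __init__(self, patch_dim=16, llm_dim=256, n_prototypes=100):
        super().__init__()
        self.prototypes = nn.Parameter(torch.randn(n_prototypes, llm_dim))
        self.to_query = nn.Linear(patch_dim, llm_dim)
        self.attn = nn.MultiheadAttention(llm_dim, num_heads=4, batch_first=True)

    def forward(self, patches):                        # (batch, n_patches, patch_dim)
        q = self.to_query(patches)
        kv = self.prototypes.unsqueeze(0).expand(patches.size(0), -1, -1)
        out, _ = self.attn(q, kv, kv)                  # (batch, n_patches, llm_dim)
        return out                                     # fed into the frozen LLM

tokens = PatchReprogramming()(torch.randn(2, 10, 16))  # shape (2, 10, 256)
```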
arXiv Detail & Related papers (2023-10-03T01:31:25Z) - Time Series Analysis via Network Science: Concepts and Algorithms [62.997667081978825]
This review provides a comprehensive overview of existing mapping methods for transforming time series into networks.
We describe the main conceptual approaches, provide authoritative references and give insight into their advantages and limitations in a unified notation and language.
Although still very recent, this research area has much potential, and with this survey we intend to pave the way for future research on the topic.
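One of the best-known mapping methods in this literature is the natural visibility graph, in which two points are linked if the straight line between them clears every intermediate point. A brute-force sketch:

```python
import numpy as np

def natural_visibility_graph(series):
    """Natural visibility criterion: points (a, y_a) and (b, y_b) are
    connected iff every intermediate point (c, y_c) lies strictly below
    the straight line joining them:
        y_c < y_b + (y_a - y_b) * (b - c) / (b - a)
    Brute force, O(n^3); fine for illustration."""
    n = len(series)
    edges = []
    for a in range(n):
        for b in range(a + 1, n):
            if all(series[c] < series[b] + (series[a] - series[b]) * (b - c) / (b - a)
                   for c in range(a + 1, b)):
                edges.append((a, b))
    return edges

print(natural_visibility_graph(np.array([1.0, 3.0, 2.0, 4.0])))
# [(0, 1), (1, 2), (1, 3), (2, 3)]
```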
arXiv Detail & Related papers (2021-10-11T13:33:18Z)