A Comprehensive Survey of Time Series Forecasting: Architectural Diversity and Open Challenges
- URL: http://arxiv.org/abs/2411.05793v1
- Date: Thu, 24 Oct 2024 07:43:55 GMT
- Title: A Comprehensive Survey of Time Series Forecasting: Architectural Diversity and Open Challenges
- Authors: Jongseon Kim, Hyungjoon Kim, HyunGi Kim, Dongjun Lee, Sungroh Yoon
- Abstract summary: Time series forecasting is a critical task that provides key information for decision-making across various fields.
Deep learning architectures such as MLPs, CNNs, RNNs, and GNNs have been developed and applied to solve time series forecasting problems.
Transformer models, which excel at handling long-term dependencies, have become significant architectural components for time series forecasting.
- Score: 37.20655606514617
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Time series forecasting is a critical task that provides key information for decision-making across various fields. Recently, various fundamental deep learning architectures such as MLPs, CNNs, RNNs, and GNNs have been developed and applied to solve time series forecasting problems. However, the structural limitations caused by the inductive biases of each deep learning architecture constrained their performance. Transformer models, which excel at handling long-term dependencies, have become significant architectural components for time series forecasting. However, recent research has shown that alternatives such as simple linear layers can outperform Transformers. These findings have opened up new possibilities for using diverse architectures. In this context of exploration into various models, the architectural modeling of time series forecasting has now entered a renaissance. This survey not only provides a historical context for time series forecasting but also offers a comprehensive and timely analysis of the movement toward architectural diversification. By comparing and re-examining various deep learning models, we uncover new perspectives and present the latest trends in time series forecasting, including the emergence of hybrid models, diffusion models, Mamba models, and foundation models. By focusing on the inherent characteristics of time series data, we also address open challenges that have gained attention in time series forecasting, such as channel dependency, distribution shift, causality, and feature extraction. This survey explores vital elements that can enhance forecasting performance through diverse approaches. These contributions lower the entry barriers for newcomers to the field of time series forecasting, while also offering seasoned researchers broad perspectives, new opportunities, and deep insights.
Related papers
- Foundation Models for Time Series: A Survey [0.27835153780240135]
Transformer-based foundation models have emerged as a dominant paradigm in time series analysis.
This survey introduces a novel taxonomy to categorize them across several dimensions.
arXiv Detail & Related papers (2025-04-05T01:27:55Z)
- Deep Learning for Time Series Forecasting: A Survey [12.748035569833451]
We study the previous works and summarize the general paradigms of Deep Time Series Forecasting (DTSF) in terms of model architectures.
We take an innovative approach by focusing on the composition of time series and systematically explain important feature extraction methods.
arXiv Detail & Related papers (2025-03-13T09:32:01Z)
- TSI: A Multi-View Representation Learning Approach for Time Series Forecasting [29.05140751690699]
This study introduces a novel multi-view approach for time series forecasting.
It integrates trend and seasonal representations with an Independent Component Analysis (ICA)-based representation.
This approach offers a holistic understanding of time series data, going beyond traditional models that often miss nuanced, nonlinear relationships.
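The trend/seasonal/ICA "multi-view" idea can be sketched in a few lines; this is a minimal hypothetical illustration of what such views look like (a simple moving-average decomposition), not the paper's actual learned model:

```python
import numpy as np

def multi_view(series, period=12, window=12):
    """Split a 1-D series into trend, seasonal, and residual views.

    A hand-rolled sketch: TSI's actual model combines learned trend and
    seasonal representations with an ICA-based one inside a forecaster.
    """
    x = np.asarray(series, dtype=float)
    # Trend view: centered moving average, edges padded by reflection.
    pad = window // 2
    xp = np.pad(x, pad, mode="reflect")
    kernel = np.ones(window) / window
    trend = np.convolve(xp, kernel, mode="same")[pad:pad + len(x)]
    # Seasonal view: average of the detrended series at each phase.
    detrended = x - trend
    phases = np.arange(len(x)) % period
    seasonal = np.array([detrended[phases == p].mean() for p in phases])
    # Residual view: whatever the other two views do not explain.
    residual = x - trend - seasonal
    return trend, seasonal, residual
```

On a synthetic series with a linear trend plus a 12-step cycle, the residual view carries far less variance than the raw series, which is the point of decomposing before modeling.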
arXiv Detail & Related papers (2024-09-30T02:11:57Z)
- Time Series Foundation Models and Deep Learning Architectures for Earthquake Temporal and Spatial Nowcasting [1.4854797901022863]
Existing literature on earthquake nowcasting lacks comprehensive evaluations of pre-trained foundation models.
We introduce two innovative approaches called MultiFoundationQuake and GNNCoder.
We formulate earthquake nowcasting as a time series forecasting problem for the next 14 days within 0.1-degree spatial bins in Southern California.
arXiv Detail & Related papers (2024-08-21T20:57:03Z)
- Deep Time Series Models: A Comprehensive Survey and Benchmark [74.28364194333447]
Time series data is of great significance in real-world scenarios.
Recent years have witnessed remarkable breakthroughs in the time series community.
We release Time Series Library (TSLib) as a fair benchmark of deep time series models for diverse analysis tasks.
arXiv Detail & Related papers (2024-07-18T08:31:55Z)
- Time Series Representation Models [2.724184832774005]
Time series analysis remains a major challenge due to its sparse characteristics, high dimensionality, and inconsistent data quality.
Recent advancements in transformer-based techniques have enhanced capabilities in forecasting and imputation.
We propose a new architectural concept for time series analysis based on introspection.
arXiv Detail & Related papers (2024-05-28T13:25:31Z)
- A Survey on Diffusion Models for Time Series and Spatio-Temporal Data [92.1255811066468]
We review the use of diffusion models in time series and spatio-temporal data, categorizing them by model, task type, data modality, and practical application domain.
We categorize diffusion models into unconditioned and conditioned types and discuss time series and spatio-temporal data separately.
Our survey covers their application extensively in various fields including healthcare, recommendation, climate, energy, audio, and transportation.
arXiv Detail & Related papers (2024-04-29T17:19:40Z)
- PDETime: Rethinking Long-Term Multivariate Time Series Forecasting from the Perspective of Partial Differential Equations [49.80959046861793]
We present PDETime, a novel LMTF model inspired by the principles of Neural PDE solvers.
Our experimentation across seven diverse real-world LMTF datasets reveals that PDETime adapts effectively to the intrinsic nature of the data.
arXiv Detail & Related papers (2024-02-25T17:39:44Z)
- On the Resurgence of Recurrent Models for Long Sequences -- Survey and Research Opportunities in the Transformer Era [59.279784235147254]
This survey is aimed at providing an overview of these trends framed under the unifying umbrella of Recurrence.
It emphasizes novel research opportunities that become prominent when abandoning the idea of processing long sequences.
arXiv Detail & Related papers (2024-02-12T23:55:55Z)
- Unified Training of Universal Time Series Forecasting Transformers [104.56318980466742]
We present a Masked Encoder-based Universal Time Series Forecasting Transformer (Moirai).
Moirai is trained on our newly introduced Large-scale Open Time Series Archive (LOTSA) featuring over 27B observations across nine domains.
Moirai achieves competitive or superior performance as a zero-shot forecaster when compared to full-shot models.
arXiv Detail & Related papers (2024-02-04T20:00:45Z)
- Rail Crack Propagation Forecasting Using Multi-horizons RNNs [0.46040036610482665]
The prediction of rail crack length propagation plays a crucial role in the maintenance and safety assessment of materials and structures.
Traditional methods rely on physical models and empirical equations such as Paris law.
In recent years, machine learning techniques, particularly Recurrent Neural Networks (RNNs), have emerged as promising methods for time series forecasting.
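Paris' law relates crack growth per load cycle to the stress-intensity range, da/dN = C(ΔK)^m with ΔK = Δσ√(πa). A forward-Euler sketch of that baseline (the constants C, m, the stress range, and the unit geometry factor are illustrative assumptions, not values from the paper):

```python
import math

def paris_law_growth(a0, delta_sigma, C, m, cycles, step=1000):
    """Integrate Paris' law da/dN = C * (dK)^m, dK = dsigma * sqrt(pi * a).

    Forward-Euler over blocks of `step` cycles; the geometry factor is
    taken as 1, and C, m are material-specific assumptions.
    """
    a = a0
    history = [a0]
    for _ in range(0, cycles, step):
        dK = delta_sigma * math.sqrt(math.pi * a)  # stress-intensity range
        a += C * dK ** m * step                    # crack growth this block
        history.append(a)
    return history
```

Because da/dN grows with a, the crack length curve is monotonically increasing and accelerates; RNN-based forecasters learn this propagation directly from measured crack-length series instead of from fitted C and m.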
arXiv Detail & Related papers (2023-09-04T12:44:21Z)
- OpenSTL: A Comprehensive Benchmark of Spatio-Temporal Predictive Learning [67.07363529640784]
We propose OpenSTL to categorize prevalent approaches into recurrent-based and recurrent-free models.
We conduct standard evaluations on datasets across various domains, including synthetic moving object trajectory, human motion, driving scenes, traffic flow and forecasting weather.
We find that recurrent-free models achieve a better balance between efficiency and performance than recurrent models.
arXiv Detail & Related papers (2023-06-20T03:02:14Z)
- A Survey on Deep Learning based Time Series Analysis with Frequency Transformation [75.63783789488471]
Frequency transformation (FT) has been increasingly incorporated into deep learning models to enhance state-of-the-art accuracy and efficiency in time series analysis.
Despite the growing attention and the proliferation of research in this emerging field, there is currently a lack of a systematic review and in-depth analysis of deep learning-based time series models with FT.
We present a comprehensive review that systematically investigates and summarizes the recent research advancements in deep learning-based time series analysis with FT.
arXiv Detail & Related papers (2023-02-04T14:33:07Z)
- Temporal Saliency Detection Towards Explainable Transformer-based Timeseries Forecasting [3.046315755726937]
This paper introduces Temporal Saliency Detection (TSD), an effective approach that builds upon the attention mechanism and applies it to multi-horizon time series prediction.
The TSD approach facilitates the multiresolution analysis of saliency patterns by condensing multi-heads, thereby progressively enhancing the forecasting of complex time series data.
arXiv Detail & Related papers (2022-12-15T12:47:59Z)
- Temporal-Spatial dependencies ENhanced deep learning model (TSEN) for household leverage series forecasting [12.727583657383073]
Capturing both temporal and spatial patterns is a challenge in building accurate forecasting models for financial time series.
Inspired by the successful applications of deep learning, we propose a new model to resolve the issues of forecasting household leverage in China.
Results show that the new approach can capture the temporal-spatial dynamics of household leverage well and get more accurate and solid predictive results.
arXiv Detail & Related papers (2022-10-17T00:10:25Z)
- Monitoring Time Series With Missing Values: a Deep Probabilistic Approach [1.90365714903665]
We introduce a new architecture for time series monitoring based on a combination of state-of-the-art methods for forecasting in high-dimensional time series with full probabilistic handling of uncertainty.
We demonstrate advantage of the architecture for time series forecasting and novelty detection, in particular with partially missing data, and empirically evaluate and compare the architecture to state-of-the-art approaches on a real-world data set.
arXiv Detail & Related papers (2022-03-09T17:53:47Z)
- Deep Autoregressive Models with Spectral Attention [74.08846528440024]
We propose a forecasting architecture that combines deep autoregressive models with a Spectral Attention (SA) module.
By characterizing in the spectral domain the embedding of the time series as occurrences of a random process, our method can identify global trends and seasonality patterns.
Two spectral attention models, global and local to the time series, integrate this information within the forecast and perform spectral filtering to remove noise from the time series.
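The spectral-filtering role described above can be approximated with a plain FFT hard threshold; this is a minimal stand-in (the paper's Spectral Attention module learns the frequency weighting, whereas here we simply keep the strongest bins):

```python
import numpy as np

def spectral_filter(series, keep=3):
    """Keep only the `keep` largest-magnitude frequency components.

    A hard-threshold sketch of spectral filtering; a learned attention
    module would instead weight the bins softly.
    """
    spec = np.fft.rfft(series)
    mags = np.abs(spec)
    # Zero every bin except the `keep` largest by magnitude.
    spec[np.argsort(mags)[:-keep]] = 0
    return np.fft.irfft(spec, n=len(series))
```

On a sine wave corrupted with white noise, the filtered signal sits much closer to the clean sine than the noisy input does, since the noise energy is spread across bins that get zeroed.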
arXiv Detail & Related papers (2021-07-13T11:08:47Z)
- Connecting the Dots: Multivariate Time Series Forecasting with Graph Neural Networks [91.65637773358347]
We propose a general graph neural network framework designed specifically for multivariate time series data.
Our approach automatically extracts the uni-directed relations among variables through a graph learning module.
Our proposed model outperforms the state-of-the-art baseline methods on 3 of 4 benchmark datasets.
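A graph learning module of this kind can be sketched with an asymmetric score between two node-embedding matrices, so that each pair of variables keeps at most one direction; this is a hypothetical numpy illustration (in the real model the embeddings E1, E2 are trained end-to-end):

```python
import numpy as np

def learn_unidirected_adjacency(E1, E2, k=2, alpha=3.0):
    """Build a sparse uni-directed adjacency from node embeddings.

    The antisymmetric score E1 @ E2.T - E2 @ E1.T followed by ReLU
    guarantees at most one direction survives per variable pair;
    top-k selection then sparsifies each node's neighbourhood.
    """
    A = np.tanh(alpha * (E1 @ E2.T - E2 @ E1.T))
    A = np.maximum(A, 0.0)  # ReLU: keeps only one direction per pair
    # Retain only the k strongest outgoing edges per node.
    for i in range(A.shape[0]):
        keep = np.argsort(A[i])[-k:]
        mask = np.zeros(A.shape[1], dtype=bool)
        mask[keep] = True
        A[i, ~mask] = 0.0
    return A
```

The key property is that A[i, j] > 0 implies A[j, i] == 0, so the learned relations are uni-directed by construction rather than by post-hoc pruning.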
arXiv Detail & Related papers (2020-05-24T04:02:18Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.