CVTN: Cross Variable and Temporal Integration for Time Series Forecasting
- URL: http://arxiv.org/abs/2404.18730v1
- Date: Mon, 29 Apr 2024 14:16:16 GMT
- Title: CVTN: Cross Variable and Temporal Integration for Time Series Forecasting
- Authors: Han Zhou, Yuntian Chen
- Abstract summary: This paper deconstructs time series forecasting into the learning of historical sequences and prediction sequences.
It divides time series forecasting into two phases: cross-variable learning for effectively mining features from historical sequences, and cross-time learning to capture the temporal dependencies of prediction sequences.
- Score: 5.58591579080467
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: In multivariate time series forecasting, the Transformer architecture encounters two significant challenges: effectively mining features from historical sequences and avoiding overfitting during the learning of temporal dependencies. To tackle these challenges, this paper deconstructs time series forecasting into the learning of historical sequences and prediction sequences, introducing the Cross-Variable and Time Network (CVTN). This unique method divides multivariate time series forecasting into two phases: cross-variable learning for effectively mining features from historical sequences, and cross-time learning to capture the temporal dependencies of prediction sequences. Separating these two phases helps avoid the impact of overfitting in cross-time learning on cross-variable learning. Extensive experiments on various real-world datasets have confirmed its state-of-the-art (SOTA) performance. CVTN emphasizes three key dimensions in time series forecasting: the short-term and long-term nature of time series (locality and longevity), feature mining from both historical and prediction sequences, and the integration of cross-variable and cross-time learning. This approach not only advances the current state of time series forecasting but also provides a more comprehensive framework for future research in this field.
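The abstract describes the two-phase split but not the layer-level design. Purely as an illustration, the idea can be sketched as follows; the module choices (variable-axis attention for phase one, temporal attention for phase two) and all sizes are assumptions, not the authors' architecture.

```python
import torch
import torch.nn as nn

class CVTNSketch(nn.Module):
    """Rough two-phase sketch: cross-variable learning on the history,
    then cross-time learning on the prediction sequence. The layer
    choices are illustrative guesses, not the paper's design."""

    def __init__(self, n_vars, hist_len, pred_len, d_model=64):
        super().__init__()
        # Phase 1: each variable's whole history becomes one token,
        # and attention mixes information across variables.
        self.var_embed = nn.Linear(hist_len, d_model)
        self.cross_var = nn.MultiheadAttention(d_model, 4, batch_first=True)
        self.to_pred = nn.Linear(d_model, pred_len)
        # Phase 2: attention across the time steps of the draft prediction.
        self.time_embed = nn.Linear(n_vars, d_model)
        self.cross_time = nn.MultiheadAttention(d_model, 4, batch_first=True)
        self.head = nn.Linear(d_model, n_vars)

    def forward(self, x):                        # x: [batch, hist_len, n_vars]
        v = self.var_embed(x.transpose(1, 2))    # [batch, n_vars, d_model]
        v, _ = self.cross_var(v, v, v)           # cross-variable learning
        draft = self.to_pred(v).transpose(1, 2)  # [batch, pred_len, n_vars]
        t = self.time_embed(draft)               # [batch, pred_len, d_model]
        t, _ = self.cross_time(t, t, t)          # cross-time learning
        return self.head(t)                      # [batch, pred_len, n_vars]

y = CVTNSketch(n_vars=7, hist_len=96, pred_len=24)(torch.randn(8, 96, 7))
print(y.shape)  # torch.Size([8, 24, 7])
```

Note that this sketch wires the two phases end to end for brevity; the abstract's claim that separating them keeps cross-time overfitting from contaminating cross-variable learning suggests the paper keeps the phases more independent than this.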
Related papers
- Timer-XL: Long-Context Transformers for Unified Time Series Forecasting [67.83502953961505]
We present Timer-XL, a generative Transformer for unified time series forecasting.
Timer-XL achieves state-of-the-art performance across challenging forecasting benchmarks through a unified approach.
arXiv Detail & Related papers (2024-10-07T07:27:39Z)
- Leveraging 2D Information for Long-term Time Series Forecasting with Vanilla Transformers [55.475142494272724]
Time series prediction is crucial for understanding and forecasting complex dynamics in various domains.
We introduce GridTST, a model that combines the benefits of two approaches using innovative multi-directional attentions.
The model consistently delivers state-of-the-art performance across various real-world datasets.
arXiv Detail & Related papers (2024-05-22T16:41:21Z)
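The GridTST summary above only names "multi-directional attentions" over a 2D view of the series; one plausible reading, sketched here purely as an assumption, is self-attention applied alternately along the time axis and the variable axis of a [time, variable] grid of embeddings.

```python
import torch
import torch.nn as nn

class GridAttentionSketch(nn.Module):
    """Illustrative guess at multi-directional attention: one attention
    pass along time, one along variables. Not GridTST's actual design."""

    def __init__(self, d_model=32, n_heads=4):
        super().__init__()
        self.time_attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
        self.var_attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)

    def forward(self, z):                              # z: [batch, time, vars, d]
        b, t, v, d = z.shape
        # Attend along the time axis, independently for each variable.
        zt = z.permute(0, 2, 1, 3).reshape(b * v, t, d)
        zt, _ = self.time_attn(zt, zt, zt)
        z = zt.reshape(b, v, t, d).permute(0, 2, 1, 3)
        # Attend along the variable axis, independently per time step.
        zv = z.reshape(b * t, v, d)
        zv, _ = self.var_attn(zv, zv, zv)
        return zv.reshape(b, t, v, d)

out = GridAttentionSketch()(torch.randn(2, 48, 7, 32))
print(out.shape)  # torch.Size([2, 48, 7, 32])
```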
- FAITH: Frequency-domain Attention In Two Horizons for Time Series Forecasting [13.253624747448935]
Time series forecasting plays a crucial role in various fields such as industrial equipment maintenance, meteorology, energy consumption, traffic flow and financial investment.
Current deep learning-based predictive models often exhibit a significant deviation between their forecasting outcomes and the ground truth.
We propose Frequency-domain Attention In Two Horizons (FAITH), a novel model that decomposes time series into trend and seasonal components.
arXiv Detail & Related papers (2024-05-22T02:37:02Z)
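FAITH's summary mentions the trend/seasonal decomposition but not its form; a classical moving-average split is shown here only as a generic stand-in (FAITH itself works in the frequency domain, so its actual decomposition may differ).

```python
import numpy as np

def decompose(series, window=24):
    """Moving-average decomposition into trend plus seasonal residual.
    A generic stand-in, not FAITH's actual mechanism."""
    pad = window // 2
    padded = np.pad(series, (pad, window - 1 - pad), mode="edge")
    trend = np.convolve(padded, np.ones(window) / window, mode="valid")
    seasonal = series - trend          # what remains after removing the trend
    return trend, seasonal

x = np.sin(np.arange(200) * 2 * np.pi / 24) + 0.01 * np.arange(200)
trend, seasonal = decompose(x)
print(trend.shape, seasonal.shape)  # (200,) (200,)
```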
- Multi-Scale Dilated Convolution Network for Long-Term Time Series Forecasting [17.132063819650355]
We propose the Multi-Scale Dilated Convolution Network (MSDCN) to capture the periodic and trend characteristics of long time series.
We design different convolution blocks with exponentially growing dilations and varying kernel sizes to sample time series data at different scales.
To validate the effectiveness of the proposed approach, we conduct experiments on eight challenging long-term time series forecasting benchmark datasets.
arXiv Detail & Related papers (2024-05-09T02:11:01Z)
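Exponentially growing dilations are a standard construction and can be sketched directly; the block count, kernel size, and activation below are illustrative assumptions rather than MSDCN's exact configuration.

```python
import torch
import torch.nn as nn

class MultiScaleDilatedSketch(nn.Module):
    """Stacked 1-D convolutions with dilations 1, 2, 4, 8 so that deeper
    blocks see exponentially longer history. Sizes are illustrative."""

    def __init__(self, channels=7, n_blocks=4, kernel_size=3):
        super().__init__()
        self.blocks = nn.ModuleList()
        for i in range(n_blocks):
            dilation = 2 ** i                             # 1, 2, 4, 8, ...
            padding = (kernel_size - 1) * dilation // 2   # keep length fixed
            self.blocks.append(nn.Conv1d(channels, channels, kernel_size,
                                         dilation=dilation, padding=padding))

    def forward(self, x):                       # x: [batch, channels, time]
        outs = []
        for block in self.blocks:
            x = torch.relu(block(x))
            outs.append(x)                      # keep the features at every scale
        return torch.cat(outs, dim=1)

feats = MultiScaleDilatedSketch()(torch.randn(8, 7, 96))
print(feats.shape)  # torch.Size([8, 28, 96])
```

With kernel size 3, four such blocks give a receptive field of 1 + 2 * (1 + 2 + 4 + 8) = 31 time steps, which is why exponential dilation is a cheap way to reach long horizons.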
- Unified Training of Universal Time Series Forecasting Transformers [104.56318980466742]
We present the Masked-based Universal Time Series Forecasting Transformer (Moirai).
Moirai is trained on our newly introduced Large-scale Open Time Series Archive (LOTSA) featuring over 27B observations across nine domains.
Moirai achieves competitive or superior performance as a zero-shot forecaster when compared to full-shot models.
arXiv Detail & Related papers (2024-02-04T20:00:45Z)
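The summary calls Moirai masked-based without giving details; the generic masked-reconstruction objective that this family of models builds on can be sketched as follows, with the patch length, mask ratio, and squared-error loss all assumed for illustration.

```python
import torch

def masked_reconstruction_step(model, series, patch_len=16, mask_ratio=0.3):
    """Generic masked-modeling step: hide random patches, train the model
    to reconstruct them. Illustrative, not Moirai's exact objective."""
    b, t = series.shape
    patches = series.reshape(b, t // patch_len, patch_len)
    mask = torch.rand(b, t // patch_len) < mask_ratio   # True = hidden patch
    corrupted = patches.masked_fill(mask.unsqueeze(-1), 0.0)
    recon = model(corrupted.reshape(b, t))              # any seq-to-seq model
    recon_patches = recon.reshape(b, t // patch_len, patch_len)
    return ((recon_patches - patches) ** 2)[mask].mean()  # loss on hidden patches

loss = masked_reconstruction_step(torch.nn.Linear(96, 96), torch.randn(4, 96))
print(loss.item())
```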
- TimeSiam: A Pre-Training Framework for Siamese Time-Series Modeling [67.02157180089573]
Time series pre-training has recently garnered wide attention for its potential to reduce labeling expenses and benefit various downstream tasks.
This paper proposes TimeSiam, a simple but effective self-supervised pre-training framework for time series based on Siamese networks.
arXiv Detail & Related papers (2024-02-04T13:10:51Z)
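The TimeSiam summary gives only the Siamese framing; a generic Siamese pre-training step is sketched below, with the two-segment sampling and the cosine objective assumed rather than taken from the paper.

```python
import torch
import torch.nn.functional as F

def siamese_step(encoder, series, seg_len=48):
    """Generic Siamese pre-training: encode two temporal views of the same
    series and pull their embeddings together. Illustrative only."""
    b, t = series.shape
    i = torch.randint(0, t - seg_len + 1, (1,)).item()  # view 1 start
    j = torch.randint(0, t - seg_len + 1, (1,)).item()  # view 2 start
    z1 = encoder(series[:, i:i + seg_len])
    z2 = encoder(series[:, j:j + seg_len])
    return 1 - F.cosine_similarity(z1, z2, dim=-1).mean()

encoder = torch.nn.Sequential(torch.nn.Linear(48, 64), torch.nn.ReLU(),
                              torch.nn.Linear(64, 32))
loss = siamese_step(encoder, torch.randn(8, 256))
print(loss.item())
```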
- Cross-LKTCN: Modern Convolution Utilizing Cross-Variable Dependency for Multivariate Time Series Forecasting [9.433527676880903]
Key to accurate forecasting results is capturing the long-term dependencies across time steps.
Recent methods mainly focus on the cross-time dependency but seldom consider the cross-variable dependency.
We propose a modern pure convolution structure, namely Cross-LKTCN, to better utilize both cross-time and cross-variable dependency.
arXiv Detail & Related papers (2023-06-04T10:50:52Z)
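The Cross-LKTCN summary names large-kernel convolution and the two dependency types but not the block layout; one common way to realize both in a pure-convolution block, assumed here, is a depthwise large-kernel convolution for cross-time mixing followed by a pointwise convolution for cross-variable mixing.

```python
import torch
import torch.nn as nn

class LargeKernelBlockSketch(nn.Module):
    """Assumed reading of a 'modern' convolution block: a depthwise
    large-kernel conv mixes across time, a pointwise conv mixes across
    variables. Not Cross-LKTCN's exact block."""

    def __init__(self, n_vars=7, kernel_size=51):
        super().__init__()
        self.cross_time = nn.Conv1d(n_vars, n_vars, kernel_size,
                                    padding=kernel_size // 2, groups=n_vars)
        self.cross_var = nn.Conv1d(n_vars, n_vars, kernel_size=1)

    def forward(self, x):                 # x: [batch, n_vars, time]
        x = x + self.cross_time(x)        # each variable sees a long time window
        return x + self.cross_var(x)      # variables exchange information

out = LargeKernelBlockSketch()(torch.randn(8, 7, 96))
print(out.shape)  # torch.Size([8, 7, 96])
```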
- Predicting the State of Synchronization of Financial Time Series using Cross Recurrence Plots [75.20174445166997]
This study introduces a new method for predicting the future state of synchronization of the dynamics of two financial time series.
We adopt a deep learning framework to address the prediction of the synchronization state.
We find that predicting the state of synchronization of two time series is in general rather difficult, but attainable with very satisfactory performance for certain pairs of stocks.
arXiv Detail & Related papers (2022-10-26T10:22:28Z)
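Unlike the model details, the cross recurrence plot itself has a standard definition: entry (i, j) records whether state i of one series is close to state j of the other. A minimal scalar version follows; the threshold value and the omission of delay embedding are simplifications.

```python
import numpy as np

def cross_recurrence_plot(x, y, eps=0.1):
    """CRP[i, j] = 1 when |x[i] - y[j]| < eps. Standard definition for
    scalar series; practical uses usually delay-embed the series first."""
    dists = np.abs(x[:, None] - y[None, :])   # pairwise distance matrix
    return (dists < eps).astype(np.uint8)

t = np.linspace(0, 4 * np.pi, 200)
crp = cross_recurrence_plot(np.sin(t), np.sin(t + 0.5))
print(crp.shape, crp.mean())  # (200, 200) and the recurrence rate
```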
- Multi-scale Attention Flow for Probabilistic Time Series Forecasting [68.20798558048678]
We propose Multi-scale Attention Normalizing Flow (MANF), a novel non-autoregressive deep learning model.
Our model avoids the influence of cumulative error and does not increase the time complexity.
Our model achieves state-of-the-art performance on many popular multivariate datasets.
arXiv Detail & Related papers (2022-05-16T07:53:42Z)
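MANF's multi-scale attention structure is not described in the summary; what can be sketched safely is the basic normalizing-flow ingredient such models build on, an invertible affine coupling layer. The dimensions and the conditioner network below are illustrative.

```python
import torch
import torch.nn as nn

class AffineCouplingSketch(nn.Module):
    """One invertible flow layer: scale and shift half the dimensions
    using values predicted from the other half. Generic, not MANF."""

    def __init__(self, dim=8, hidden=32):
        super().__init__()
        self.half = dim // 2
        self.net = nn.Sequential(nn.Linear(self.half, hidden), nn.ReLU(),
                                 nn.Linear(hidden, 2 * (dim - self.half)))

    def forward(self, x):                       # returns (z, log|det Jacobian|)
        x1, x2 = x[:, :self.half], x[:, self.half:]
        s, t = self.net(x1).chunk(2, dim=-1)
        z2 = x2 * torch.exp(s) + t              # exactly invertible given x1
        return torch.cat([x1, z2], dim=-1), s.sum(dim=-1)

z, logdet = AffineCouplingSketch()(torch.randn(4, 8))
print(z.shape, logdet.shape)  # torch.Size([4, 8]) torch.Size([4])
```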
- A Deep Structural Model for Analyzing Correlated Multivariate Time Series [11.009809732645888]
We present a deep learning structural time series model which can handle correlated multivariate time series input.
The model explicitly learns/extracts the trend, seasonality, and event components.
We compare our model with several state-of-the-art methods through a comprehensive set of experiments on a variety of time series data sets.
arXiv Detail & Related papers (2020-01-02T18:48:29Z)