Cross-LKTCN: Modern Convolution Utilizing Cross-Variable Dependency for
Multivariate Time Series Forecasting
- URL: http://arxiv.org/abs/2306.02326v1
- Date: Sun, 4 Jun 2023 10:50:52 GMT
- Title: Cross-LKTCN: Modern Convolution Utilizing Cross-Variable Dependency for
Multivariate Time Series Forecasting
- Authors: Donghao Luo, Xue Wang
- Abstract summary: The key to accurate forecasting results is capturing the long-term dependency across time steps.
Recent methods mainly focus on the cross-time dependency but seldom consider the cross-variable dependency.
We propose a modern pure convolution structure, namely Cross-LKTCN, to better utilize both cross-time and cross-variable dependency.
- Score: 9.433527676880903
- License: http://creativecommons.org/licenses/by-nc-nd/4.0/
- Abstract: The past few years have witnessed the rapid development in multivariate time
series forecasting. The key to accurate forecasting results is capturing the
long-term dependency across time steps (cross-time dependency) and modeling the
complex dependency among variables (cross-variable dependency) in multivariate
time series. However, recent methods mainly focus
on the cross-time dependency but seldom consider the cross-variable dependency.
To fill this gap, we find that convolution, a traditional technique that has
recently fallen out of favor in time series forecasting, is well suited to
capturing the cross-time and cross-variable dependencies separately. Based on
this finding, we propose a modern pure convolution structure, namely
Cross-LKTCN, to better utilize both cross-time and cross-variable dependency
for time series forecasting. Specifically, in each Cross-LKTCN block, a
depth-wise large-kernel convolution with a large receptive field is proposed to
capture cross-time dependency, and then two successive point-wise group
convolution feed-forward networks are proposed to capture cross-variable
dependency. Experimental results on real-world benchmarks show that Cross-LKTCN
achieves state-of-the-art forecasting performance and improves the forecasting
accuracy significantly compared with existing convolution-based models and
cross-variable methods.
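To make the block description concrete, below is a minimal PyTorch sketch of one Cross-LKTCN-style block, based only on the abstract above. The layer order, residual connections, GELU activation, expansion ratio, and group count are illustrative assumptions rather than the authors' exact design, and the channel dimension is treated directly as the variable dimension for simplicity.

```python
import torch
import torch.nn as nn


class CrossLKTCNBlockSketch(nn.Module):
    """Minimal sketch of a Cross-LKTCN-style block (assumptions noted above)."""

    def __init__(self, channels: int, kernel_size: int = 51,
                 groups: int = 2, expansion: int = 2):
        super().__init__()
        # Depth-wise large-kernel convolution (groups == channels): each channel
        # gets its own wide filter along the time axis, aimed at cross-time dependency.
        self.dw_conv = nn.Conv1d(channels, channels, kernel_size,
                                 padding=kernel_size // 2, groups=channels)
        # Two successive point-wise (kernel_size=1) group convolutions acting as a
        # feed-forward network that mixes channels within each group, aimed at
        # cross-variable dependency.
        hidden = channels * expansion
        self.pw_ffn = nn.Sequential(
            nn.Conv1d(channels, hidden, kernel_size=1, groups=groups),
            nn.GELU(),
            nn.Conv1d(hidden, channels, kernel_size=1, groups=groups),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, channels, time); the residual connections are an assumption.
        x = x + self.dw_conv(x)   # cross-time mixing
        x = x + self.pw_ffn(x)    # cross-variable mixing
        return x


# Toy usage: 8 variables, 96 time steps; an odd kernel size keeps the length unchanged.
block = CrossLKTCNBlockSketch(channels=8, kernel_size=25, groups=2)
out = block(torch.randn(32, 8, 96))
print(out.shape)  # torch.Size([32, 8, 96])
```

Stacking several such blocks behind an input embedding and ahead of a linear forecasting head would give a full model in the spirit of the abstract, but those components are not specified here.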
Related papers
- TimeCNN: Refining Cross-Variable Interaction on Time Point for Time Series Forecasting [44.04862924157323]
Transformer-based models demonstrate significant potential in modeling cross-time and cross-variable interaction.
We propose a TimeCNN model to refine cross-variable interactions to enhance time series forecasting.
Extensive experiments conducted on 12 real-world datasets demonstrate that TimeCNN consistently outperforms state-of-the-art models.
arXiv Detail & Related papers (2024-10-07T09:16:58Z) - Timer-XL: Long-Context Transformers for Unified Time Series Forecasting [67.83502953961505]
We present Timer-XL, a generative Transformer for unified time series forecasting.
Timer-XL achieves state-of-the-art performance across challenging forecasting benchmarks through a unified approach.
arXiv Detail & Related papers (2024-10-07T07:27:39Z) - UniTST: Effectively Modeling Inter-Series and Intra-Series Dependencies for Multivariate Time Series Forecasting [98.12558945781693]
We propose a transformer-based model UniTST containing a unified attention mechanism on the flattened patch tokens.
Although our proposed model employs a simple architecture, it offers compelling performance as shown in our experiments on several datasets for time series forecasting.
arXiv Detail & Related papers (2024-06-07T14:39:28Z) - ForecastGrapher: Redefining Multivariate Time Series Forecasting with Graph Neural Networks [9.006068771300377]
We present ForecastGrapher, a framework for capturing the intricate temporal dynamics and inter-series correlations.
Our approach is underpinned by three pivotal steps: generating custom node embeddings to reflect the temporal variations within each series; constructing an adaptive adjacency matrix to encode the inter-series correlations; and augmenting the GNNs' expressive power by diversifying the node feature distribution.
arXiv Detail & Related papers (2024-05-28T10:40:20Z) - VCformer: Variable Correlation Transformer with Inherent Lagged Correlation for Multivariate Time Series Forecasting [1.5165632546654102]
We propose Variable Correlation Transformer (VCformer) to mine the correlations among variables.
Its variable correlation attention (VCA) module calculates and integrates the cross-correlation scores corresponding to different lags between queries and keys.
Inspired by Koopman dynamics theory, we also develop Koopman Temporal Detector (KTD) to better address the non-stationarity in time series.
arXiv Detail & Related papers (2024-05-19T07:39:22Z) - Multi-Scale Dilated Convolution Network for Long-Term Time Series Forecasting [17.132063819650355]
We propose Multi Scale Dilated Convolution Network (MSDCN) to capture the period and trend characteristics of long time series.
We design different convolution blocks with exponentially growing dilations and varying kernel sizes to sample time series data at different scales (a brief sketch of this dilation idea appears after this list).
To validate the effectiveness of the proposed approach, we conduct experiments on eight challenging long-term time series forecasting benchmark datasets.
arXiv Detail & Related papers (2024-05-09T02:11:01Z) - CVTN: Cross Variable and Temporal Integration for Time Series Forecasting [5.58591579080467]
This paper deconstructs time series forecasting into the learning of historical sequences and prediction sequences.
It divides time series forecasting into two phases: cross-variable learning for effectively mining features from historical sequences, and cross-time learning to capture the temporal dependencies of prediction sequences.
arXiv Detail & Related papers (2024-04-29T14:16:16Z) - CARD: Channel Aligned Robust Blend Transformer for Time Series
Forecasting [50.23240107430597]
We design a special Transformer, i.e., Channel Aligned Robust Blend Transformer (CARD for short), that addresses key shortcomings of CI type Transformer in time series forecasting.
First, CARD introduces a channel-aligned attention structure that allows it to capture both temporal correlations among signals and dynamical dependencies among multiple variables over time.
Second, in order to efficiently utilize the multi-scale knowledge, we design a token blend module to generate tokens with different resolutions.
Third, we introduce a robust loss function for time series forecasting to alleviate the potential overfitting issue.
arXiv Detail & Related papers (2023-05-20T05:16:31Z) - Robust Detection of Lead-Lag Relationships in Lagged Multi-Factor Models [61.10851158749843]
Key insights can be obtained by discovering lead-lag relationships inherent in the data.
We develop a clustering-driven methodology for robust detection of lead-lag relationships in lagged multi-factor models.
arXiv Detail & Related papers (2023-05-11T10:30:35Z) - Instance-wise Graph-based Framework for Multivariate Time Series
Forecasting [69.38716332931986]
We propose a simple yet efficient instance-wise graph-based framework to utilize the inter-dependencies of different variables at different time stamps.
The key idea of our framework is aggregating information from the historical time series of different variables to the current time series that we need to forecast.
arXiv Detail & Related papers (2021-09-14T07:38:35Z) - Connecting the Dots: Multivariate Time Series Forecasting with Graph
Neural Networks [91.65637773358347]
We propose a general graph neural network framework designed specifically for multivariate time series data.
Our approach automatically extracts the uni-directed relations among variables through a graph learning module.
Our proposed model outperforms the state-of-the-art baseline methods on 3 of 4 benchmark datasets.
arXiv Detail & Related papers (2020-05-24T04:02:18Z)
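As referenced in the MSDCN entry above, a brief sketch of the exponentially growing dilations idea follows. The branch count, kernel size, and the summation used to fuse the branches are illustrative assumptions, not MSDCN's actual architecture.

```python
import torch
import torch.nn as nn


class MultiScaleDilatedSketch(nn.Module):
    """Parallel 1D convolutions whose dilations grow exponentially (1, 2, 4, ...),
    so each branch samples the series at a different temporal scale."""

    def __init__(self, channels: int, num_scales: int = 4, kernel_size: int = 3):
        super().__init__()
        self.branches = nn.ModuleList()
        for i in range(num_scales):
            dilation = 2 ** i                            # exponentially growing dilation
            padding = dilation * (kernel_size - 1) // 2  # keeps length for odd kernels
            self.branches.append(
                nn.Conv1d(channels, channels, kernel_size,
                          dilation=dilation, padding=padding))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, channels, time); summing branch outputs is one simple fusion choice.
        return sum(branch(x) for branch in self.branches)


out = MultiScaleDilatedSketch(channels=8)(torch.randn(4, 8, 96))
print(out.shape)  # torch.Size([4, 8, 96])
```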