Deep Coupling Network For Multivariate Time Series Forecasting
- URL: http://arxiv.org/abs/2402.15134v1
- Date: Fri, 23 Feb 2024 06:38:08 GMT
- Title: Deep Coupling Network For Multivariate Time Series Forecasting
- Authors: Kun Yi, Qi Zhang, Hui He, Kaize Shi, Liang Hu, Ning An, Zhendong Niu
- Abstract summary: We propose a novel deep coupling network for MTS forecasting, named DeepCN.
Our proposed DeepCN achieves superior performance compared with the state-of-the-art baselines.
- Score: 24.01637416183444
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Multivariate time series (MTS) forecasting is crucial in many real-world
applications. To achieve accurate MTS forecasting, it is essential to
simultaneously consider both intra- and inter-series relationships among time
series data. However, previous work has typically modeled intra- and
inter-series relationships separately and has disregarded multi-order
interactions present within and between time series data, which can seriously
degrade forecasting accuracy. In this paper, we reexamine intra- and
inter-series relationships from the perspective of mutual information and
accordingly construct a comprehensive relationship learning mechanism tailored
to simultaneously capture the intricate multi-order intra- and inter-series
couplings. Based on the mechanism, we propose a novel deep coupling network for
MTS forecasting, named DeepCN, which consists of a coupling mechanism dedicated
to explicitly exploring the multi-order intra- and inter-series relationships
among time series data concurrently, a coupled variable representation module
aimed at encoding diverse variable patterns, and an inference module
facilitating predictions through one forward step. Extensive experiments
conducted on seven real-world datasets demonstrate that our proposed DeepCN
achieves superior performance compared with the state-of-the-art baselines.
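The abstract describes three components: an explicit multi-order coupling mechanism, a coupled variable representation module, and a one-forward-step inference module. As a rough illustration only — not the authors' implementation — the idea of explicit multi-order inter-series couplings plus single-step inference can be sketched with correlation-matrix powers standing in for higher-order couplings and an untrained linear readout standing in for the inference module (all function names and the coupling construction here are assumptions):

```python
import numpy as np

def multi_order_coupling(x, order=3):
    """Crude stand-in for multi-order inter-series couplings:
    the first-order coupling is the correlation matrix of the N series,
    and higher orders are taken as its successive matrix powers.
    x: (T, N) window of N series over T time steps."""
    c = np.corrcoef(x.T)               # (N, N) first-order coupling
    mats = [c]
    for _ in range(order - 1):
        mats.append(mats[-1] @ c)      # k-th order coupling ~ C^k
    return np.stack(mats)              # (order, N, N)

def one_step_forecast(x, horizon, order=3, seed=0):
    """Single forward pass: concatenate the raw window with its coupling
    features and map them to the full horizon with one linear readout
    (random weights here, i.e. an untrained placeholder)."""
    T, N = x.shape
    feats = np.concatenate([x.ravel(),
                            multi_order_coupling(x, order).ravel()])
    rng = np.random.default_rng(seed)
    W = rng.normal(scale=0.01, size=(horizon * N, feats.size))
    return (W @ feats).reshape(horizon, N)   # all horizon steps at once
```

Note that producing the entire horizon in one matrix product mirrors the "predictions through one forward step" claim, avoiding autoregressive error accumulation; the actual DeepCN modules are learned, not the fixed correlation powers used above.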
Related papers
- Multi-Knowledge Fusion Network for Time Series Representation Learning [2.368662284133926]
We propose a hybrid architecture that combines prior knowledge with implicit knowledge of the relational structure within the MTS data.
The proposed architecture has shown promising results on multiple benchmark datasets and outperforms state-of-the-art forecasting methods by a significant margin.
arXiv Detail & Related papers (2024-08-22T14:18:16Z) - Multi-Source Knowledge-Based Hybrid Neural Framework for Time Series Representation Learning [2.368662284133926]
The proposed hybrid architecture addresses limitations by combining both domain-specific knowledge and implicit knowledge of the relational structure underlying the MTS data.
The architecture shows promising results on multiple benchmark datasets, outperforming state-of-the-art forecasting methods.
arXiv Detail & Related papers (2024-08-22T13:58:55Z) - UniTST: Effectively Modeling Inter-Series and Intra-Series Dependencies for Multivariate Time Series Forecasting [98.12558945781693]
We propose a transformer-based model UniTST containing a unified attention mechanism on the flattened patch tokens.
Although our proposed model employs a simple architecture, it offers compelling performance as shown in our experiments on several datasets for time series forecasting.
arXiv Detail & Related papers (2024-06-07T14:39:28Z) - MGCP: A Multi-Grained Correlation based Prediction Network for Multivariate Time Series [54.91026286579748]
We propose a Multi-Grained Correlations-based Prediction Network.
It simultaneously considers correlations at three levels to enhance prediction performance.
It employs adversarial training with an attention mechanism-based predictor and a conditional discriminator to optimize prediction results at the coarse-grained level.

arXiv Detail & Related papers (2024-05-30T03:32:44Z) - MSGNet: Learning Multi-Scale Inter-Series Correlations for Multivariate Time Series Forecasting [18.192600104502628]
Time series data often exhibit diverse intra-series and inter-series correlations.
Extensive experiments are conducted on several real-world datasets to showcase the effectiveness of MSGNet.
arXiv Detail & Related papers (2023-12-31T08:23:24Z) - Multimodal Learning Without Labeled Multimodal Data: Guarantees and Applications [90.6849884683226]
We study the challenge of interaction quantification in a semi-supervised setting with only labeled unimodal data.
Using a precise information-theoretic definition of interactions, our key contribution is the derivation of lower and upper bounds.
We show how these theoretical results can be used to estimate multimodal model performance, guide data collection, and select appropriate multimodal models for various tasks.
arXiv Detail & Related papers (2023-06-07T15:44:53Z) - Robust Detection of Lead-Lag Relationships in Lagged Multi-Factor Models [61.10851158749843]
Key insights can be obtained by discovering lead-lag relationships inherent in the data.
We develop a clustering-driven methodology for robust detection of lead-lag relationships in lagged multi-factor models.
arXiv Detail & Related papers (2023-05-11T10:30:35Z) - IPCC-TP: Utilizing Incremental Pearson Correlation Coefficient for Joint Multi-Agent Trajectory Prediction [73.25645602768158]
IPCC-TP is a novel relevance-aware module based on Incremental Pearson Correlation Coefficient to improve multi-agent interaction modeling.
Our module can be conveniently embedded into existing multi-agent prediction methods to extend original motion distribution decoders.
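IPCC-TP's exact formulation is given in that paper; as general background, a Pearson correlation coefficient can be maintained incrementally, one observation pair at a time, with Welford-style running sums rather than recomputing over the full history. A minimal sketch (the class name and interface are illustrative, not from the paper):

```python
import math

class IncrementalPearson:
    """Streaming Pearson correlation between two scalar series,
    updated one (x, y) observation pair at a time."""

    def __init__(self):
        self.n = 0
        self.mx = self.my = 0.0          # running means
        self.sxx = self.syy = self.sxy = 0.0  # running (co)variance sums

    def update(self, x, y):
        self.n += 1
        dx = x - self.mx
        dy = y - self.my
        self.mx += dx / self.n
        self.my += dy / self.n
        # Welford-style updates: multiply the pre-update deviation
        # by the post-update deviation.
        self.sxx += dx * (x - self.mx)
        self.syy += dy * (y - self.my)
        self.sxy += dx * (y - self.my)

    @property
    def corr(self):
        if self.n < 2 or self.sxx == 0.0 or self.syy == 0.0:
            return 0.0                   # undefined; return a neutral value
        return self.sxy / math.sqrt(self.sxx * self.syy)
```

Each update is O(1), which is what makes relevance-aware modules like this cheap to embed into existing prediction pipelines.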
arXiv Detail & Related papers (2023-03-01T15:16:56Z) - Multi-scale Attention Flow for Probabilistic Time Series Forecasting [68.20798558048678]
We propose a novel non-autoregressive deep learning model, called Multi-scale Attention Normalizing Flow (MANF).
Our model avoids the influence of cumulative error and does not increase the time complexity.
Our model achieves state-of-the-art performance on many popular multivariate datasets.
arXiv Detail & Related papers (2022-05-16T07:53:42Z) - Pay Attention to Evolution: Time Series Forecasting with Deep Graph-Evolution Learning [33.79957892029931]
This work presents a novel neural network architecture for time-series forecasting.
We name our method the Recurrent Graph Evolution Neural Network (ReGENN).
An extensive set of experiments was conducted comparing ReGENN with dozens of ensemble methods and classical statistical methods.
arXiv Detail & Related papers (2020-08-28T20:10:07Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented (including all summaries) and is not responsible for any consequences arising from its use.