Connecting the Dots: Multivariate Time Series Forecasting with Graph
Neural Networks
- URL: http://arxiv.org/abs/2005.11650v1
- Date: Sun, 24 May 2020 04:02:18 GMT
- Title: Connecting the Dots: Multivariate Time Series Forecasting with Graph
Neural Networks
- Authors: Zonghan Wu, Shirui Pan, Guodong Long, Jing Jiang, Xiaojun Chang,
Chengqi Zhang
- Abstract summary: We propose a general graph neural network framework designed specifically for multivariate time series data.
Our approach automatically extracts the uni-directed relations among variables through a graph learning module.
Our proposed model outperforms the state-of-the-art baseline methods on 3 of 4 benchmark datasets.
- Score: 91.65637773358347
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Modeling multivariate time series has long been a subject that has attracted
researchers from a diverse range of fields including economics, finance, and
traffic. A basic assumption behind multivariate time series forecasting is that
its variables depend on one another but, upon looking closely, it is fair to
say that existing methods fail to fully exploit latent spatial dependencies
between pairs of variables. In recent years, meanwhile, graph neural networks
(GNNs) have shown high capability in handling relational dependencies. GNNs
require well-defined graph structures for information propagation which means
they cannot be applied directly for multivariate time series where the
dependencies are not known in advance. In this paper, we propose a general
graph neural network framework designed specifically for multivariate time
series data. Our approach automatically extracts the uni-directed relations
among variables through a graph learning module, into which external knowledge
like variable attributes can be easily integrated. A novel mix-hop propagation
layer and a dilated inception layer are further proposed to capture the spatial
and temporal dependencies within the time series. The graph learning, graph
convolution, and temporal convolution modules are jointly learned in an
end-to-end framework. Experimental results show that our proposed model
outperforms the state-of-the-art baseline methods on 3 of 4 benchmark datasets
and achieves on-par performance with other approaches on two traffic datasets
which provide extra structural information.
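To make the graph learning module concrete, below is a minimal PyTorch sketch of how a uni-directed adjacency matrix can be learned from trainable node embeddings and sparsified with a top-k mask, in the spirit of the framework described above. The layer sizes and the hyperparameters `alpha` and `k` are illustrative assumptions, not the paper's exact settings; the anti-symmetric combination of the two embedding projections is what keeps each learned relation uni-directional.

```python
import torch
import torch.nn as nn


class GraphLearner(nn.Module):
    """Sketch of a graph learning module that extracts uni-directed relations
    among variables from trainable node embeddings (assumed hyperparameters)."""

    def __init__(self, num_nodes: int, dim: int = 40, k: int = 20, alpha: float = 3.0):
        super().__init__()
        self.emb1 = nn.Embedding(num_nodes, dim)
        self.emb2 = nn.Embedding(num_nodes, dim)
        self.lin1 = nn.Linear(dim, dim)
        self.lin2 = nn.Linear(dim, dim)
        self.k, self.alpha = k, alpha

    def forward(self, idx: torch.Tensor) -> torch.Tensor:
        m1 = torch.tanh(self.alpha * self.lin1(self.emb1(idx)))
        m2 = torch.tanh(self.alpha * self.lin2(self.emb2(idx)))
        # Anti-symmetric combination: if edge u->v survives the ReLU,
        # the reverse edge v->u is suppressed, giving uni-directed relations.
        adj = torch.relu(torch.tanh(self.alpha * (m1 @ m2.T - m2 @ m1.T)))
        # Keep only the k strongest neighbours of each node (sparsification).
        mask = torch.zeros_like(adj)
        _, topk = adj.topk(min(self.k, adj.size(1)), dim=1)
        mask.scatter_(1, topk, 1.0)
        return adj * mask


# Example: learn a graph over 137 variables (illustrative size).
learner = GraphLearner(num_nodes=137)
A = learner(torch.arange(137))  # (137, 137) sparse, uni-directed adjacency
```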
Related papers
- Fully-Connected Spatial-Temporal Graph for Multivariate Time-Series Data [50.84488941336865]
We propose a novel method called Fully-Connected Spatial-Temporal Graph Neural Network (FC-STGNN).
For graph construction, we design a decay graph to connect sensors across all timestamps based on their temporal distances (a minimal decay-graph sketch appears after this list).
For graph convolution, we devise FC graph convolution with a moving-pooling GNN layer to effectively capture the ST dependencies for learning effective representations.
arXiv Detail & Related papers (2023-09-11T08:44:07Z)
- TimeGNN: Temporal Dynamic Graph Learning for Time Series Forecasting [20.03223916749058]
Time series forecasting lies at the core of important real-world applications in science and engineering.
We propose TimeGNN, a method that learns dynamic temporal graph representations.
TimeGNN achieves inference times 4 to 80 times faster than other state-of-the-art graph-based methods.
arXiv Detail & Related papers (2023-07-27T08:10:19Z)
- TodyNet: Temporal Dynamic Graph Neural Network for Multivariate Time Series Classification [6.76723360505692]
We propose a novel temporal dynamic graph neural network (TodyNet) that can extract hidden spatio-temporal dependencies without a predefined graph structure.
The experiments on 26 UEA benchmark datasets illustrate that the proposed TodyNet outperforms existing deep learning-based methods in the MTSC tasks.
arXiv Detail & Related papers (2023-04-11T09:21:28Z)
- Dynamic Graph Message Passing Networks for Visual Recognition [112.49513303433606]
Modelling long-range dependencies is critical for scene understanding tasks in computer vision.
A fully-connected graph is beneficial for such modelling, but its computational overhead is prohibitive.
We propose a dynamic graph message passing network, that significantly reduces the computational complexity.
arXiv Detail & Related papers (2022-09-20T14:41:37Z)
- Learning the Evolutionary and Multi-scale Graph Structure for Multivariate Time Series Forecasting [50.901984244738806]
We show how to model the evolutionary and multi-scale interactions of time series.
In particular, we first provide a hierarchical graph structure cooperated with the dilated convolution to capture the scale-specific correlations.
A unified neural network is provided to integrate the components above to get the final prediction.
arXiv Detail & Related papers (2022-06-28T08:11:12Z)
- Multivariate Time Series Forecasting with Dynamic Graph Neural ODEs [65.18780403244178]
We propose a continuous model to forecast Multivariate Time series with dynamic Graph neural Ordinary Differential Equations (MTGODE)
Specifically, we first abstract multivariate time series into dynamic graphs with time-evolving node features and unknown graph structures.
Then, we design and solve a neural ODE to complement missing graph topologies and unify both spatial and temporal message passing (a minimal continuous-propagation sketch appears after this list).
arXiv Detail & Related papers (2022-02-17T02:17:31Z)
- Learning Sparse and Continuous Graph Structures for Multivariate Time Series Forecasting [5.359968374560132]
Learning Sparse and Continuous Graphs for Forecasting (LSCGF) is a novel deep learning model that joins graph learning and forecasting.
In this paper, we propose a brand new method named Smooth Sparse Unit (SSU) to learn sparse and continuous graph adjacency matrix.
Our model achieves state-of-the-art performances with minor trainable parameters.
arXiv Detail & Related papers (2022-01-24T13:35:37Z)
- Instance-wise Graph-based Framework for Multivariate Time Series Forecasting [69.38716332931986]
We propose a simple yet efficient instance-wise graph-based framework to utilize the inter-dependencies of different variables at different time stamps.
The key idea of our framework is aggregating information from the historical time series of different variables to the current time series that we need to forecast.
arXiv Detail & Related papers (2021-09-14T07:38:35Z)
- Discrete Graph Structure Learning for Forecasting Multiple Time Series [14.459541930646205]
Time series forecasting is an extensively studied subject in statistics, economics, and computer science.
In this work, we propose learning the structure simultaneously with a graph neural network (GNN) if the graph is unknown (a minimal structure-sampling sketch appears after this list).
Empirical evaluations show that our method is simpler, more efficient, and better performing than a recently proposed bilevel learning approach for graph structure learning.
arXiv Detail & Related papers (2021-01-18T03:36:33Z)
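As referenced in the FC-STGNN entry above, a decay graph can be pictured as fully connecting sensors across all timestamps while down-weighting edges by temporal distance. The exponential decay and dot-product similarity used here are illustrative assumptions, not the paper's exact formulation.

```python
import torch


def decay_graph(sensor_feats: torch.Tensor, tau: float = 1.0) -> torch.Tensor:
    """Builds a (T*N, T*N) adjacency over sensors at all timestamps, weighting
    edges by feature similarity times an exponential decay in temporal distance
    (assumed forms, in the spirit of FC-STGNN's decay graph).

    sensor_feats: (T, N, D) tensor of per-timestamp sensor features.
    """
    T, N, D = sensor_feats.shape
    x = sensor_feats.reshape(T * N, D)
    sim = torch.softmax(x @ x.T / D ** 0.5, dim=1)          # feature similarity
    t_idx = torch.arange(T).repeat_interleave(N)             # timestamp of each node
    dist = (t_idx[:, None] - t_idx[None, :]).abs().float()   # temporal distance
    return sim * torch.exp(-dist / tau)                      # decay-weighted edges


# Example: 4 timestamps, 10 sensors, 16-dim features -> a 40 x 40 adjacency.
adj = decay_graph(torch.randn(4, 10, 16))
```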
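As referenced in the MTGODE entry above, unifying spatial message passing in continuous time can be pictured as a graph-diffusion ODE integrated with a fixed-step Euler solver. The diffusion dynamics and the solver below are simplifying assumptions, not the paper's exact ODE or integration scheme.

```python
import torch


def continuous_propagation(z0: torch.Tensor, adj: torch.Tensor,
                           t_end: float = 1.0, steps: int = 20) -> torch.Tensor:
    """Euler integration of dz/dt = (A_hat - I) z, a simple graph-diffusion ODE
    illustrating continuous-time message passing over node states z."""
    deg = adj.sum(dim=1, keepdim=True).clamp(min=1e-6)
    a_hat = adj / deg                        # row-normalised adjacency
    dt = t_end / steps
    z = z0
    for _ in range(steps):
        z = z + dt * (a_hat @ z - z)         # one Euler step of the ODE
    return z


# Example: propagate 8-dim node states over a random 10-node graph.
z = continuous_propagation(torch.randn(10, 8), torch.rand(10, 10))
```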
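As referenced in the Discrete Graph Structure Learning entry above, one common way to learn a discrete graph jointly with a forecasting GNN is to sample edges with the straight-through Gumbel-softmax trick, so the sampled adjacency stays differentiable. The (N, N, 2) edge-logit parameterisation and temperature below are illustrative assumptions, not necessarily the paper's exact construction.

```python
import torch
import torch.nn.functional as F


def sample_adjacency(edge_logits: torch.Tensor, tau: float = 0.5) -> torch.Tensor:
    """Samples a binary adjacency matrix with straight-through Gumbel-softmax,
    keeping gradients so the structure can be trained with the forecasting GNN.

    edge_logits[..., 0] scores "edge present", edge_logits[..., 1] "edge absent".
    """
    sample = F.gumbel_softmax(edge_logits, tau=tau, hard=True, dim=-1)
    return sample[..., 0]                    # binary (N, N) adjacency


# Example: learnable logits for a 10-node graph, resampled each forward pass.
logits = torch.nn.Parameter(torch.zeros(10, 10, 2))
adj = sample_adjacency(logits)
```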