Discrete Graph Structure Learning for Forecasting Multiple Time Series
- URL: http://arxiv.org/abs/2101.06861v2
- Date: Mon, 15 Feb 2021 20:21:28 GMT
- Title: Discrete Graph Structure Learning for Forecasting Multiple Time Series
- Authors: Chao Shang, Jie Chen, Jinbo Bi
- Abstract summary: Time series forecasting is an extensively studied subject in statistics, economics, and computer science.
In this work, we propose learning the structure simultaneously with a graph neural network (GNN) if the graph is unknown.
Empirical evaluations show that our method is simpler, more efficient, and better performing than a recently proposed bilevel learning approach for graph structure learning.
- Score: 14.459541930646205
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Time series forecasting is an extensively studied subject in statistics,
economics, and computer science. Exploration of the correlation and causation
among the variables in a multivariate time series shows promise in enhancing
the performance of a time series model. When using deep neural networks as
forecasting models, we hypothesize that exploiting the pairwise information
among multiple (multivariate) time series also improves their forecast. If an
explicit graph structure is known, graph neural networks (GNNs) have been
demonstrated as powerful tools to exploit the structure. In this work, we
propose learning the structure simultaneously with the GNN if the graph is
unknown. We cast the problem as learning a probabilistic graph model through
optimizing the mean performance over the graph distribution. The distribution
is parameterized by a neural network so that discrete graphs can be sampled
differentiably through reparameterization. Empirical evaluations show that our
method is simpler, more efficient, and better performing than a recently
proposed bilevel learning approach for graph structure learning, as well as a
broad array of forecasting models, either deep or non-deep learning based, and
graph or non-graph based.
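The abstract's central device is sampling discrete graphs differentiably through reparameterization. A minimal sketch of that idea, assuming per-edge Bernoulli logits and the Gumbel-softmax (concrete) relaxation; the shapes, temperature, and logit source are illustrative, not the paper's exact architecture:

```python
import numpy as np

rng = np.random.default_rng(0)

def gumbel_softmax_adjacency(theta, tau=0.5):
    """Sample a near-discrete adjacency matrix from edge logits.

    theta: (n, n, 2) logits for edge-absent / edge-present, e.g. produced
    by a neural network over pairs of node features (hypothetical setup).
    tau: temperature; smaller values give sharper, more discrete samples.
    """
    # Gumbel(0, 1) noise makes the argmax a categorical sample from
    # softmax(theta); the softmax relaxation keeps sampling differentiable.
    g = -np.log(-np.log(rng.uniform(size=theta.shape) + 1e-20) + 1e-20)
    y = np.exp((theta + g) / tau)
    y = y / y.sum(axis=-1, keepdims=True)
    return y[..., 1]  # relaxed indicator for the "edge present" class

n = 4
theta = rng.normal(size=(n, n, 2))
A = gumbel_softmax_adjacency(theta)
print(A.shape)  # (4, 4), entries in (0, 1)
```

In a training loop, `theta` would come from a trainable network and gradients would flow through the relaxed sample into both the structure learner and the downstream GNN forecaster.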
Related papers
- TimeGNN: Temporal Dynamic Graph Learning for Time Series Forecasting [20.03223916749058]
Time series forecasting lies at the core of important real-world applications in science and engineering.
We propose TimeGNN, a method that learns dynamic temporal graph representations.
TimeGNN achieves inference times 4 to 80 times faster than other state-of-the-art graph-based methods.
arXiv Detail & Related papers (2023-07-27T08:10:19Z)
- Sparsity exploitation via discovering graphical models in multi-variate time-series forecasting [1.2762298148425795]
We propose a decoupled training method, which includes a graph generating module and a GNNs forecasting module.
First, we use Graphical Lasso (or GraphLASSO) to directly exploit the sparsity pattern from data to build graph structures.
Second, we fit these graph structures and the input data into a Graph Convolutional Recurrent Network (GCRN) to train a forecasting model.
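The graph generating step above can be reproduced with scikit-learn's `GraphicalLasso`, which estimates a sparse precision (inverse covariance) matrix whose nonzero pattern gives the graph. A small sketch; the toy data, `alpha`, and threshold are illustrative choices, not values from the paper:

```python
import numpy as np
from sklearn.covariance import GraphicalLasso

rng = np.random.default_rng(0)
# Toy multivariate series: 200 time steps, 5 variables; variable 1
# closely tracks variable 0, so an edge between them should appear.
X = rng.normal(size=(200, 5))
X[:, 1] = 0.9 * X[:, 0] + 0.1 * rng.normal(size=200)

model = GraphicalLasso(alpha=0.05).fit(X)
precision = model.precision_                      # sparse inverse covariance
adjacency = (np.abs(precision) > 1e-4).astype(int)  # threshold to a 0/1 graph
np.fill_diagonal(adjacency, 0)                    # drop self-loops
print(adjacency)
```

The resulting binary adjacency matrix would then be handed to the forecasting module (a GCRN in the paper) as a fixed graph.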
arXiv Detail & Related papers (2023-06-29T16:48:00Z)
- GraphGLOW: Universal and Generalizable Structure Learning for Graph Neural Networks [72.01829954658889]
This paper introduces the mathematical definition of this novel problem setting.
We devise a general framework that coordinates a single graph-shared structure learner and multiple graph-specific GNNs.
The well-trained structure learner can directly produce adaptive structures for unseen target graphs without any fine-tuning.
arXiv Detail & Related papers (2023-06-20T03:33:22Z)
- Learning the Evolutionary and Multi-scale Graph Structure for Multivariate Time Series Forecasting [50.901984244738806]
We show how to model the evolutionary and multi-scale interactions of time series.
In particular, we first provide a hierarchical graph structure combined with dilated convolution to capture the scale-specific correlations.
A unified neural network is provided to integrate the components above to get the final prediction.
arXiv Detail & Related papers (2022-06-28T08:11:12Z)
- Optimal Propagation for Graph Neural Networks [51.08426265813481]
We propose a bi-level optimization approach for learning the optimal graph structure.
We also explore a low-rank approximation model for further reducing the time complexity.
arXiv Detail & Related papers (2022-05-06T03:37:00Z)
- Learning Sparse and Continuous Graph Structures for Multivariate Time Series Forecasting [5.359968374560132]
Learning Sparse and Continuous Graphs for Forecasting (LSCGF) is a novel deep learning model that joins graph learning and forecasting.
In this paper, we propose a new method named Smooth Sparse Unit (SSU) to learn a sparse and continuous graph adjacency matrix.
Our model achieves state-of-the-art performance with few trainable parameters.
arXiv Detail & Related papers (2022-01-24T13:35:37Z)
- Non-Parametric Graph Learning for Bayesian Graph Neural Networks [35.88239188555398]
We propose a novel non-parametric graph model for constructing the posterior distribution of graph adjacency matrices.
We demonstrate the advantages of this model in three different problem settings: node classification, link prediction and recommendation.
arXiv Detail & Related papers (2020-06-23T21:10:55Z)
- Learned Factor Graphs for Inference from Stationary Time Sequences [107.63351413549992]
We propose a framework that combines model-based algorithms and data-driven ML tools for stationary time sequences.
Neural networks are developed to separately learn specific components of a factor graph that describes the distribution of the time sequence.
We present an inference algorithm based on learned stationary factor graphs, which learns to implement the sum-product scheme from labeled data.
arXiv Detail & Related papers (2020-06-05T07:06:19Z)
- Connecting the Dots: Multivariate Time Series Forecasting with Graph Neural Networks [91.65637773358347]
We propose a general graph neural network framework designed specifically for multivariate time series data.
Our approach automatically extracts the uni-directed relations among variables through a graph learning module.
Our proposed model outperforms the state-of-the-art baseline methods on 3 of 4 benchmark datasets.
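A graph learning module of this kind is commonly built from two trainable node-embedding tables whose antisymmetric interaction yields uni-directed edges. A hedged sketch in that spirit; the embedding dimension, `alpha`, top-k sparsification, and random stand-ins for trained embeddings are all illustrative assumptions, not the paper's exact formulation:

```python
import numpy as np

rng = np.random.default_rng(0)

def learn_directed_adjacency(n_nodes, dim=8, k=2, alpha=3.0):
    """Hypothetical graph learning module: score node pairs with an
    antisymmetric embedding product, then keep top-k neighbors per node."""
    e1 = rng.normal(size=(n_nodes, dim))  # stand-ins for trained embeddings
    e2 = rng.normal(size=(n_nodes, dim))
    # Antisymmetric scores make relations uni-directed: when i -> j is
    # positive, j -> i is negative and is removed by the ReLU below.
    scores = np.tanh(alpha * (e1 @ e2.T - e2 @ e1.T))
    A = np.maximum(scores, 0.0)
    # Sparsify: keep only the k largest entries in each row.
    idx = np.argsort(-A, axis=1)[:, :k]
    mask = np.zeros_like(A)
    np.put_along_axis(mask, idx, 1.0, axis=1)
    return A * mask

A = learn_directed_adjacency(6)
print((A > 0).sum(axis=1))  # at most k out-edges per node
```

In an end-to-end model, `e1` and `e2` would be learned jointly with the forecasting network, so the extracted relations adapt to the prediction task.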
arXiv Detail & Related papers (2020-05-24T04:02:18Z) - Block-Approximated Exponential Random Graphs [77.4792558024487]
An important challenge in the field of exponential random graphs (ERGs) is the fitting of non-trivial ERGs on large graphs.
We propose an approximative framework to such non-trivial ERGs that result in dyadic independence (i.e., edge independent) distributions.
Our methods are scalable to sparse graphs consisting of millions of nodes.
arXiv Detail & Related papers (2020-02-14T11:42:16Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the generated content (including all information) and is not responsible for any consequences.