Learning Sparse and Continuous Graph Structures for Multivariate Time
Series Forecasting
- URL: http://arxiv.org/abs/2201.09686v1
- Date: Mon, 24 Jan 2022 13:35:37 GMT
- Title: Learning Sparse and Continuous Graph Structures for Multivariate Time
Series Forecasting
- Authors: Weijun Chen, Yanze Wang, Chengshuo Du, Zhenglong Jia, Feng Liu and Ran
Chen
- Abstract summary: Learning Sparse and Continuous Graphs for Forecasting (LSCGF) is a novel deep learning model that joins graph learning and forecasting.
In this paper, we propose a brand new method named Smooth Sparse Unit (SSU) to learn sparse and continuous graph adjacency matrix.
Our model achieves state-of-the-art performances with minor trainable parameters.
- Score: 5.359968374560132
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Accurate forecasting of multivariate time series is an extensively studied
subject in finance, transportation, and computer science. Fully mining the
correlation and causation between the variables in a multivariate time series
yields noticeable improvements in the performance of a time series model.
Recently, some models have explored the dependencies between variables
through end-to-end graph structure learning without the need for pre-defined
graphs. However, most current models do not account for the trade-off between
effectiveness and flexibility and lack the guidance of domain knowledge in the
design of graph learning algorithms. Moreover, they struggle to generate sparse
graph structures, which poses challenges to end-to-end learning. In this paper,
we propose Learning Sparse and Continuous Graphs for Forecasting (LSCGF), a
novel deep learning model that jointly performs graph learning and forecasting.
Technically, LSCGF incorporates spatial information into the convolution
operation and extracts temporal dynamics using a diffusion convolutional
recurrent network. At the same time, we propose a new method named Smooth
Sparse Unit (SSU) to learn a sparse and continuous graph adjacency matrix.
Extensive experiments on three real-world datasets demonstrate that our model
achieves state-of-the-art performance with few trainable parameters.
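The abstract does not spell out the SSU's exact formulation, so the sketch below only illustrates the general idea of a learned adjacency matrix that is both continuous (differentiable, so it can be trained end to end) and sparse (weak edges pushed to exact zero). The module name, the sigmoid/soft-threshold scheme, and all hyperparameters are assumptions for illustration, not the paper's method.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class SparseGraphLearner(nn.Module):
    """Illustrative sketch (not the paper's SSU): learn a continuous adjacency
    matrix from node embeddings and soft-threshold it so that weak edges
    become exactly zero while the whole module stays differentiable."""

    def __init__(self, num_nodes: int, emb_dim: int = 16, threshold: float = 0.5):
        super().__init__()
        self.emb = nn.Parameter(torch.randn(num_nodes, emb_dim))
        self.threshold = threshold

    def forward(self) -> torch.Tensor:
        scores = self.emb @ self.emb.t()          # pairwise node similarities
        adj = torch.sigmoid(scores)               # squash into (0, 1)
        adj = F.relu(adj - self.threshold)        # zero out weak edges -> sparsity
        adj = adj * (1 - torch.eye(adj.size(0)))  # no self-loops
        deg = adj.sum(dim=1, keepdim=True).clamp(min=1e-6)
        return adj / deg                          # row-normalize surviving edges

adj = SparseGraphLearner(num_nodes=8)()
print(adj.shape, float((adj == 0).float().mean()))  # (8, 8) and the zero ratio
```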
Related papers
- TimeGraphs: Graph-based Temporal Reasoning [64.18083371645956]
TimeGraphs is a novel approach that characterizes dynamic interactions as a hierarchical temporal graph.
Our approach models the interactions using a compact graph-based representation, enabling adaptive reasoning across diverse time scales.
We evaluate TimeGraphs on multiple datasets with complex, dynamic agent interactions, including a football simulator, the Resistance game, and the MOMA human activity dataset.
arXiv Detail & Related papers (2024-01-06T06:26:49Z)
- FourierGNN: Rethinking Multivariate Time Series Forecasting from a Pure Graph Perspective [48.00240550685946]
Current state-of-the-art graph neural network (GNN)-based forecasting methods usually require both graph networks (e.g., GCN) and temporal networks (e.g., LSTM) to capture inter-series (spatial) dynamics and intra-series (temporal) dependencies, respectively.
We propose a novel Fourier Graph Neural Network (FourierGNN) by stacking our proposed Fourier Graph Operator (FGO) to perform matrix multiplications in Fourier space.
Our experiments on seven datasets have demonstrated superior performance with higher efficiency and fewer parameters compared with state-of-the-art methods.
arXiv Detail & Related papers (2023-11-10T17:13:26Z)
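The FourierGNN entry above hinges on replacing explicit graph convolutions with matrix multiplications carried out in Fourier space. The following is a minimal sketch of that idea under assumed shapes and a hypothetical module name, not the authors' FGO implementation:

```python
import torch
import torch.nn as nn

class FourierGraphOperator(nn.Module):
    """Sketch: FFT node features along the node axis, mix them with a
    learnable complex weight (a matrix multiplication in Fourier space),
    then transform back. Illustrative only."""

    def __init__(self, dim: int):
        super().__init__()
        self.w_real = nn.Parameter(torch.randn(dim, dim) * 0.02)
        self.w_imag = nn.Parameter(torch.randn(dim, dim) * 0.02)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, num_nodes, dim)
        xf = torch.fft.fft(x, dim=1)               # frequency domain over nodes
        w = torch.complex(self.w_real, self.w_imag)
        yf = xf @ w                                 # the "graph" mixing step
        return torch.fft.ifft(yf, dim=1).real      # back to the node domain

x = torch.randn(4, 16, 32)                          # 4 samples, 16 nodes, 32 features
print(FourierGraphOperator(32)(x).shape)            # torch.Size([4, 16, 32])
```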
- TimeGNN: Temporal Dynamic Graph Learning for Time Series Forecasting [20.03223916749058]
Time series forecasting lies at the core of important real-world applications in science and engineering.
We propose TimeGNN, a method that learns dynamic temporal graph representations.
TimeGNN achieves inference times 4 to 80 times faster than other state-of-the-art graph-based methods.
arXiv Detail & Related papers (2023-07-27T08:10:19Z)
- Sparsity exploitation via discovering graphical models in multi-variate time-series forecasting [1.2762298148425795]
We propose a decoupled training method, which includes a graph generating module and a GNNs forecasting module.
First, we use Graphical Lasso (or GraphLASSO) to directly exploit the sparsity pattern from data to build graph structures.
Second, we fit these graph structures and the input data into a Graph Convolutional Recurrent Network (GCRN) to train a forecasting model.
arXiv Detail & Related papers (2023-06-29T16:48:00Z)
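The decoupled recipe above (first estimate a sparse graph with the Graphical Lasso, then hand that structure to a graph-convolutional forecaster) can be sketched with scikit-learn. The toy data, the regularization strength alpha, and the edge threshold are illustrative, and the GCRN training step is omitted.

```python
import numpy as np
from sklearn.covariance import GraphicalLasso

# Toy multivariate series: (timesteps, variables). In practice this would be
# the training split of the forecasting dataset.
rng = np.random.default_rng(0)
X = rng.normal(size=(500, 10))

# Step 1: estimate a sparse precision matrix with the Graphical Lasso.
glasso = GraphicalLasso(alpha=0.1).fit(X)
precision = glasso.precision_

# Step 2: turn its sparsity pattern into an adjacency matrix that a graph
# convolutional recurrent network could consume as a fixed graph.
adj = (np.abs(precision) > 1e-4).astype(float)
np.fill_diagonal(adj, 0.0)
print(int(adj.sum()), "edges kept")
```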
- TodyNet: Temporal Dynamic Graph Neural Network for Multivariate Time Series Classification [6.76723360505692]
We propose a novel temporal dynamic graph neural network (TodyNet) that can extract hidden temporal dependencies without a predefined graph structure.
Experiments on 26 UEA benchmark datasets illustrate that the proposed TodyNet outperforms existing deep-learning-based methods on MTSC tasks.
arXiv Detail & Related papers (2023-04-11T09:21:28Z)
- Learning the Evolutionary and Multi-scale Graph Structure for Multivariate Time Series Forecasting [50.901984244738806]
We show how to model the evolutionary and multi-scale interactions of time series.
In particular, we first provide a hierarchical graph structure combined with dilated convolution to capture the scale-specific correlations.
A unified neural network then integrates the components above to produce the final prediction.
arXiv Detail & Related papers (2022-06-28T08:11:12Z)
- Multivariate Time Series Forecasting with Dynamic Graph Neural ODEs [65.18780403244178]
We propose a continuous model to forecast Multivariate Time series with dynamic Graph neural Ordinary Differential Equations (MTGODE).
Specifically, we first abstract multivariate time series into dynamic graphs with time-evolving node features and unknown graph structures.
Then, we design and solve a neural ODE to complement missing graph topologies and unify both spatial and temporal message passing.
arXiv Detail & Related papers (2022-02-17T02:17:31Z)
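The MTGODE entry describes unifying spatial and temporal message passing by solving a neural ODE over a graph. The toy block below only illustrates that notion with a fixed learnable graph and an explicit Euler solver; the dynamics, the solver, and the shapes are assumptions, not the paper's formulation.

```python
import torch
import torch.nn as nn

class GraphODEBlock(nn.Module):
    """Toy illustration: node states evolve under dx/dt = A x W - x (graph
    message passing plus decay), integrated with a fixed-step Euler solver."""

    def __init__(self, num_nodes: int, dim: int):
        super().__init__()
        self.adj = nn.Parameter(torch.rand(num_nodes, num_nodes))
        self.mix = nn.Linear(dim, dim, bias=False)

    def forward(self, x: torch.Tensor, t_end: float = 1.0, steps: int = 10) -> torch.Tensor:
        # x: (batch, num_nodes, dim)
        a = torch.softmax(self.adj, dim=1)   # row-normalized adjacency
        dt = t_end / steps
        for _ in range(steps):               # explicit Euler integration
            x = x + dt * (a @ self.mix(x) - x)
        return x

x = torch.randn(2, 8, 16)
print(GraphODEBlock(8, 16)(x).shape)  # torch.Size([2, 8, 16])
```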
- Discrete Graph Structure Learning for Forecasting Multiple Time Series [14.459541930646205]
Time series forecasting is an extensively studied subject in statistics, economics, and computer science.
In this work, we propose learning the structure simultaneously with a graph neural network (GNN) if the graph is unknown.
Empirical evaluations show that our method is simpler, more efficient, and better performing than a recently proposed bilevel learning approach for graph structure learning.
arXiv Detail & Related papers (2021-01-18T03:36:33Z)
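The entry above learns a discrete graph structure jointly with the GNN forecaster. A standard way to keep such discrete edge decisions differentiable is the Gumbel-softmax trick; the sketch below uses that general technique purely for illustration and is not the paper's exact parameterization.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class DiscreteGraphSampler(nn.Module):
    """Keep per-edge logits for 'no edge' vs. 'edge' and draw a (nearly)
    binary adjacency matrix with straight-through Gumbel-softmax, so the
    discrete structure can be trained end to end with a forecaster."""

    def __init__(self, num_nodes: int):
        super().__init__()
        self.logits = nn.Parameter(torch.zeros(num_nodes, num_nodes, 2))

    def forward(self, tau: float = 0.5) -> torch.Tensor:
        sample = F.gumbel_softmax(self.logits, tau=tau, hard=True)
        adj = sample[..., 1]                         # slot 1 means "edge exists"
        return adj * (1 - torch.eye(adj.size(0)))    # drop self-loops

print(DiscreteGraphSampler(6)().sum())  # number of edges sampled in this draw
```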
- Learned Factor Graphs for Inference from Stationary Time Sequences [107.63351413549992]
We propose a framework that combines model-based algorithms and data-driven ML tools for stationary time sequences.
Neural networks are developed to separately learn specific components of a factor graph describing the distribution of the time sequence.
We present an inference algorithm based on learned stationary factor graphs, which learns to implement the sum-product scheme from labeled data.
arXiv Detail & Related papers (2020-06-05T07:06:19Z)
- Connecting the Dots: Multivariate Time Series Forecasting with Graph Neural Networks [91.65637773358347]
We propose a general graph neural network framework designed specifically for multivariate time series data.
Our approach automatically extracts the uni-directed relations among variables through a graph learning module.
Our proposed model outperforms the state-of-the-art baseline methods on 3 of 4 benchmark datasets.
arXiv Detail & Related papers (2020-05-24T04:02:18Z)
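The last entry's graph learning module extracts uni-directed relations among variables. The sketch below reconstructs that idea from the abstract (the embedding size, alpha, and top-k sparsification are assumed values, and this is not the authors' released code); the antisymmetric score means that a strong edge from node i to node j suppresses the reverse edge.

```python
import torch
import torch.nn as nn

class UniDirectedGraphLearner(nn.Module):
    """Illustrative uni-directed graph learning module: two node-embedding
    tables produce an antisymmetric score matrix, ReLU keeps one direction
    per pair, and top-k per row keeps the graph sparse."""

    def __init__(self, num_nodes: int, dim: int = 32, alpha: float = 3.0, k: int = 4):
        super().__init__()
        self.e1 = nn.Parameter(torch.randn(num_nodes, dim))
        self.e2 = nn.Parameter(torch.randn(num_nodes, dim))
        self.lin1 = nn.Linear(dim, dim)
        self.lin2 = nn.Linear(dim, dim)
        self.alpha, self.k = alpha, k

    def forward(self) -> torch.Tensor:
        m1 = torch.tanh(self.alpha * self.lin1(self.e1))
        m2 = torch.tanh(self.alpha * self.lin2(self.e2))
        adj = torch.relu(torch.tanh(self.alpha * (m1 @ m2.t() - m2 @ m1.t())))
        topk = torch.topk(adj, self.k, dim=1)             # strongest outgoing edges
        mask = torch.zeros_like(adj).scatter_(1, topk.indices, 1.0)
        return adj * mask

print(UniDirectedGraphLearner(num_nodes=10)().shape)  # torch.Size([10, 10])
```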