Scalable Spatiotemporal Graph Neural Networks
- URL: http://arxiv.org/abs/2209.06520v1
- Date: Wed, 14 Sep 2022 09:47:38 GMT
- Title: Scalable Spatiotemporal Graph Neural Networks
- Authors: Andrea Cini, Ivan Marisca, Filippo Maria Bianchi, Cesare Alippi
- Abstract summary: Graph neural networks (GNNs) are often the core component of the forecasting architecture.
In most spatiotemporal GNNs, the computational complexity scales up to a quadratic factor with the length of the sequence times the number of links in the graph.
We propose a scalable architecture that exploits an efficient encoding of both temporal and spatial dynamics.
- Score: 14.415967477487692
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Neural forecasting of spatiotemporal time series drives both research and
industrial innovation in several relevant application domains. Graph neural
networks (GNNs) are often the core component of the forecasting architecture.
However, in most spatiotemporal GNNs, the computational complexity scales up to
a quadratic factor with the length of the sequence times the number of links in
the graph, hence hindering the application of these models to large graphs and
long temporal sequences. While methods to improve scalability have been
proposed in the context of static graphs, few research efforts have been
devoted to the spatiotemporal case. To fill this gap, we propose a scalable
architecture that exploits an efficient encoding of both temporal and spatial
dynamics. In particular, we use a randomized recurrent neural network to embed
the history of the input time series into high-dimensional state
representations encompassing multi-scale temporal dynamics. Such
representations are then propagated along the spatial dimension using different
powers of the graph adjacency matrix to generate node embeddings characterized
by a rich pool of spatiotemporal features. The resulting node embeddings can be
efficiently pre-computed in an unsupervised manner, before being fed to a
feed-forward decoder that learns to map the multi-scale spatiotemporal
representations to predictions. The training procedure can then be parallelized
node-wise by sampling the node embeddings without breaking any dependency, thus
enabling scalability to large networks. Empirical results on relevant datasets
show that our approach achieves results competitive with the state of the art,
while dramatically reducing the computational burden.
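The pipeline described in the abstract can be illustrated with a minimal sketch (all names, shapes, and hyperparameters here are assumptions, not the paper's actual implementation): a fixed randomized recurrent encoder embeds each node's history into a high-dimensional state, the states are propagated with powers of the normalized adjacency matrix and concatenated, and the resulting embeddings can be precomputed once before fitting a decoder node-wise.

```python
import numpy as np

rng = np.random.default_rng(0)

def reservoir_encode(x, state_dim=32, leak=0.9):
    """Echo-state-style encoding of a (T, n_nodes) series into (n_nodes, state_dim).
    The recurrent weights are random and fixed (never trained)."""
    T, n = x.shape
    W_in = rng.uniform(-1, 1, (state_dim, 1))
    W = rng.uniform(-1, 1, (state_dim, state_dim))
    # Rescale recurrent weights to spectral radius < 1 for stable dynamics.
    W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))
    h = np.zeros((n, state_dim))
    for t in range(T):
        pre = x[t][:, None] @ W_in.T + h @ W.T
        h = (1 - leak) * h + leak * np.tanh(pre)  # leaky state update
    return h

def propagate(h, A, K=2):
    """Concatenate A^k @ h for k = 0..K (row-normalized adjacency powers),
    giving each node a multi-hop pool of spatiotemporal features."""
    A_norm = A / np.maximum(A.sum(1, keepdims=True), 1e-8)
    feats, cur = [h], h
    for _ in range(K):
        cur = A_norm @ cur
        feats.append(cur)
    return np.concatenate(feats, axis=1)

# Toy data: 3 nodes on a path graph, 50 time steps.
A = np.array([[0, 1, 0], [1, 0, 1], [0, 1, 0]], float)
x = rng.standard_normal((50, 3))
emb = propagate(reservoir_encode(x), A, K=2)
print(emb.shape)  # (3, 96): 3 adjacency powers x 32-dim states
```

Since the encoder is untrained, `emb` can be computed once up front; only a feed-forward decoder mapping embeddings to forecasts is then trained, and rows of `emb` can be sampled independently for node-wise minibatching.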
Related papers
- Higher-order Spatio-temporal Physics-incorporated Graph Neural Network for Multivariate Time Series Imputation [9.450743095412896]
Missing values are an essential but challenging issue due to the complex latent spatio-temporal correlations and dynamic nature of time series.
We propose a Higher-order Spatio-temporal Physics-incorporated Graph Neural Network (HSPGNN) to address this problem.
HSPGNN provides better dynamic analysis and explanation than traditional data-driven models.
arXiv Detail & Related papers (2024-05-16T16:35:43Z) - FourierGNN: Rethinking Multivariate Time Series Forecasting from a Pure
Graph Perspective [48.00240550685946]
Current state-of-the-art graph neural network (GNN)-based forecasting methods usually require both graph networks (e.g., GCN) and temporal networks (e.g., LSTM) to capture inter-series (spatial) dynamics and intra-series (temporal) dependencies, respectively.
We propose a novel Fourier Graph Neural Network (FourierGNN) by stacking our proposed Fourier Graph Operator (FGO) to perform matrix multiplications in Fourier space.
Our experiments on seven datasets have demonstrated superior performance with higher efficiency and fewer parameters compared with state-of-the-art methods.
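The FourierGNN idea of replacing dense graph matrix multiplications with operations in Fourier space can be sketched as follows (a minimal illustration of the general convolution-theorem trick, not the paper's actual FGO; the filter here is a stand-in for a learned parameter): a diagonal multiplication in Fourier space corresponds to a circular convolution over nodes, costing O(n log n) instead of O(n^2).

```python
import numpy as np

rng = np.random.default_rng(0)
n, d = 8, 4                       # nodes x feature dim
X = rng.standard_normal((n, d))
kernel = rng.standard_normal(n)   # stand-in for a learned filter

Xf = np.fft.fft(X, axis=0)                 # transform over the node axis
Yf = Xf * np.fft.fft(kernel)[:, None]      # diagonal operator in Fourier space
Y = np.fft.ifft(Yf, axis=0).real           # back to the original domain
print(Y.shape)  # (8, 4)
```

Stacking several such operators with nonlinearities in between is, loosely, the structure the FourierGNN summary above describes.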
arXiv Detail & Related papers (2023-11-10T17:13:26Z) - Using Time-Aware Graph Neural Networks to Predict Temporal Centralities in Dynamic Graphs [0.8057006406834466]
We study the application of De Bruijn Graph Neural Networks (DBGNN) to predict temporal path-based centralities in time series data.
We experimentally evaluate our approach in 13 temporal graphs from biological and social systems.
arXiv Detail & Related papers (2023-10-24T14:23:10Z) - Dynamic Causal Explanation Based Diffusion-Variational Graph Neural
Network for Spatio-temporal Forecasting [60.03169701753824]
We propose a novel Dynamic Causal Explanation based Diffusion-Variational Graph Neural Network (DVGNN) for spatio-temporal forecasting.
The proposed DVGNN model outperforms state-of-the-art approaches and achieves outstanding Root Mean Squared Error results.
arXiv Detail & Related papers (2023-05-16T11:38:19Z) - Temporal Aggregation and Propagation Graph Neural Networks for Dynamic
Representation [67.26422477327179]
Temporal graphs exhibit dynamic interactions between nodes over continuous time.
We propose a novel method of temporal graph convolution with the whole neighborhood.
Our proposed TAP-GNN outperforms existing temporal graph methods by a large margin in terms of both predictive performance and online inference latency.
arXiv Detail & Related papers (2023-04-15T08:17:18Z) - STGIN: A Spatial Temporal Graph-Informer Network for Long Sequence
Traffic Speed Forecasting [8.596556653895028]
This study proposes a new spatial-temporal neural network architecture to handle the long-term traffic parameters forecasting issue.
The attention mechanism potentially guarantees long-term prediction performance without significant information loss from distant inputs.
arXiv Detail & Related papers (2022-10-01T05:58:22Z) - Multivariate Time Series Forecasting with Dynamic Graph Neural ODEs [65.18780403244178]
We propose a continuous model to forecast Multivariate Time series with dynamic Graph neural Ordinary Differential Equations (MTGODE)
Specifically, we first abstract multivariate time series into dynamic graphs with time-evolving node features and unknown graph structures.
Then, we design and solve a neural ODE to complement missing graph topologies and unify both spatial and temporal message passing.
arXiv Detail & Related papers (2022-02-17T02:17:31Z) - Efficient-Dyn: Dynamic Graph Representation Learning via Event-based
Temporal Sparse Attention Network [2.0047096160313456]
Dynamic graph neural networks have received more and more attention from researchers.
We propose a novel dynamic graph neural network, Efficient-Dyn.
It adaptively encodes temporal information into a sequence of patches with an equal amount of temporal-topological structure.
arXiv Detail & Related papers (2022-01-04T23:52:24Z) - Spatio-Temporal Joint Graph Convolutional Networks for Traffic
Forecasting [75.10017445699532]
Recent studies have shifted their focus towards formulating traffic forecasting as a spatio-temporal graph modeling problem.
We propose a novel approach for accurate traffic forecasting on road networks over multiple future time steps.
arXiv Detail & Related papers (2021-11-25T08:45:14Z) - Liquid Time-constant Networks [117.57116214802504]
We introduce a new class of time-continuous recurrent neural network models.
Instead of declaring a learning system's dynamics by implicit nonlinearities, we construct networks of linear first-order dynamical systems.
These neural networks exhibit stable and bounded behavior, and yield superior expressivity within the family of neural ordinary differential equations.
arXiv Detail & Related papers (2020-06-08T09:53:35Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences.