Dynamic Causal Explanation Based Diffusion-Variational Graph Neural
Network for Spatio-temporal Forecasting
- URL: http://arxiv.org/abs/2305.09703v1
- Date: Tue, 16 May 2023 11:38:19 GMT
- Title: Dynamic Causal Explanation Based Diffusion-Variational Graph Neural
Network for Spatio-temporal Forecasting
- Authors: Guojun Liang, Prayag Tiwari, Sławomir Nowaczyk, Stefan Byttner,
Fernando Alonso-Fernandez
- Abstract summary: We propose a novel Dynamic Diffusion-Variational Graph Neural Network (DVGNN) for spatio-temporal forecasting.
The proposed DVGNN model outperforms state-of-the-art approaches and achieves outstanding Root Mean Squared Error results.
- Score: 60.03169701753824
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Graph neural networks (GNNs), especially dynamic GNNs, have become a research
hotspot in spatio-temporal forecasting problems. While many dynamic graph
construction methods have been developed, relatively few of them explore the
causal relationship between neighbour nodes. Thus, the resulting models lack
strong explainability for the causal relationship between the neighbour nodes
of the dynamically generated graphs, which can easily introduce risk into
subsequent decisions. Moreover, few of them consider the uncertainty and noise
of dynamic graphs based on the time series datasets, which are ubiquitous in
real-world graph structure networks. In this paper, we propose a novel Dynamic
Diffusion-Variational Graph Neural Network (DVGNN) for spatio-temporal
forecasting. For dynamic graph construction, an unsupervised generative model
is devised. Two layers of graph convolutional network (GCN) are applied to
calculate the posterior distribution of the latent node embeddings in the
encoder stage. Then, a diffusion model is used to infer the dynamic link
probability and reconstruct causal graphs in the decoder stage adaptively. The
new loss function is derived theoretically, and the reparameterization trick is
adopted in estimating the probability distribution of the dynamic graphs by
Evidence Lower Bound during the backpropagation period. After obtaining the
generated graphs, dynamic GCN and temporal attention are applied to predict
future states. Experiments are conducted on four real-world datasets of
different graph structures in different domains. The results demonstrate that
the proposed DVGNN model outperforms state-of-the-art approaches and achieves
outstanding Root Mean Squared Error results while exhibiting higher robustness.
Also, by F1-score and probability distribution analysis, we demonstrate that
DVGNN better reflects the causal relationship and uncertainty of dynamic
graphs.
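As a rough illustration of the generative pipeline described in the abstract, the sketch below (plain NumPy; all variable names, dimensions, and weights are hypothetical, and a simple inner-product decoder stands in for the paper's diffusion-based decoder) shows a two-layer GCN encoder producing the posterior parameters, the reparameterization trick, and the Evidence Lower Bound terms:

```python
import numpy as np

rng = np.random.default_rng(0)

def gcn_layer(A_hat, H, W):
    """One graph-convolution layer: normalized adjacency x features x weights, with ReLU."""
    return np.maximum(A_hat @ H @ W, 0.0)

N, F, H_dim, Z_dim = 5, 8, 16, 4            # nodes, input features, hidden, latent dims
X = rng.standard_normal((N, F))             # node feature matrix
A = (rng.random((N, N)) < 0.4).astype(float)
A = np.maximum(A, A.T)                      # symmetrize
np.fill_diagonal(A, 1.0)                    # add self-loops

# Symmetric normalization: A_hat = D^{-1/2} A D^{-1/2}
D_inv_sqrt = np.diag(1.0 / np.sqrt(A.sum(axis=1)))
A_hat = D_inv_sqrt @ A @ D_inv_sqrt

# Encoder: two GCN layers parameterize the posterior q(Z | X, A).
W1 = rng.standard_normal((F, H_dim)) * 0.1
H1 = gcn_layer(A_hat, X, W1)
W_mu = rng.standard_normal((H_dim, Z_dim)) * 0.1
W_logvar = rng.standard_normal((H_dim, Z_dim)) * 0.1
mu = A_hat @ H1 @ W_mu                      # posterior mean
logvar = A_hat @ H1 @ W_logvar              # posterior log-variance

# Reparameterization trick: z = mu + sigma * eps keeps sampling differentiable.
eps = rng.standard_normal(mu.shape)
Z = mu + np.exp(0.5 * logvar) * eps

# Decoder (simplified stand-in): link probabilities from latent embeddings.
P = 1.0 / (1.0 + np.exp(-(Z @ Z.T)))

# ELBO = edge reconstruction log-likelihood minus KL divergence from the prior.
recon = np.sum(A * np.log(P + 1e-9) + (1 - A) * np.log(1 - P + 1e-9))
kl = -0.5 * np.sum(1 + logvar - mu**2 - np.exp(logvar))
elbo = recon - kl
```

In the paper's actual model, the decoder step is replaced by a diffusion process that infers dynamic link probabilities, and the resulting graphs feed a dynamic GCN with temporal attention for forecasting.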
Related papers
- Inference of Sequential Patterns for Neural Message Passing in Temporal Graphs [0.6562256987706128]
HYPA-DBGNN is a novel two-step approach that combines the inference of anomalous sequential patterns in time series data on graphs with neural message passing.
Our method leverages hypergeometric graph ensembles to identify anomalous edges within both first- and higher-order De Bruijn graphs.
Our work is the first to introduce statistically informed GNNs that leverage temporal and causal sequence anomalies.
arXiv Detail & Related papers (2024-06-24T11:41:12Z)
- On The Temporal Domain of Differential Equation Inspired Graph Neural Networks [14.779420473274737]
We show that our model, called TDE-GNN, can capture a wide range of temporal dynamics that go beyond typical first or second-order methods.
We demonstrate the benefit of learning the temporal dependencies using our method rather than using pre-defined temporal dynamics on several graph benchmarks.
arXiv Detail & Related papers (2024-01-20T01:12:57Z)
- FourierGNN: Rethinking Multivariate Time Series Forecasting from a Pure Graph Perspective [48.00240550685946]
Current state-of-the-art graph neural network (GNN)-based forecasting methods usually require both graph networks (e.g., GCN) and temporal networks (e.g., LSTM) to capture inter-series (spatial) dynamics and intra-series (temporal) dependencies, respectively.
We propose a novel Fourier Graph Neural Network (FourierGNN) by stacking our proposed Fourier Graph Operator (FGO) to perform matrix multiplications in Fourier space.
Our experiments on seven datasets have demonstrated superior performance with higher efficiency and fewer parameters compared with state-of-the-art methods.
arXiv Detail & Related papers (2023-11-10T17:13:26Z)
- Advective Diffusion Transformers for Topological Generalization in Graph Learning [69.2894350228753]
We show how graph diffusion equations extrapolate and generalize in the presence of varying graph topologies.
We propose a novel graph encoder backbone, Advective Diffusion Transformer (ADiT), inspired by advective graph diffusion equations.
arXiv Detail & Related papers (2023-10-10T08:40:47Z)
- Temporal Aggregation and Propagation Graph Neural Networks for Dynamic Representation [67.26422477327179]
Temporal graphs exhibit dynamic interactions between nodes over continuous time.
We propose a novel method of temporal graph convolution with the whole neighborhood.
Our proposed TAP-GNN outperforms existing temporal graph methods by a large margin in terms of both predictive performance and online inference latency.
arXiv Detail & Related papers (2023-04-15T08:17:18Z)
- Temporal Graph Neural Networks for Irregular Data [14.653008985229615]
TGNN4I model is designed to handle both irregular time steps and partial observations of the graph.
Time-continuous dynamics enables the model to make predictions at arbitrary time steps.
Experiments on simulated data and real-world data from traffic and climate modeling validate the usefulness of both the graph structure and time-continuous dynamics.
arXiv Detail & Related papers (2023-02-16T16:47:55Z)
- Graph Sequential Neural ODE Process for Link Prediction on Dynamic and Sparse Graphs [33.294977897987685]
Link prediction on dynamic graphs is an important task in graph mining.
Existing approaches based on dynamic graph neural networks (DGNNs) typically require a significant amount of historical data.
We propose a novel method based on the neural process, called Graph Sequential Neural ODE Process (GSNOP)
arXiv Detail & Related papers (2022-11-15T23:21:02Z)
- Explaining Dynamic Graph Neural Networks via Relevance Back-propagation [8.035521056416242]
Graph Neural Networks (GNNs) have shown remarkable effectiveness in capturing abundant information in graph-structured data.
The black-box nature of GNNs hinders users from understanding and trusting the models, thus leading to difficulties in their applications.
We propose DGExplainer to provide reliable explanation on dynamic GNNs.
arXiv Detail & Related papers (2022-07-22T16:20:34Z)
- Hyperbolic Variational Graph Neural Network for Modeling Dynamic Graphs [77.33781731432163]
We learn dynamic graph representation in hyperbolic space, for the first time, which aims to infer node representations.
We present a novel Hyperbolic Variational Graph Network, referred to as HVGNN.
In particular, to model the dynamics, we introduce a Temporal GNN (TGNN) based on a theoretically grounded time encoding approach.
arXiv Detail & Related papers (2021-04-06T01:44:15Z)
- Graph and graphon neural network stability [122.06927400759021]
Graph neural networks (GNNs) are learning architectures that rely on knowledge of the graph structure to generate meaningful representations of network data.
We analyze GNN stability using kernel objects called graphons.
arXiv Detail & Related papers (2020-10-23T16:55:56Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of this content (including all information) and is not responsible for any consequences.