An Explainer for Temporal Graph Neural Networks
- URL: http://arxiv.org/abs/2209.00807v1
- Date: Fri, 2 Sep 2022 04:12:40 GMT
- Title: An Explainer for Temporal Graph Neural Networks
- Authors: Wenchong He, Minh N. Vu, Zhe Jiang, My T. Thai
- Abstract summary: Temporal graph neural networks (TGNNs) have been widely used for modeling time-evolving graph-related tasks.
We propose a novel explainer framework for TGNN models.
- Score: 27.393641343203363
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Temporal graph neural networks (TGNNs) have been widely used for modeling
time-evolving graph-related tasks due to their ability to capture both graph
topology dependency and non-linear temporal dynamics. The explanation of TGNNs
is of vital importance for a transparent and trustworthy model. However, the
complex topology structure and temporal dependency make explaining TGNN models
very challenging. In this paper, we propose a novel explainer framework for
TGNN models. Given a time series on a graph to be explained, the framework can
identify dominant explanations in the form of a probabilistic graphical model
over a time period. Case studies in the transportation domain demonstrate that
the proposed approach can discover dynamic dependency structures in a road
network over a given time period.
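The framework itself outputs explanations as a probabilistic graphical model, which is beyond the scope of a short snippet. As a rough, hypothetical illustration of the perturbation-style intuition behind explaining a temporal GNN (not the authors' actual algorithm), one can score each time-stamped edge by how much masking it changes the model's prediction:

```python
def edge_importance(model, snapshots, edges, target):
    """Score each temporal edge by the prediction drop caused by masking it.

    All names are hypothetical stand-ins, not the paper's API:
    `model(snapshots)` returns a prediction vector, `snapshots` is a list of
    (adjacency, features) pairs over the time window (NumPy arrays), `edges`
    lists candidate (t, i, j) tuples, and `target` indexes the prediction
    being explained.
    """
    base = model(snapshots)[target]
    scores = {}
    for (t, i, j) in edges:
        # copy the window and zero out a single edge at a single time step
        perturbed = [(A.copy(), X) for (A, X) in snapshots]
        perturbed[t][0][i, j] = 0.0
        scores[(t, i, j)] = base - model(perturbed)[target]
    return scores  # larger drop => the edge matters more for this prediction
```

Edges with the largest scores would then form a candidate explanation subgraph for the chosen time window.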
Related papers
- On The Temporal Domain of Differential Equation Inspired Graph Neural Networks [14.779420473274737]
We show that our model, called TDE-GNN, can capture a wide range of temporal dynamics that go beyond typical first- or second-order methods.
On several graph benchmarks, we demonstrate the benefit of learning the temporal dependencies with our method rather than using pre-defined temporal dynamics.
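As a minimal sketch of what "learning the temporal dynamics" can mean, here is a hypothetical update that mixes the last k hidden states with learnable coefficients instead of fixing a first- or second-order scheme (an illustration only, not TDE-GNN's actual layer):

```python
import numpy as np

def higher_order_step(history, coeffs, propagate):
    """Mix the k most recent node-state matrices with learnable weights.

    `history` holds the k latest hidden states (newest last), `coeffs` are
    the learnable temporal mixing weights, and `propagate` is any graph
    layer, e.g. H -> A @ H @ W. Hypothetical sketch, not the paper's model.
    """
    mix = sum(a * H for a, H in zip(coeffs, history))
    return mix + propagate(history[-1])

# toy usage: 5 nodes, 4-dim states, a third-order temporal scheme
rng = np.random.default_rng(0)
A = rng.random((5, 5))
W = 0.1 * rng.random((4, 4))
history = [rng.random((5, 4)) for _ in range(3)]
coeffs = np.array([0.1, 0.3, 0.6])  # would be learned in practice
print(higher_order_step(history, coeffs, lambda H: A @ H @ W).shape)  # (5, 4)
```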
arXiv Detail & Related papers (2024-01-20T01:12:57Z)
- DyExplainer: Explainable Dynamic Graph Neural Networks [37.16783248212211]
We present DyExplainer, a novel approach to explaining dynamic Graph Neural Networks (GNNs) on the fly.
DyExplainer trains a dynamic GNN backbone to extract representations of the graph at each snapshot.
We also augment our approach with contrastive learning techniques to provide prior-guided regularization.
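A minimal sketch of what a contrastive, prior-guided regularizer on snapshot embeddings could look like, with hypothetical inputs (this is not DyExplainer's actual loss): embeddings of the same node in consecutive snapshots are pulled together, unrelated nodes pushed apart.

```python
import numpy as np

def temporal_contrast_loss(z_t, z_prev, z_neg, margin=0.5):
    """Hinge-style contrastive term: each row of z_t should be closer (in
    cosine similarity) to the same node's embedding in the previous snapshot
    (z_prev) than to a negative sample (z_neg). Hypothetical stand-in, not
    the paper's exact regularizer."""
    def cos(a, b):
        return (a * b).sum(-1) / (np.linalg.norm(a, axis=-1)
                                  * np.linalg.norm(b, axis=-1) + 1e-9)
    return np.maximum(0.0, margin - cos(z_t, z_prev) + cos(z_t, z_neg)).mean()
```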
arXiv Detail & Related papers (2023-10-25T05:26:33Z)
- Dynamic Causal Explanation Based Diffusion-Variational Graph Neural Network for Spatio-temporal Forecasting [60.03169701753824]
We propose a novel Dynamic Diffusion-Variational Graph Neural Network (DVGNN) for spatio-temporal forecasting.
The proposed DVGNN model outperforms state-of-the-art approaches and achieves outstanding Root Mean Squared Error results.
arXiv Detail & Related papers (2023-05-16T11:38:19Z)
- Temporal Aggregation and Propagation Graph Neural Networks for Dynamic Representation [67.26422477327179]
Temporal graphs exhibit dynamic interactions between nodes over continuous time.
We propose a novel method of temporal graph convolution over the whole neighborhood.
Our proposed TAP-GNN outperforms existing temporal graph methods by a large margin in terms of both predictive performance and online inference latency.
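A minimal sketch of aggregating over the whole temporal neighborhood, with hypothetical data structures (a list of timestamped interactions), rather than TAP-GNN's actual aggregation and propagation operators:

```python
import numpy as np

def whole_neighborhood_agg(events, node_feats, node, t):
    """Mean-pool features of every historical neighbor of `node` up to time t.

    `events` is a list of (u, v, timestamp) interactions and `node_feats`
    maps node ids to feature vectors; both are hypothetical stand-ins used
    only to make the 'whole neighborhood' idea concrete."""
    nbrs = [v if u == node else u
            for (u, v, ts) in events
            if ts <= t and node in (u, v)]
    if not nbrs:
        return np.zeros_like(node_feats[node])
    return np.mean([node_feats[v] for v in nbrs], axis=0)
```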
arXiv Detail & Related papers (2023-04-15T08:17:18Z)
- Graph Neural Networks for temporal graphs: State of the art, open challenges, and opportunities [15.51428011794213]
Graph Neural Networks (GNNs) have become the leading paradigm for learning on (static) graph-structured data.
In recent years, GNN-based models for temporal graphs have emerged as a promising area of research to extend the capabilities of GNNs.
We provide the first comprehensive overview of the current state of the art of temporal GNNs, introducing a rigorous formalization of learning settings and tasks.
We conclude the survey with a discussion of the most relevant open challenges for the field, from both research and application perspectives.
arXiv Detail & Related papers (2023-02-02T11:12:51Z)
- Graph-Time Convolutional Neural Networks: Architecture and Theoretical Analysis [12.995632804090198]
We introduce Graph-Time Convolutional Neural Networks (GTCNNs) as a principled architecture to aid learning.
The approach can work with any type of product graph, and we also introduce a parametric product graph to learn the spatio-temporal coupling.
Extensive numerical results on benchmarks corroborate our findings and show that the GTCNN compares favorably with state-of-the-art solutions.
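For intuition, a product graph couples a temporal graph over the T time steps with the spatial graph over N vertices; the Cartesian product is one standard choice (a sketch of the underlying construction, not the GTCNN architecture itself):

```python
import numpy as np

def cartesian_product_graph(S_time, S_graph):
    """Graph-time Cartesian product: S = S_T kron I_N + I_T kron S_G.

    Each product-graph node is a (time step, vertex) pair; edges connect
    spatial neighbors within a snapshot and the same vertex across
    consecutive time steps."""
    T, N = S_time.shape[0], S_graph.shape[0]
    return np.kron(S_time, np.eye(N)) + np.kron(np.eye(T), S_graph)

# toy usage: a 3-step path graph in time, a 4-node cycle in space
S_T = np.diag(np.ones(2), 1) + np.diag(np.ones(2), -1)
S_G = np.roll(np.eye(4), 1, axis=1) + np.roll(np.eye(4), -1, axis=1)
print(cartesian_product_graph(S_T, S_G).shape)  # (12, 12)
```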
arXiv Detail & Related papers (2022-06-30T10:20:52Z)
- EIGNN: Efficient Infinite-Depth Graph Neural Networks [51.97361378423152]
Graph neural networks (GNNs) are widely used for modelling graph-structured data in numerous applications.
Motivated by the limited range of finite-depth GNNs, we propose a GNN model with infinite depth, which we call Efficient Infinite-Depth Graph Neural Networks (EIGNN).
We show that EIGNN has a better ability to capture long-range dependencies than recent baselines, and consistently achieves state-of-the-art performance.
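"Infinite depth" here means computing the equilibrium of a weight-tied propagation layer. EIGNN evaluates this limit efficiently in closed form; the naive fixed-point iteration below is shown only to make the idea concrete (a sketch under a contraction assumption, not the paper's solver):

```python
import numpy as np

def infinite_depth_forward(A, X, W, gamma=0.8, iters=50):
    """Iterate Z <- gamma * A @ Z @ W + X until (approximate) equilibrium.

    The limit plays the role of an infinitely deep weight-tied GNN. The map
    must be a contraction (gamma * ||A|| * ||W|| < 1) for the fixed point to
    exist; EIGNN itself computes the limit without naive iteration."""
    Z = X.copy()
    for _ in range(iters):
        Z = gamma * A @ Z @ W + X
    return Z

# toy usage with a contractive propagation
rng = np.random.default_rng(0)
A = rng.random((6, 6))
A /= np.linalg.norm(A, 2)          # rescale to spectral norm 1
X = rng.random((6, 3))
W = 0.5 * np.eye(3)                # gamma * 1 * 0.5 < 1 => contraction
print(infinite_depth_forward(A, X, W).shape)  # (6, 3)
```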
arXiv Detail & Related papers (2022-02-22T08:16:58Z)
- A Unified View on Graph Neural Networks as Graph Signal Denoising [49.980783124401555]
Graph Neural Networks (GNNs) have risen to prominence in learning representations for graph structured data.
In this work, we establish mathematically that the aggregation processes in a group of representative GNN models can be regarded as solving a graph denoising problem.
We instantiate a novel GNN model, ADA-UGNN, derived from UGNN, to handle graphs with adaptive smoothness across nodes.
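To see the correspondence, take the denoising objective min_F ||F - X||^2 + c * tr(F^T L F) with graph Laplacian L: one gradient step from F = X with step size 1/2 yields F = (I - cL)X, a GCN-style smoothing aggregation. A small numeric check of this step (generic math for intuition, not the paper's ADA-UGNN model):

```python
import numpy as np

def denoising_step(X, L, c=1.0, lr=0.5):
    """One gradient step on ||F - X||_F^2 + c * tr(F^T L F), from F = X.

    For symmetric L the gradient at F is 2*(F - X) + 2*c*L @ F, so with
    lr = 0.5 the step gives (I - c*L) @ X, i.e. a GCN-style smoothing."""
    F = X.copy()
    grad = 2.0 * (F - X) + 2.0 * c * L @ F
    return F - lr * grad

# check the closed form on a toy graph Laplacian
A = np.array([[0., 1., 0.], [1., 0., 1.], [0., 1., 0.]])
L = np.diag(A.sum(1)) - A
X = np.random.default_rng(0).random((3, 2))
assert np.allclose(denoising_step(X, L, c=0.2), (np.eye(3) - 0.2 * L) @ X)
```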
arXiv Detail & Related papers (2020-10-05T04:57:18Z)
- Implicit Graph Neural Networks [46.0589136729616]
We propose a graph learning framework called Implicit Graph Neural Networks (IGNN).
IGNNs consistently capture long-range dependencies and outperform state-of-the-art GNN models.
arXiv Detail & Related papers (2020-09-14T06:04:55Z)
- Connecting the Dots: Multivariate Time Series Forecasting with Graph Neural Networks [91.65637773358347]
We propose a general graph neural network framework designed specifically for multivariate time series data.
Our approach automatically extracts the uni-directed relations among variables through a graph learning module.
Our proposed model outperforms the state-of-the-art baseline methods on 3 of 4 benchmark datasets.
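A sketch of how a graph learning module can extract uni-directed relations from node embeddings, in the spirit of the paper's construction (the exact formula and names are reconstructed here, so treat them as an assumption): the antisymmetric difference of two embedding products guarantees each learned relation has a single direction.

```python
import numpy as np

def learn_unidirected_adjacency(E1, E2, alpha=3.0, k=4):
    """A = ReLU(tanh(alpha * (M1 @ M2.T - M2 @ M1.T))), with top-k sparsity.

    Because the inner difference is antisymmetric, A[i, j] > 0 forces
    A[j, i] == 0: every extracted relation is uni-directed. E1, E2 are
    learnable node-embedding tables (rows = nodes)."""
    M1, M2 = np.tanh(alpha * E1), np.tanh(alpha * E2)
    A = np.maximum(0.0, np.tanh(alpha * (M1 @ M2.T - M2 @ M1.T)))
    for row in A:                       # keep the k strongest out-neighbors
        row[np.argsort(row)[:-k]] = 0.0
    return A

rng = np.random.default_rng(42)
A = learn_unidirected_adjacency(rng.normal(size=(10, 8)),
                                rng.normal(size=(10, 8)))
assert not np.any((A > 0) & (A.T > 0))  # no relation is kept in both directions
```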
arXiv Detail & Related papers (2020-05-24T04:02:18Z)
- Gated Graph Recurrent Neural Networks [176.3960927323358]
We introduce Graph Recurrent Neural Networks (GRNNs) as a general learning framework for graph processes.
To address the problem of vanishing gradients, we put forward GRNNs with three different gating mechanisms: time, node and edge gates.
The numerical results also show that GRNNs outperform GNNs and RNNs, highlighting the importance of taking both the temporal and graph structures of a graph process into account.
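A minimal sketch of a gated graph recurrent cell, with a single gate standing in for the paper's time, node, and edge gates (a hypothetical simplification, not GRNN's published cell):

```python
import numpy as np

def grnn_step(A, x_t, h_prev, Wx, Wh, Wg):
    """One recurrent step on a graph process: inputs and hidden state are
    both diffused over the graph A, and a learned sigmoid gate controls how
    much of the (propagated) past state is kept -- a simplified stand-in
    for GRNN's time/node/edge gating, meant to curb vanishing gradients."""
    sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))
    gate = sigmoid(A @ h_prev @ Wg)            # gate computed from the past
    return np.tanh(A @ x_t @ Wx + gate * (A @ h_prev @ Wh))
```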
arXiv Detail & Related papers (2020-02-03T22:35:14Z)
This list is automatically generated from the titles and abstracts of the papers on this site.