EasyDGL: Encode, Train and Interpret for Continuous-time Dynamic Graph
Learning
- URL: http://arxiv.org/abs/2303.12341v1
- Date: Wed, 22 Mar 2023 06:35:08 GMT
- Title: EasyDGL: Encode, Train and Interpret for Continuous-time Dynamic Graph
Learning
- Authors: Chao Chen, Haoyu Geng, Nianzu Yang, Xiaokang Yang and Junchi Yan
- Abstract summary: This paper aims to design an easy-to-use pipeline (termed EasyDGL) composed of three key modules with both strong fitting ability and interpretability.
EasyDGL can effectively quantify the predictive power of the frequency content that a model learns from the evolving graph data.
- Score: 114.72818205974285
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Dynamic graphs arise in various real-world applications, and it is often
welcomed to model the dynamics directly in continuous time domain for its
flexibility. This paper aims to design an easy-to-use pipeline (termed
EasyDGL, a name that also reflects its implementation with the DGL toolkit)
composed of three key modules with both strong fitting ability and
interpretability. Specifically, the proposed pipeline involves encoding,
training and interpreting: i) a temporal point process (TPP) modulated attention
architecture to endow the continuous-time resolution with the coupled
spatiotemporal dynamics of the observed graph with edge-addition events; ii) a
principled loss composed of task-agnostic TPP posterior maximization based on
observed events on the graph, and a task-aware loss with a masking strategy
over the dynamic graph, where the covered tasks include dynamic link prediction,
dynamic node classification and node traffic forecasting; iii) interpretation
of the model outputs (e.g., representations and predictions) with scalable
perturbation-based quantitative analysis in the graph Fourier domain, which
could more comprehensively reflect the behavior of the learned model. Extensive
experimental results on public benchmarks show the superior performance of our
EasyDGL for time-conditioned predictive tasks, and in particular demonstrate
that EasyDGL can effectively quantify the predictive power of frequency content
that a model learns from the evolving graph data.
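The abstract's interpretation module, perturbation-based analysis in the graph Fourier domain, can be sketched roughly as follows. This is a hedged illustration only, not the EasyDGL implementation: it uses plain NumPy on a small static graph, and the `model_score` callable is a hypothetical stand-in for whatever scalar metric the trained model produces.

```python
import numpy as np

def graph_fourier_basis(adj):
    """Eigendecompose the combinatorial Laplacian L = D - A.
    Eigenvectors ordered by ascending eigenvalue form the graph
    Fourier basis (small eigenvalues = low graph frequencies)."""
    deg = np.diag(adj.sum(axis=1))
    eigvals, eigvecs = np.linalg.eigh(deg - adj)
    return eigvals, eigvecs

def band_perturbation_scores(signal, eigvecs, model_score, n_bands=4):
    """Zero out one frequency band at a time and measure the drop in
    the model's score; a large drop suggests that band carries much
    of the predictive power."""
    spectrum = eigvecs.T @ signal          # graph Fourier transform
    base = model_score(signal)
    bands = np.array_split(np.arange(len(spectrum)), n_bands)
    drops = []
    for band in bands:
        pert = spectrum.copy()
        pert[band] = 0.0                   # remove this frequency band
        drops.append(base - model_score(eigvecs @ pert))  # inverse GFT
    return drops
```

For a model dominated by low-frequency (smooth) signal, zeroing the lowest band would produce the largest score drop; attributing predictive power to frequency bands in this way is the kind of quantitative analysis the abstract refers to.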
Related papers
- Gradient Transformation: Towards Efficient and Model-Agnostic Unlearning for Dynamic Graph Neural Networks [66.70786325911124]
Graph unlearning has emerged as an essential tool for safeguarding user privacy and mitigating the negative impacts of undesirable data.
With the increasing prevalence of dynamic graph neural networks (DGNNs), it becomes imperative to investigate the implementation of dynamic graph unlearning.
We propose an effective, efficient, model-agnostic, and post-processing method to implement DGNN unlearning.
arXiv Detail & Related papers (2024-05-23T10:26:18Z)
- Temporal Graph ODEs for Irregularly-Sampled Time Series [32.68671699403658]
We introduce the Temporal Graph Ordinary Differential Equation (TG-ODE) framework, which learns both the temporal and spatial dynamics from graph streams where the intervals between observations are not regularly spaced.
We empirically validate the proposed approach on several graph benchmarks, showing that TG-ODE can achieve state-of-the-art performance in irregular graph stream tasks.
arXiv Detail & Related papers (2024-04-30T12:43:11Z)
- TimeGraphs: Graph-based Temporal Reasoning [64.18083371645956]
TimeGraphs is a novel approach that characterizes dynamic interactions as a hierarchical temporal graph.
Our approach models the interactions using a compact graph-based representation, enabling adaptive reasoning across diverse time scales.
We evaluate TimeGraphs on multiple datasets with complex, dynamic agent interactions, including a football simulator, the Resistance game, and the MOMA human activity dataset.
arXiv Detail & Related papers (2024-01-06T06:26:49Z)
- Backbone-based Dynamic Graph Spatio-Temporal Network for Epidemic Forecasting [3.382729969842304]
Accurate epidemic forecasting is a critical task in controlling disease transmission.
Many deep learning-based models focus only on static or dynamic graphs when constructing spatial information.
We propose a novel model called the Backbone-based Dynamic Graph Spatio-Temporal Network (BDGSTN).
arXiv Detail & Related papers (2023-12-01T10:34:03Z)
- Structure-reinforced Transformer for Dynamic Graph Representation Learning with Edge Temporal States [8.577434144370004]
We introduce a novel dynamic graph representation learning framework, namely the Recurrent Structure-reinforced Graph Transformer (RSGT).
RSGT first models the temporal status of edges explicitly, assigning different edge types and weights based on the differences between any two consecutive snapshots.
A structure-reinforced graph transformer is proposed to capture temporal node representations that encode both the graph topological structure and its evolving dynamics.
arXiv Detail & Related papers (2023-04-20T04:12:50Z)
- Multivariate Time Series Forecasting with Dynamic Graph Neural ODEs [65.18780403244178]
We propose a continuous model to forecast Multivariate Time series with dynamic Graph neural Ordinary Differential Equations (MTGODE).
Specifically, we first abstract multivariate time series into dynamic graphs with time-evolving node features and unknown graph structures.
Then, we design and solve a neural ODE to complement missing graph topologies and unify both spatial and temporal message passing.
arXiv Detail & Related papers (2022-02-17T02:17:31Z)
- Efficient Dynamic Graph Representation Learning at Scale [66.62859857734104]
We propose Efficient Dynamic Graph lEarning (EDGE), which selectively expresses certain temporal dependencies via the training loss to improve parallelism in computation.
We show that EDGE can scale to dynamic graphs with millions of nodes and hundreds of millions of temporal events and achieve new state-of-the-art (SOTA) performance.
arXiv Detail & Related papers (2021-12-14T22:24:53Z)
- Dynamic Graph Representation Learning via Graph Transformer Networks [41.570839291138114]
We propose a Transformer-based dynamic graph learning method named the Dynamic Graph Transformer (DGT).
DGT has spatial-temporal encoding to effectively learn graph topology and capture implicit links.
We show that DGT presents superior performance compared with several state-of-the-art baselines.
arXiv Detail & Related papers (2021-11-19T21:44:23Z)
- TCL: Transformer-based Dynamic Graph Modelling via Contrastive Learning [87.38675639186405]
We propose a novel graph neural network approach, called TCL, which deals with the dynamically-evolving graph in a continuous-time fashion.
To the best of our knowledge, this is the first attempt to apply contrastive learning to representation learning on dynamic graphs.
arXiv Detail & Related papers (2021-05-17T15:33:25Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.