Disentangled Dynamic Graph Deep Generation
- URL: http://arxiv.org/abs/2010.07276v2
- Date: Wed, 20 Jan 2021 03:58:21 GMT
- Title: Disentangled Dynamic Graph Deep Generation
- Authors: Wenbin Zhang, Liming Zhang, Dieter Pfoser and Liang Zhao
- Abstract summary: This paper proposes a novel framework of factorized deep generative models to achieve interpretable dynamic graph generation.
Various generative models are proposed to characterize conditional independence among node, edge, static, and dynamic factors.
Experiments on multiple datasets demonstrate the effectiveness of the proposed models.
- Score: 10.934180735890727
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Deep generative models for graphs have exhibited promising performance in
ever-increasing domains such as the design of molecules (i.e., graphs of atoms) and
structure prediction of proteins (i.e., graphs of amino acids). Existing work
typically focuses on static rather than dynamic graphs, even though dynamic graphs
are central to applications such as protein folding, molecular reactions, and
human mobility. Extending existing deep generative models from static to
dynamic graphs is challenging: it requires handling the factorization of static
and dynamic characteristics as well as the mutual interactions among node and
edge patterns. This paper proposes a novel
framework of factorized deep generative models to achieve interpretable dynamic
graph generation. Various generative models are proposed to characterize
conditional independence among node, edge, static, and dynamic factors. Then,
variational optimization strategies as well as dynamic graph decoders are
proposed based on newly designed factorized variational autoencoders and
recurrent graph deconvolutions. Extensive experiments on multiple datasets
demonstrate the effectiveness of the proposed models.
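To make the factorization described in the abstract concrete, the following is a minimal, hypothetical sketch (not the authors' released code) of the general recipe: a variational autoencoder whose latent space is split into a per-sequence static factor and per-snapshot dynamic factors, with a recurrent decoder emitting one adjacency matrix per time step. The layer sizes, the dense-adjacency representation, and the loss weighting are assumptions made purely for illustration.

```python
# Hypothetical sketch: a factorized VAE for dynamic graph generation.
# A static factor is encoded once per sequence; dynamic factors are encoded
# per snapshot; a recurrent decoder emits one adjacency matrix per time step.
import torch
import torch.nn as nn
import torch.nn.functional as F

class FactorizedDynamicGraphVAE(nn.Module):
    def __init__(self, num_nodes: int, z_static: int = 16, z_dynamic: int = 16, hidden: int = 64):
        super().__init__()
        in_dim = num_nodes * num_nodes                        # flattened dense adjacency (assumption)
        self.snapshot_enc = nn.GRU(in_dim, hidden, batch_first=True)
        self.static_head = nn.Linear(hidden, 2 * z_static)    # mu, logvar for the static factor
        self.dynamic_head = nn.Linear(hidden, 2 * z_dynamic)  # mu, logvar per time step
        self.dec_rnn = nn.GRU(z_static + z_dynamic, hidden, batch_first=True)
        self.dec_out = nn.Linear(hidden, in_dim)               # edge logits per snapshot

    @staticmethod
    def reparameterize(mu, logvar):
        return mu + torch.randn_like(mu) * torch.exp(0.5 * logvar)

    def forward(self, adj_seq):
        # adj_seq: (batch, time, num_nodes, num_nodes) with entries in {0, 1}
        b, t, n, _ = adj_seq.shape
        x = adj_seq.reshape(b, t, n * n)
        h_seq, h_last = self.snapshot_enc(x)                   # per-step and final hidden states
        mu_s, logvar_s = self.static_head(h_last[-1]).chunk(2, dim=-1)
        mu_d, logvar_d = self.dynamic_head(h_seq).chunk(2, dim=-1)
        z_s = self.reparameterize(mu_s, logvar_s)              # one static code per sequence
        z_d = self.reparameterize(mu_d, logvar_d)              # one dynamic code per snapshot
        z = torch.cat([z_s.unsqueeze(1).expand(-1, t, -1), z_d], dim=-1)
        dec_h, _ = self.dec_rnn(z)
        logits = self.dec_out(dec_h).reshape(b, t, n, n)
        recon = F.binary_cross_entropy_with_logits(logits, adj_seq, reduction="mean")
        kl = -0.5 * torch.mean(1 + logvar_s - mu_s.pow(2) - logvar_s.exp())
        kl = kl - 0.5 * torch.mean(1 + logvar_d - mu_d.pow(2) - logvar_d.exp())
        return recon + kl, torch.sigmoid(logits)

# Toy usage on random 10-node graph sequences.
model = FactorizedDynamicGraphVAE(num_nodes=10)
seq = torch.randint(0, 2, (4, 5, 10, 10)).float()
loss, edge_probs = model(seq)
loss.backward()
```

In the paper itself the decoder is built from recurrent graph deconvolutions and the factors follow explicit conditional-independence structure among node, edge, static, and dynamic factors; the sketch only illustrates the static/dynamic split with generic recurrent layers.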
Related papers
- Dynamic and Textual Graph Generation Via Large-Scale LLM-based Agent Simulation [70.60461609393779]
GraphAgent-Generator (GAG) is a novel simulation-based framework for dynamic graph generation.
Our framework effectively replicates seven macro-level structural characteristics in established network science theories.
It supports generating graphs with up to nearly 100,000 nodes or 10 million edges, with a minimum speed-up of 90.4%.
arXiv Detail & Related papers (2024-10-13T12:57:08Z) - Provable Tensor Completion with Graph Information [49.08648842312456]
We introduce a novel model, theory, and algorithm for solving the dynamic graph regularized tensor completion problem.
We develop a comprehensive model simultaneously capturing the low-rank and similarity structure of the tensor.
In terms of theory, we showcase the alignment between the proposed graph smoothness regularization and a weighted tensor nuclear norm.
arXiv Detail & Related papers (2023-10-04T02:55:10Z) - Learning graph geometry and topology using dynamical systems based message-passing [21.571006438656323]
We introduce DYMAG: a message passing paradigm for GNNs built on the expressive power of graph-dynamics.
DYMAG makes use of complex graph dynamics based on the heat and wave equation as well as a more complex equation which admits chaotic solutions.
We demonstrate that DYMAG achieves superior performance in recovering the generating parameters of Erdős-Rényi and block random graphs.
arXiv Detail & Related papers (2023-09-18T16:39:51Z) - Dynamic Causal Explanation Based Diffusion-Variational Graph Neural
Network for Spatio-temporal Forecasting [60.03169701753824]
We propose a novel Dynamic Diffusion-Variational Graph Neural Network (DVGNN) for spatio-temporal forecasting.
The proposed DVGNN model outperforms state-of-the-art approaches and achieves outstanding Root Mean Squared Error results.
arXiv Detail & Related papers (2023-05-16T11:38:19Z) - Decoupled Graph Neural Networks for Large Dynamic Graphs [14.635923016087503]
We propose a decoupled graph neural network for large dynamic graphs.
We show that our algorithm achieves state-of-the-art performance in both kinds of dynamic graphs.
arXiv Detail & Related papers (2023-05-14T23:00:10Z) - Learning Dynamic Graph Embeddings with Neural Controlled Differential
Equations [21.936437653875245]
This paper focuses on representation learning for dynamic graphs with temporal interactions.
We propose a generic differential model for dynamic graphs that characterises the continuously dynamic evolution of node embedding trajectories.
Our framework exhibits several desirable characteristics, including the ability to express dynamics on evolving graphs without integration by segments.
arXiv Detail & Related papers (2023-02-22T12:59:38Z) - Graph Generation with Diffusion Mixture [57.78958552860948]
Generation of graphs is a major challenge for real-world tasks that require understanding the complex nature of their non-Euclidean structures.
We propose a generative framework that models the topology of graphs by explicitly learning the final graph structures of the diffusion process.
arXiv Detail & Related papers (2023-02-07T17:07:46Z) - Time-aware Dynamic Graph Embedding for Asynchronous Structural Evolution [60.695162101159134]
Existing works merely view a dynamic graph as a sequence of changes.
We formulate dynamic graphs as temporal edge sequences associated with the joining time of each vertex and the timespan of each edge (ToE).
A time-aware Transformer is proposed to embed vertices' dynamic connections and ToEs into the learned vertex representations.
arXiv Detail & Related papers (2022-07-01T15:32:56Z) - TCL: Transformer-based Dynamic Graph Modelling via Contrastive Learning [87.38675639186405]
We propose a novel graph neural network approach, called TCL, which deals with the dynamically-evolving graph in a continuous-time fashion.
To the best of our knowledge, this is the first attempt to apply contrastive learning to representation learning on dynamic graphs.
arXiv Detail & Related papers (2021-05-17T15:33:25Z) - Learning Attribute-Structure Co-Evolutions in Dynamic Graphs [28.848851822725933]
We present a novel framework called CoEvoGNN for modeling dynamic attributed graph sequences.
It preserves the impact of earlier graphs on the current graph by generating embeddings sequentially through the sequence.
It has a temporal self-attention mechanism to model long-range dependencies in the evolution.
arXiv Detail & Related papers (2020-07-25T20:07:28Z) - EvoNet: A Neural Network for Predicting the Evolution of Dynamic Graphs [26.77596449192451]
We propose a model that predicts the evolution of dynamic graphs.
Specifically, we use a graph neural network along with a recurrent architecture to capture the temporal evolution patterns of dynamic graphs.
We evaluate the proposed model on several artificial datasets following common network evolving dynamics, as well as on real-world datasets.
arXiv Detail & Related papers (2020-03-02T12:59:05Z)
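Several of the related papers above, EvoNet most directly, share the generic recipe of pairing a graph neural network snapshot encoder with a recurrent model over time. The sketch below is a hypothetical illustration of that recipe, not any of these papers' actual architectures; the mean-neighbor aggregation, the per-node GRU, and the inner-product edge scoring are assumptions made for illustration.

```python
# Hypothetical sketch of a "GNN snapshot encoder + recurrent model" predictor:
# encode each snapshot with simple neighborhood aggregation, track node states
# over time with a GRU, and score edges for the next snapshot.
import torch
import torch.nn as nn

class SnapshotEncoder(nn.Module):
    """One round of mean-neighbor aggregation followed by a linear map."""
    def __init__(self, in_dim: int, out_dim: int):
        super().__init__()
        self.lin = nn.Linear(in_dim, out_dim)

    def forward(self, adj, feats):
        deg = adj.sum(-1, keepdim=True).clamp(min=1.0)
        return torch.relu(self.lin(adj @ feats / deg))         # (nodes, out_dim)

class EvolutionPredictor(nn.Module):
    """Predicts edge probabilities of the next snapshot from past snapshots."""
    def __init__(self, feat_dim: int, hidden: int = 32):
        super().__init__()
        self.encoder = SnapshotEncoder(feat_dim, hidden)
        self.rnn = nn.GRU(hidden, hidden, batch_first=True)

    def forward(self, adj_seq, feats):
        # adj_seq: (time, nodes, nodes); feats: (nodes, feat_dim), static node features
        node_states = torch.stack([self.encoder(a, feats) for a in adj_seq])  # (time, nodes, hidden)
        out, _ = self.rnn(node_states.transpose(0, 1))          # GRU over time, per node
        h_last = out[:, -1]                                     # (nodes, hidden)
        return torch.sigmoid(h_last @ h_last.t())               # next-step edge probabilities

# Toy usage: 5 snapshots of a random 12-node graph with 8-dimensional node features.
model = EvolutionPredictor(feat_dim=8)
adj_seq = (torch.rand(5, 12, 12) > 0.7).float()
feats = torch.randn(12, 8)
next_adj_probs = model(adj_seq, feats)                          # (12, 12) probabilities
```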
This list is automatically generated from the titles and abstracts of the papers on this site.