Graph Sequential Neural ODE Process for Link Prediction on Dynamic and
Sparse Graphs
- URL: http://arxiv.org/abs/2211.08568v1
- Date: Tue, 15 Nov 2022 23:21:02 GMT
- Title: Graph Sequential Neural ODE Process for Link Prediction on Dynamic and
Sparse Graphs
- Authors: Linhao Luo, Reza Haffari, Shirui Pan
- Abstract summary: Link prediction on dynamic graphs is an important task in graph mining.
Existing approaches based on dynamic graph neural networks (DGNNs) typically require a significant amount of historical data.
We propose a novel method based on the neural process, called Graph Sequential Neural ODE Process (GSNOP).
- Score: 33.294977897987685
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Link prediction on dynamic graphs is an important task in graph mining.
Existing approaches based on dynamic graph neural networks (DGNNs) typically
require a significant amount of historical data (interactions over time), which
is not always available in practice. Missing links over time, a common
phenomenon in graph data, further aggravate the issue and thus create
extremely sparse and dynamic graphs. To address this problem, we propose a
novel method based on the neural process, called Graph Sequential Neural ODE
Process (GSNOP). Specifically, GSNOP combines the advantages of the neural
process and the neural ordinary differential equation, modeling link
prediction on dynamic graphs as a dynamically changing stochastic process. By
defining a distribution over functions, GSNOP introduces uncertainty into the
predictions, enabling it to generalize to more situations instead of
overfitting to the sparse data. GSNOP is also agnostic to model structure and
can be integrated with any DGNN to incorporate the chronological and
geometrical information for link prediction. Extensive experiments on three dynamic graph
datasets show that GSNOP can significantly improve the performance of existing
DGNNs and outperform other neural process variants.
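A minimal sketch of the GSNOP idea, assuming PyTorch (the paper's actual architecture, layer sizes, and ODE solver are not specified here, so every module below is an illustrative choice): context interactions are aggregated into a latent distribution as in a neural process, the latent state is evolved through time with a learned ODE (a fixed-step Euler integrator, to keep the sketch self-contained), and a decoder scores query links. In practice the node embeddings would come from any DGNN.

```python
import torch
import torch.nn as nn

class GSNOPSketch(nn.Module):
    """Illustrative neural-ODE-process sketch, not the authors' code."""
    def __init__(self, node_dim: int, latent_dim: int = 32):
        super().__init__()
        # Encodes one observed interaction (src, dst embeddings) to a feature.
        self.interaction_enc = nn.Sequential(
            nn.Linear(2 * node_dim, 64), nn.ReLU(), nn.Linear(64, 64))
        # Maps the aggregated context to the mean/log-variance of latent z.
        self.to_mu = nn.Linear(64, latent_dim)
        self.to_logvar = nn.Linear(64, latent_dim)
        # ODE dynamics dz/dt = f(z); integrated with fixed-step Euler below.
        self.ode_func = nn.Sequential(
            nn.Linear(latent_dim, 64), nn.Tanh(), nn.Linear(64, latent_dim))
        # Decoder: (z, src, dst) -> link probability.
        self.decoder = nn.Sequential(
            nn.Linear(latent_dim + 2 * node_dim, 64), nn.ReLU(),
            nn.Linear(64, 1))

    def encode(self, src, dst):
        # Permutation-invariant aggregation over context links (mean pooling).
        r = self.interaction_enc(torch.cat([src, dst], dim=-1)).mean(dim=0)
        return self.to_mu(r), self.to_logvar(r)

    def evolve(self, z, t1: float, steps: int = 10):
        # Euler integration of dz/dt = f(z) from t = 0 to t = t1.
        dt = t1 / steps
        for _ in range(steps):
            z = z + dt * self.ode_func(z)
        return z

    def forward(self, ctx_src, ctx_dst, qry_src, qry_dst, t1: float):
        mu, logvar = self.encode(ctx_src, ctx_dst)
        z = mu + torch.randn_like(mu) * (0.5 * logvar).exp()  # reparameterize
        z = self.evolve(z, t1)                                 # evolve in time
        z = z.expand(qry_src.size(0), -1)
        logits = self.decoder(torch.cat([z, qry_src, qry_dst], dim=-1))
        return torch.sigmoid(logits).squeeze(-1)

# Toy usage with random embeddings standing in for DGNN outputs.
d = 16
model = GSNOPSketch(node_dim=d)
ctx_src, ctx_dst = torch.randn(20, d), torch.randn(20, d)  # observed links
qry_src, qry_dst = torch.randn(5, d), torch.randn(5, d)    # query links
print(model(ctx_src, ctx_dst, qry_src, qry_dst, t1=1.0))
```

The reparameterized latent z is what realizes the distribution over functions and hence the uncertainty in the predictions, while integrating dz/dt through time is what makes the stochastic process dynamic rather than static.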
Related papers
- Gradient Transformation: Towards Efficient and Model-Agnostic Unlearning for Dynamic Graph Neural Networks [66.70786325911124]
Graph unlearning has emerged as an essential tool for safeguarding user privacy and mitigating the negative impacts of undesirable data.
With the increasing prevalence of DGNNs, it becomes imperative to investigate the implementation of dynamic graph unlearning.
We propose an effective, efficient, model-agnostic, and post-processing method to implement DGNN unlearning.
arXiv Detail & Related papers (2024-05-23T10:26:18Z) - Dynamic Causal Explanation Based Diffusion-Variational Graph Neural
Network for Spatio-temporal Forecasting [60.03169701753824]
We propose a novel Dynamic Diffusion-Variational Graph Neural Network (DVGNN) for spatio-temporal forecasting.
The proposed DVGNN model outperforms state-of-the-art approaches and achieves outstanding Root Mean Squared Error results.
arXiv Detail & Related papers (2023-05-16T11:38:19Z) - MentorGNN: Deriving Curriculum for Pre-Training GNNs [61.97574489259085]
We propose an end-to-end model named MentorGNN that aims to supervise the pre-training process of GNNs across graphs.
We shed new light on the problem of domain adaptation on relational data (i.e., graphs) by deriving a natural and interpretable upper bound on the generalization error of the pre-trained GNNs.
arXiv Detail & Related papers (2022-08-21T15:12:08Z) - Explaining Dynamic Graph Neural Networks via Relevance Back-propagation [8.035521056416242]
Graph Neural Networks (GNNs) have shown remarkable effectiveness in capturing abundant information in graph-structured data.
The black-box nature of GNNs hinders users from understanding and trusting the models, thus leading to difficulties in their applications.
We propose DGExplainer to provide reliable explanations for dynamic GNNs.
arXiv Detail & Related papers (2022-07-22T16:20:34Z) - Instant Graph Neural Networks for Dynamic Graphs [18.916632816065935]
We propose Instant Graph Neural Network (InstantGNN), an incremental approach for computing the graph representation matrix of dynamic graphs.
Our method avoids time-consuming, repetitive computations and allows instant updates on the representation and instant predictions.
Our model achieves state-of-the-art accuracy while having orders-of-magnitude higher efficiency than existing methods.
arXiv Detail & Related papers (2022-06-03T03:27:42Z) - Neural Graph Matching for Pre-training Graph Neural Networks [72.32801428070749]
Graph neural networks (GNNs) have shown powerful capacity at modeling structural data.
We present a novel Graph Matching based GNN Pre-Training framework, called GMPT.
The proposed method can be applied to fully self-supervised pre-training and coarse-grained supervised pre-training.
arXiv Detail & Related papers (2022-03-03T09:53:53Z) - Adaptive Kernel Graph Neural Network [21.863238974404474]
Graph neural networks (GNNs) have demonstrated great success in representation learning for graph-structured data.
In this paper, we propose a novel framework, namely Adaptive Kernel Graph Neural Network (AKGNN).
AKGNN is the first attempt to learn to adapt to the optimal graph kernel in a unified manner.
Experiments on acknowledged benchmark datasets show promising results that demonstrate the outstanding performance of the proposed AKGNN.
arXiv Detail & Related papers (2021-12-08T20:23:58Z) - Hyperbolic Variational Graph Neural Network for Modeling Dynamic Graphs [77.33781731432163]
For the first time, we learn dynamic graph representations in hyperbolic space, with the aim of inferring node representations.
We present a novel Hyperbolic Variational Graph Network, referred to as HVGNN.
In particular, to model the dynamics, we introduce a Temporal GNN (TGNN) based on a theoretically grounded time encoding approach.
arXiv Detail & Related papers (2021-04-06T01:44:15Z) - A Unified View on Graph Neural Networks as Graph Signal Denoising [49.980783124401555]
Graph Neural Networks (GNNs) have risen to prominence in learning representations for graph structured data.
In this work, we establish mathematically that the aggregation processes in a group of representative GNN models can be regarded as solving a graph denoising problem (a minimal numerical sketch of this equivalence appears after this list).
We instantiate a novel GNN model, ADA-UGNN, derived from UGNN, to handle graphs with adaptive smoothness across nodes.
arXiv Detail & Related papers (2020-10-05T04:57:18Z) - Stochastic Graph Recurrent Neural Network [6.656993023468793]
We propose SGRNN, a novel neural architecture that applies latent variables to simultaneously capture evolution in node attributes and topology.
Specifically, deterministic states are separated from stochastic states in the iterative process to suppress mutual interference.
Experiments on real-world datasets demonstrate the effectiveness of the proposed model.
arXiv Detail & Related papers (2020-09-01T16:14:30Z)
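The "graph signal denoising" entry above makes a concrete mathematical claim: GCN-style aggregation can be read as one gradient-descent step on a denoising objective. Below is a minimal numerical sketch of that equivalence, assuming the standard formulation min_F ||F - X||_F^2 + c * tr(F^T L F) with the symmetric-normalized Laplacian; the random graph and the NumPy code are illustrative, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
n, d = 6, 3
A = rng.integers(0, 2, size=(n, n))
A = np.triu(A, 1); A = A + A.T                 # random undirected graph
A_loop = A + np.eye(n)                          # add self-loops
D_inv_sqrt = np.diag(1.0 / np.sqrt(A_loop.sum(axis=1)))
A_hat = D_inv_sqrt @ A_loop @ D_inv_sqrt        # normalized adjacency
L = np.eye(n) - A_hat                           # normalized graph Laplacian
X = rng.normal(size=(n, d))                     # node features (the "signal")

c = 1.0
grad_at_X = 2 * (X - X) + 2 * c * (L @ X)       # gradient of objective at F = X
F_one_step = X - 0.5 * grad_at_X                # one step, step size 1/2

print(np.allclose(F_one_step, A_hat @ X))       # True: step == GCN aggregation
```

With step size 1/2 and c = 1, the step gives X - L X = A_hat X, i.e. exactly the normalized-adjacency aggregation; this is the special case that the unified denoising view generalizes.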