Efficient Neural Common Neighbor for Temporal Graph Link Prediction
- URL: http://arxiv.org/abs/2406.07926v1
- Date: Wed, 12 Jun 2024 06:45:03 GMT
- Title: Efficient Neural Common Neighbor for Temporal Graph Link Prediction
- Authors: Xiaohui Zhang, Yanbo Wang, Xiyuan Wang, Muhan Zhang
- Abstract summary: We propose TNCN, a temporal version of Neural Common Neighbor (NCN) for link prediction in temporal graphs.
TNCN dynamically updates a temporal neighbor dictionary for each node, and utilizes multi-hop common neighbors between the source and target node to learn a more effective pairwise representation.
We validate our model on five large-scale real-world datasets, and find that it achieves new state-of-the-art performance on three of them.
- Score: 32.41660611941389
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Temporal graphs are ubiquitous in real-world scenarios, such as social networks, trade, and transportation. Predicting dynamic links between nodes in a temporal graph is of vital importance. Traditional methods usually leverage the temporal neighborhood of interaction history to generate node embeddings first and then aggregate the source and target node embeddings to predict the link. However, such methods focus on learning individual node representations, overlook the pairwise nature of link prediction, and fail to capture important pairwise features of links such as common neighbors (CN). Motivated by the success of Neural Common Neighbor (NCN) for static graph link prediction, we propose TNCN, a temporal version of NCN for link prediction in temporal graphs. TNCN dynamically updates a temporal neighbor dictionary for each node, and utilizes multi-hop common neighbors between the source and target node to learn a more effective pairwise representation. We validate our model on five large-scale real-world datasets from the Temporal Graph Benchmark (TGB), and find that it achieves new state-of-the-art performance on three of them. Additionally, TNCN demonstrates excellent scalability on large datasets, outperforming popular GNN baselines by up to 6.4 times in speed. Our code is available at https://github.com/GraphPKU/TNCN.
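The abstract's core idea can be illustrated with a toy sketch (not the authors' implementation): maintain a per-node dictionary mapping each neighbor to its most recent interaction time, and extract the temporal common neighbors of a candidate source-target pair as a pairwise feature. All class and method names here are hypothetical.

```python
from collections import defaultdict

class TemporalNeighborIndex:
    """Toy sketch of the pairwise idea behind TNCN (hypothetical names,
    not the authors' code): keep a per-node dictionary mapping each
    neighbor to its last interaction time, and score a candidate link
    via its temporal common neighbors."""

    def __init__(self):
        self.neighbors = defaultdict(dict)  # node -> {neighbor: last_time}

    def observe(self, u, v, t):
        # Record an interaction (u, v) at time t in both directions.
        self.neighbors[u][v] = t
        self.neighbors[v][u] = t

    def common_neighbors(self, src, dst, t):
        # 1-hop common neighbors whose interactions happened strictly before t.
        src_nb = {n for n, tn in self.neighbors[src].items() if tn < t}
        dst_nb = {n for n, tn in self.neighbors[dst].items() if tn < t}
        return src_nb & dst_nb


idx = TemporalNeighborIndex()
idx.observe("a", "c", 1.0)
idx.observe("b", "c", 2.0)
idx.observe("a", "d", 3.0)

# At prediction time 4.0, node c is a common neighbor of a and b.
print(idx.common_neighbors("a", "b", 4.0))  # {'c'}
```

The actual model embeds these common neighbors with learned representations and extends the idea to multi-hop neighborhoods; this sketch only shows the dictionary update and set intersection at its core.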
Related papers
- Commute Graph Neural Networks [7.143879014059894]
We introduce Commute Graph Neural Networks (CGNN), an approach that seamlessly integrates node-wise commute time into the message passing scheme.
CGNN is an efficient method for computing commute time using a newly formulated digraph Laplacian.
It enables CGNN to directly capture the mutual, asymmetric relationships in digraphs.
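For intuition about the commute time quantity CGNN builds on, here is a minimal sketch of the classical computation on an undirected graph via the Laplacian pseudoinverse; note that CGNN itself uses a newly formulated digraph Laplacian to handle asymmetric relationships, which this sketch does not cover.

```python
import numpy as np

# Classical commute time on an undirected graph, for intuition only.
# C(u, v) = vol(G) * (L+[u, u] + L+[v, v] - 2 * L+[u, v]),
# where L+ is the Moore-Penrose pseudoinverse of the graph Laplacian
# and vol(G) is the sum of node degrees (twice the edge count).

A = np.array([[0, 1, 0],
              [1, 0, 1],
              [0, 1, 0]], dtype=float)  # path graph 0 - 1 - 2
L = np.diag(A.sum(axis=1)) - A          # graph Laplacian D - A
Lp = np.linalg.pinv(L)
vol = A.sum()                           # vol(G) = 4 for this path

def commute_time(u, v):
    return vol * (Lp[u, u] + Lp[v, v] - 2 * Lp[u, v])

print(commute_time(0, 1))  # 4.0: adjacent nodes on the path
print(commute_time(0, 2))  # 8.0: endpoints, twice the resistance
```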
arXiv Detail & Related papers (2024-06-30T10:53:40Z) - Edge Conditional Node Update Graph Neural Network for Multi-variate Time Series Anomaly Detection [35.299242563565315]
We introduce the Edge Conditional Node-update Graph Neural Network (ECNU-GNN)
Our model, equipped with an edge conditional node update module, dynamically transforms source node representations based on connected edges to better represent target nodes.
We validate performance on three real-world datasets: SWaT, WADI, and PSM.
arXiv Detail & Related papers (2024-01-25T00:47:44Z) - Temporal Aggregation and Propagation Graph Neural Networks for Dynamic Representation [67.26422477327179]
Temporal graphs exhibit dynamic interactions between nodes over continuous time.
We propose a novel method of temporal graph convolution with the whole neighborhood.
Our proposed TAP-GNN outperforms existing temporal graph methods by a large margin in terms of both predictive performance and online inference latency.
arXiv Detail & Related papers (2023-04-15T08:17:18Z) - Discovering the Representation Bottleneck of Graph Neural Networks from Multi-order Interactions [51.597480162777074]
Graph neural networks (GNNs) rely on the message passing paradigm to propagate node features and build interactions.
Recent works point out that different graph learning tasks require different ranges of interactions between nodes.
We study two common graph construction methods in scientific domains, i.e., K-nearest neighbor (KNN) graphs and fully-connected (FC) graphs.
arXiv Detail & Related papers (2022-05-15T11:38:14Z) - Node-wise Localization of Graph Neural Networks [52.04194209002702]
Graph neural networks (GNNs) emerge as a powerful family of representation learning models on graphs.
We propose a node-wise localization of GNNs by accounting for both global and local aspects of the graph.
We conduct extensive experiments on four benchmark graphs, and consistently obtain promising performance surpassing the state-of-the-art GNNs.
arXiv Detail & Related papers (2021-10-27T10:02:03Z) - Missing Data Estimation in Temporal Multilayer Position-aware Graph Neural Network (TMP-GNN) [5.936402320555635]
Temporal Multilayered Position-aware Graph Neural Network (TMP-GNN) is a node embedding approach for dynamic graphs.
We evaluate the performance of TMP-GNN on two different representations of temporal multilayered graphs.
We incorporate TMP-GNN into a deep learning framework to estimate missing data and compare its performance with that of competing GNNs.
arXiv Detail & Related papers (2021-08-07T08:32:40Z) - CatGCN: Graph Convolutional Networks with Categorical Node Features [99.555850712725]
CatGCN is tailored for graph learning when the node features are categorical.
We train CatGCN in an end-to-end fashion and demonstrate it on semi-supervised node classification.
arXiv Detail & Related papers (2020-09-11T09:25:17Z) - Binarized Graph Neural Network [65.20589262811677]
We develop a binarized graph neural network to learn the binary representations of the nodes with binary network parameters.
Our proposed method can be seamlessly integrated into the existing GNN-based embedding approaches.
Experiments indicate that the proposed binarized graph neural network, namely BGN, is orders of magnitude more efficient in terms of both time and space.
arXiv Detail & Related papers (2020-04-19T09:43:14Z) - Bilinear Graph Neural Network with Neighbor Interactions [106.80781016591577]
Graph Neural Network (GNN) is a powerful model to learn representations and make predictions on graph data.
We propose a new graph convolution operator, which augments the weighted sum with pairwise interactions of the representations of neighbor nodes.
We term this framework as Bilinear Graph Neural Network (BGNN), which improves GNN representation ability with bilinear interactions between neighbor nodes.
arXiv Detail & Related papers (2020-02-10T06:43:38Z)
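The bilinear aggregation described in the BGNN entry above can be sketched as follows (a simplified illustration, not the paper's exact operator): on top of the usual sum of neighbor representations, add the element-wise products of every pair of neighbors, computed efficiently with the factorization-machine identity sum_{i<j} h_i * h_j = 0.5 * ((sum_i h_i)^2 - sum_i h_i^2).

```python
import numpy as np

def bilinear_aggregate(H):
    """H: (num_neighbors, dim) array of neighbor representations.
    Returns the standard sum aggregation augmented with all pairwise
    element-wise interactions between neighbors, in O(n * d) time."""
    s = H.sum(axis=0)
    linear = s                                        # weighted-sum part
    pairwise = 0.5 * (s ** 2 - (H ** 2).sum(axis=0))  # all i<j products
    return linear + pairwise

H = np.array([[1.0, 2.0],
              [3.0, 0.0],
              [0.0, 1.0]])
print(bilinear_aggregate(H))  # [7. 5.]: sum [4, 3] plus pairwise [3, 2]
```

The real BGNN operator additionally applies learned weight matrices and degree normalization; this sketch isolates the pairwise-interaction trick that distinguishes it from plain sum aggregation.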
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences of its use.