Mixture of Link Predictors
- URL: http://arxiv.org/abs/2402.08583v1
- Date: Tue, 13 Feb 2024 16:36:50 GMT
- Title: Mixture of Link Predictors
- Authors: Li Ma, Haoyu Han, Juanhui Li, Harry Shomer, Hui Liu, Xiaofeng Gao,
Jiliang Tang
- Abstract summary: Link prediction aims to forecast unseen connections in graphs.
Heuristic methods, leveraging a range of different pairwise measures, often rival the performance of vanilla Graph Neural Networks (GNNs).
- Score: 40.32089688353189
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Link prediction, which aims to forecast unseen connections in graphs, is a
fundamental task in graph machine learning. Heuristic methods, leveraging a
range of different pairwise measures such as common neighbors and shortest
paths, often rival the performance of vanilla Graph Neural Networks (GNNs).
Therefore, recent advancements in GNNs for link prediction (GNN4LP) have
primarily focused on integrating one or a few types of pairwise information. In
this work, we reveal that different node pairs within the same dataset
necessitate varied pairwise information for accurate prediction, and models that
apply the same pairwise information uniformly to all pairs can only achieve
suboptimal performance. As a result, we propose Link-MoE, a simple
mixture-of-experts model for link prediction. Link-MoE utilizes various GNNs as experts and
strategically selects the appropriate expert for each node pair based on
various types of pairwise information. Experimental results across diverse
real-world datasets demonstrate substantial performance improvement from
Link-MoE. Notably, Link-MoE achieves a relative improvement of 18.82% on the
MRR metric for the Pubmed dataset and 10.8% on the Hits@100 metric for the
ogbl-ppa dataset, compared to the best baselines.
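The gating idea in the abstract lends itself to a compact illustration. The following is a minimal sketch of a mixture-of-experts link scorer in the spirit of Link-MoE: a small gating MLP reads heuristic pairwise features (e.g. common-neighbor counts, shortest-path lengths) and weights the link scores produced by several GNN experts on a per-pair basis. The class names, tensor shapes, and toy data are illustrative assumptions, not the authors' released implementation.

```python
# Minimal mixture-of-experts link scorer in the spirit of Link-MoE (sketch).
# Assumed inputs: `pair_feats` holds heuristic pairwise features (e.g.
# common-neighbor counts, shortest-path lengths) and `expert_scores` holds
# link scores already produced by several pre-trained GNN experts.
import torch
import torch.nn as nn


class LinkMoEGate(nn.Module):
    """Gating network: maps pairwise heuristics to a weight per expert."""

    def __init__(self, num_pair_feats: int, num_experts: int, hidden: int = 64):
        super().__init__()
        self.mlp = nn.Sequential(
            nn.Linear(num_pair_feats, hidden),
            nn.ReLU(),
            nn.Linear(hidden, num_experts),
        )

    def forward(self, pair_feats: torch.Tensor) -> torch.Tensor:
        # Softmax over experts so the weights for each node pair sum to 1.
        return torch.softmax(self.mlp(pair_feats), dim=-1)


def moe_link_scores(pair_feats: torch.Tensor,
                    expert_scores: torch.Tensor,
                    gate: LinkMoEGate) -> torch.Tensor:
    """Combine per-expert link scores with pair-specific gating weights.

    pair_feats:    [num_pairs, num_pair_feats] heuristics per node pair
    expert_scores: [num_pairs, num_experts]    link scores from each expert
    """
    weights = gate(pair_feats)                    # [num_pairs, num_experts]
    return (weights * expert_scores).sum(dim=-1)  # [num_pairs]


# Toy usage with random tensors standing in for real heuristics and experts.
gate = LinkMoEGate(num_pair_feats=4, num_experts=3)
pair_feats = torch.rand(8, 4)
expert_scores = torch.rand(8, 3)
print(moe_link_scores(pair_feats, expert_scores, gate).shape)  # torch.Size([8])
```

In a full pipeline the experts would be pre-trained GNN4LP models and the gate would be trained on held-out edges; the sketch only captures the per-pair weighting step that distinguishes this design from applying one model uniformly.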
Related papers
- DA-MoE: Addressing Depth-Sensitivity in Graph-Level Analysis through Mixture of Experts [70.21017141742763]
Graph neural networks (GNNs) are gaining popularity for processing graph-structured data.
Existing methods generally use a fixed number of GNN layers to generate representations for all graphs.
We propose the depth-adaptive mixture of experts (DA-MoE) method, which incorporates two main improvements to GNNs.
arXiv Detail & Related papers (2024-11-05T11:46:27Z)
- PROXI: Challenging the GNNs for Link Prediction [3.8233569758620063]
We introduce PROXI, which leverages proximity information of node pairs in both graph and attribute spaces.
Standard machine learning (ML) models perform competitively, even outperforming cutting-edge GNN models.
We show that augmenting traditional GNNs with PROXI significantly boosts their link prediction performance.
arXiv Detail & Related papers (2024-10-02T17:57:38Z)
- Towards Better Graph-based Cross-document Relation Extraction via Non-bridge Entity Enhancement and Prediction Debiasing [30.204313638661255]
Cross-document Relation Extraction aims to predict the relation between target entities located in different documents.
We propose a novel graph-based cross-document RE model with non-bridge entity enhancement and prediction debiasing.
arXiv Detail & Related papers (2024-06-24T11:08:28Z)
- Revisiting Link Prediction: A Data Perspective [59.296773787387224]
Link prediction, a fundamental task on graphs, has proven indispensable in various applications, e.g., friend recommendation, protein analysis, and drug interaction prediction.
Evidence in existing literature underscores the absence of a universally best algorithm suitable for all datasets.
We recognize three fundamental factors critical to link prediction: local structural proximity, global structural proximity, and feature proximity.
arXiv Detail & Related papers (2023-10-01T21:09:59Z)
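As a side note on the entry above, the short sketch below computes one illustrative heuristic for each of the three factors it names: common neighbors for local structural proximity, shortest-path distance for global structural proximity, and cosine similarity of node attributes for feature proximity. The function and the specific feature choices are assumptions for illustration, not definitions taken from the paper.

```python
# Illustrative pairwise heuristics for the three proximity factors (sketch).
import networkx as nx
import numpy as np


def pair_heuristics(G: nx.Graph, feats: dict, u, v) -> dict:
    # Local structural proximity: number of common neighbors of u and v.
    common = len(set(G.neighbors(u)) & set(G.neighbors(v)))

    # Global structural proximity: shortest-path length (inf if disconnected).
    try:
        dist = nx.shortest_path_length(G, u, v)
    except nx.NetworkXNoPath:
        dist = float("inf")

    # Feature proximity: cosine similarity between node attribute vectors.
    fu, fv = np.asarray(feats[u]), np.asarray(feats[v])
    cos = float(fu @ fv / (np.linalg.norm(fu) * np.linalg.norm(fv) + 1e-12))

    return {"common_neighbors": common, "sp_distance": dist, "feat_cosine": cos}


# Toy example on a 5-node path graph with random node features.
G = nx.path_graph(5)
feats = {n: np.random.rand(8) for n in G.nodes}
print(pair_heuristics(G, feats, 0, 2))
```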
- A Simple and Scalable Graph Neural Network for Large Directed Graphs [11.792826520370774]
We investigate various combinations of node representations and edge direction awareness within an input graph.
In response, we propose a simple yet holistic classification method, A2DUG.
We demonstrate that A2DUG stably performs well on various datasets and improves the accuracy by up to 11.29 compared with the state-of-the-art methods.
arXiv Detail & Related papers (2023-06-14T06:24:58Z)
- Revisiting Neighborhood-based Link Prediction for Collaborative Filtering [3.7403495150710384]
Collaborative filtering is one of the most successful and fundamental techniques in recommendation systems.
We propose a new linkage (connectivity) score for bipartite graphs, generalizing multiple standard link prediction methods.
We demonstrate our approach significantly outperforms existing state-of-the-art GNN-based CF approaches on four widely used benchmarks.
arXiv Detail & Related papers (2022-03-29T17:48:05Z)
- Exploiting Neighbor Effect: Conv-Agnostic GNNs Framework for Graphs with Heterophily [58.76759997223951]
We propose a new metric based on von Neumann entropy to re-examine the heterophily problem of GNNs.
We also propose a Conv-Agnostic GNN framework (CAGNNs) to enhance the performance of most GNNs on heterophily datasets.
arXiv Detail & Related papers (2022-03-19T14:26:43Z)
- Node Feature Extraction by Self-Supervised Multi-scale Neighborhood Prediction [123.20238648121445]
We propose a new self-supervised learning framework, Graph Information Aided Node feature exTraction (GIANT).
GIANT makes use of the eXtreme Multi-label Classification (XMC) formalism, which is crucial for fine-tuning the language model based on graph information.
We demonstrate the superior performance of GIANT over the standard GNN pipeline on Open Graph Benchmark datasets.
arXiv Detail & Related papers (2021-10-29T19:55:12Z)
- Meta-Aggregator: Learning to Aggregate for 1-bit Graph Neural Networks [127.32203532517953]
We develop a vanilla 1-bit framework that binarizes both the GNN parameters and the graph features.
Despite the lightweight architecture, we observed that this vanilla framework suffered from insufficient discriminative power in distinguishing graph topologies.
This discovery motivates us to devise meta aggregators to improve the expressive power of vanilla binarized GNNs.
arXiv Detail & Related papers (2021-09-27T08:50:37Z)
- MHNF: Multi-hop Heterogeneous Neighborhood information Fusion graph representation learning [0.0]
We propose a Multi-hop Heterogeneous Neighborhood information Fusion graph representation learning method (MHNF).
We first propose a hybrid metapath autonomous extraction model to efficiently extract multi-hop hybrid neighbors.
Then, we propose a hop-level heterogeneous information aggregation model, which selectively aggregates different-hop neighborhood information.
Finally, a hierarchical semantic attention fusion model (HSAF) is proposed, which can efficiently integrate different-hop and different-path neighborhood information respectively.
arXiv Detail & Related papers (2021-06-17T07:51:45Z)
- Breaking the Limit of Graph Neural Networks by Improving the Assortativity of Graphs with Local Mixing Patterns [19.346133577539394]
Graph neural networks (GNNs) have achieved tremendous success on multiple graph-based learning tasks.
We focus on transforming the input graph into a computation graph which contains both proximity and structural information.
We show that adaptively choosing between structure and proximity leads to improved performance under diverse mixing.
arXiv Detail & Related papers (2021-06-11T19:18:34Z)
- Deepened Graph Auto-Encoders Help Stabilize and Enhance Link Prediction [11.927046591097623]
Link prediction is a relatively under-studied graph learning task, with current state-of-the-art models based on one- or two-layer shallow graph auto-encoder (GAE) architectures.
In this paper, we focus on addressing a limitation of current methods for link prediction, which can only use shallow GAEs and variational GAEs.
Our proposed methods innovatively incorporate standard auto-encoders (AEs) into the architectures of GAEs, where standard AEs are leveraged to learn essential, low-dimensional representations by seamlessly integrating the adjacency information and node features.
arXiv Detail & Related papers (2021-03-21T14:43:10Z)
- Learning Intents behind Interactions with Knowledge Graph for Recommendation [93.08709357435991]
Knowledge graph (KG) plays an increasingly important role in recommender systems.
Existing GNN-based models fail to identify user-item relations at a fine-grained level of intents.
We propose a new model, Knowledge Graph-based Intent Network (KGIN).
arXiv Detail & Related papers (2021-02-14T03:21:36Z)
- Policy-GNN: Aggregation Optimization for Graph Neural Networks [60.50932472042379]
Graph neural networks (GNNs) aim to model the local graph structures and capture the hierarchical patterns by aggregating the information from neighbors.
It is a challenging task to develop an effective aggregation strategy for each node, given complex graphs and sparse features.
We propose Policy-GNN, a meta-policy framework that models the sampling procedure and message passing of GNNs into a combined learning process.
arXiv Detail & Related papers (2020-06-26T17:03:06Z)