An Open Challenge for Inductive Link Prediction on Knowledge Graphs
- URL: http://arxiv.org/abs/2203.01520v1
- Date: Thu, 3 Mar 2022 05:24:09 GMT
- Title: An Open Challenge for Inductive Link Prediction on Knowledge Graphs
- Authors: Mikhail Galkin, Max Berrendorf, Charles Tapley Hoyt
- Abstract summary: An emerging trend in representation learning over knowledge graphs (KGs) moves beyond transductive link prediction tasks.
Despite the growing interest, there are not enough benchmarks for evaluating inductive representation learning methods.
We introduce ILPC 2022, a novel open challenge on KG inductive link prediction.
- Score: 0.7960322329952452
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: An emerging trend in representation learning over knowledge graphs (KGs)
moves beyond transductive link prediction tasks over a fixed set of known
entities in favor of inductive tasks that imply training on one graph and
performing inference over a new graph with unseen entities. In inductive
setups, node features are often not available and training shallow entity
embedding matrices is meaningless as they cannot be used at inference time with
unseen entities. Despite the growing interest, there are not enough benchmarks
for evaluating inductive representation learning methods. In this work, we
introduce ILPC 2022, a novel open challenge on KG inductive link prediction. To
this end, we constructed two new datasets based on Wikidata with various sizes
of training and inference graphs that are much larger than existing inductive
benchmarks. We also provide two strong baselines leveraging recently proposed
inductive methods. We hope this challenge helps to streamline community efforts
in the inductive graph representation learning area. ILPC 2022 follows best
practices on evaluation fairness and reproducibility, and is available at
https://github.com/pykeen/ilpc2022.
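Benchmarks like ILPC 2022 are scored with rank-based link prediction metrics such as mean reciprocal rank (MRR) and Hits@k. As a hedged illustration of what that evaluation entails (this is not the challenge's official evaluator, which ships with the pykeen/ilpc2022 repository), here is a minimal pure-Python sketch of the standard "filtered" setting, where other known true tails are excluded before ranking:

```python
# Sketch of filtered rank-based evaluation for link prediction.
# For a query (h, r, ?), a model scores every candidate tail; the true
# tail is ranked against all candidates, with other known true tails
# filtered out so they do not count as errors.

def filtered_rank(scores, true_idx, known_idxs):
    """1-based rank of the true candidate, ignoring other known positives."""
    true_score = scores[true_idx]
    better = sum(
        1
        for i, s in enumerate(scores)
        if s > true_score and i != true_idx and i not in known_idxs
    )
    return better + 1

def mrr_and_hits(ranks, k=10):
    """Mean reciprocal rank and Hits@k over a list of ranks."""
    mrr = sum(1.0 / r for r in ranks) / len(ranks)
    hits = sum(1 for r in ranks if r <= k) / len(ranks)
    return mrr, hits

# Toy example: three queries, each scoring five candidate tails.
queries = [
    ([0.1, 0.9, 0.3, 0.2, 0.0], 1, set()),   # true tail has the top score
    ([0.5, 0.1, 0.8, 0.4, 0.2], 0, {2}),     # entity 2 is another true tail, filtered out
    ([0.3, 0.6, 0.1, 0.7, 0.2], 4, set()),   # three candidates outscore the true tail
]
ranks = [filtered_rank(s, t, f) for s, t, f in queries]
mrr, hits1 = mrr_and_hits(ranks, k=1)
print(ranks, mrr, hits1)  # → [1, 1, 4] 0.75 0.666...
```

The filtering step is what makes the second query rank 1 despite a higher-scoring candidate: that candidate is itself a known true answer, so it is not treated as a mistake.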
Related papers
- Extending Transductive Knowledge Graph Embedding Models for Inductive Logical Relational Inference (arXiv, 2023-09-07)
  This work bridges the gap between traditional transductive knowledge graph embedding approaches and more recent inductive relation prediction models. We introduce a generalized form of harmonic extension that leverages representations learned through transductive embedding methods to infer representations of new entities introduced at inference time, as in the inductive setting. In experiments on a number of large-scale knowledge graph embedding benchmarks, we find that this approach to extending the functionality of transductive knowledge graph embedding models is competitive with, and in some scenarios outperforms, several state-of-the-art models designed explicitly for such inductive tasks.
- SimTeG: A Frustratingly Simple Approach Improves Textual Graph Learning (arXiv, 2023-08-03)
  We present SimTeG, a frustratingly simple approach for textual graph learning. We first perform supervised parameter-efficient fine-tuning (PEFT) of a pre-trained language model (LM) on the downstream task. We then generate node embeddings from the last hidden states of the fine-tuned LM.
- Graph Condensation for Inductive Node Representation Learning (arXiv, 2023-07-29)
  We propose mapping-aware graph condensation (MCond), which integrates new nodes into the synthetic graph for inductive representation learning. On the Reddit dataset, MCond achieves up to a 121.5x inference speedup and a 55.9x reduction in storage requirements.
- Towards Few-shot Inductive Link Prediction on Knowledge Graphs: A Relational Anonymous Walk-guided Neural Process Approach (arXiv, 2023-06-26)
  Few-shot inductive link prediction on knowledge graphs aims to predict missing links for unseen entities from only a few observed links. Recent inductive methods use the subgraphs around unseen entities to obtain their semantics and predict links inductively. We propose a novel relational anonymous walk-guided neural process for few-shot inductive link prediction on knowledge graphs, denoted RawNP.
- RAILD: Towards Leveraging Relation Features for Inductive Link Prediction in Knowledge Graphs (arXiv, 2022-11-21)
  Relation Aware Inductive Link preDiction (RAILD) is proposed for knowledge graph completion. RAILD learns representations for both unseen entities and unseen relations.
- Inductive Logical Query Answering in Knowledge Graphs (arXiv, 2022-10-13)
  We study the inductive query answering task, where inference is performed on a graph containing new entities, with queries over both seen and unseen entities. We devise two mechanisms leveraging inductive node and relational structure representations powered by graph neural networks (GNNs). Experimentally, we show that inductive models can perform logical reasoning at inference time over unseen nodes, generalizing to graphs up to 500% larger than the training ones.
- Improving Inductive Link Prediction Using Hyper-Relational Facts (arXiv, 2021-07-10)
  We study the benefits of employing hyper-relational KGs on a wide range of semi- and fully inductive link prediction tasks powered by graph neural networks. Our experiments show that qualifiers over typed edges can lead to absolute performance improvements of up to 6%.
- Uniting Heterogeneity, Inductiveness, and Efficiency for Graph Representation Learning (arXiv, 2021-04-04)
  Graph neural networks (GNNs) have greatly advanced the performance of node representation learning on graphs. However, most GNNs are designed only for homogeneous graphs, which limits their adaptivity to the more informative heterogeneous graphs. We propose a novel inductive, meta-path-free message passing scheme that packs heterogeneous node features together with their associated edges from both low- and high-order neighbor nodes.
- Iterative Graph Self-Distillation (arXiv, 2020-10-23)
  We propose a novel unsupervised graph learning paradigm called Iterative Graph Self-Distillation (IGSD). IGSD iteratively performs teacher-student distillation with graph augmentations. We show significant and consistent performance gains on various graph datasets in both unsupervised and semi-supervised settings.
- Learning to Extrapolate Knowledge: Transductive Few-shot Out-of-Graph Link Prediction (arXiv, 2020-06-11)
  We introduce a realistic problem of few-shot out-of-graph link prediction. We tackle this problem with a novel transductive meta-learning framework. We validate our model on multiple benchmark datasets for knowledge graph completion and drug-drug interaction prediction.
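Several of the inductive methods summarized above reason over the local subgraph around a query's entities rather than over per-entity embeddings, which is what allows them to handle entities never seen during training. As a hedged sketch of that shared building block (the edge-list representation and the hypothetical helper names are illustrative assumptions, not any paper's actual code), here is a pure-Python extraction of the k-hop enclosing subgraph of a candidate link:

```python
from collections import deque

def k_hop_nodes(triples, seed, k):
    """Collect all nodes within k undirected hops of `seed`."""
    neighbors = {}
    for h, _, t in triples:
        neighbors.setdefault(h, set()).add(t)
        neighbors.setdefault(t, set()).add(h)
    seen = {seed}
    frontier = deque([(seed, 0)])
    while frontier:
        node, depth = frontier.popleft()
        if depth == k:
            continue  # do not expand past k hops
        for nb in neighbors.get(node, ()):
            if nb not in seen:
                seen.add(nb)
                frontier.append((nb, depth + 1))
    return seen

def enclosing_subgraph(triples, head, tail, k):
    """Triples whose endpoints lie in the intersection of the k-hop
    neighborhoods of head and tail (a common 'enclosing subgraph')."""
    nodes = k_hop_nodes(triples, head, k) & k_hop_nodes(triples, tail, k)
    return [(h, r, t) for h, r, t in triples if h in nodes and t in nodes]

# Toy knowledge graph as (head, relation, tail) triples.
triples = [
    ("a", "r1", "b"), ("b", "r2", "c"), ("c", "r1", "d"),
    ("a", "r3", "c"), ("d", "r2", "e"),
]
sub = enclosing_subgraph(triples, "a", "c", 1)
print(sub)  # → [('a', 'r1', 'b'), ('b', 'r2', 'c'), ('a', 'r3', 'c')]
```

A GNN can then score the candidate link from this subgraph's structure alone, so the same trained model transfers to an inference graph with entirely new entities.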
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of this information and is not responsible for any consequences arising from its use.