RAILD: Towards Leveraging Relation Features for Inductive Link Prediction In Knowledge Graphs
- URL: http://arxiv.org/abs/2211.11407v1
- Date: Mon, 21 Nov 2022 12:35:30 GMT
- Title: RAILD: Towards Leveraging Relation Features for Inductive Link Prediction In Knowledge Graphs
- Authors: Genet Asefa Gesese, Harald Sack, Mehwish Alam
- Abstract summary: Relation Aware Inductive Link preDiction (RAILD) is proposed for Knowledge Graph completion.
RAILD learns representations for both unseen entities and unseen relations.
- Score: 1.5469452301122175
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Due to the open world assumption, Knowledge Graphs (KGs) are never complete.
In order to address this issue, various Link Prediction (LP) methods have
been proposed so far. Some of these methods are inductive LP models which are
capable of learning representations for entities not seen during training.
However, to the best of our knowledge, none of the existing inductive LP
models focus on learning representations for unseen relations. In this work,
a novel model, Relation Aware Inductive Link preDiction (RAILD), is proposed
for KG completion; it learns representations for both unseen entities and
unseen relations. In addition to leveraging textual literals associated with
both entities and relations by employing language models, RAILD also
introduces a novel graph-based approach to generate features for relations.
Experiments are conducted on both existing and newly created challenging
benchmark datasets, and the results indicate that RAILD improves over
state-of-the-art models. Moreover, since no existing inductive LP model
learns representations for unseen relations, we have created our own
baselines, and RAILD also outperforms these baselines.
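The abstract states that RAILD obtains features for entities and relations from their textual literals by employing language models, so that unseen relations can still be represented. The following is a minimal sketch of that general idea, assuming a pre-trained BERT encoder from the Hugging Face transformers library and made-up relation descriptions; it is not the paper's actual pipeline.

```python
# Minimal sketch: encode relation (or entity) textual literals with a
# pre-trained language model to obtain feature vectors that are also
# available for relations never seen during training.
# Assumptions: Hugging Face transformers, bert-base-uncased, and made-up
# relation descriptions; this is not RAILD's actual pipeline.
import torch
from transformers import AutoModel, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
encoder = AutoModel.from_pretrained("bert-base-uncased")

relation_descriptions = {
    "birthPlace": "the place where a person was born",
    "educatedAt": "the institution at which a person studied",  # may be unseen in training
}

@torch.no_grad()
def encode_texts(texts):
    """Mean-pool the language model's last hidden states over non-padding tokens."""
    batch = tokenizer(texts, padding=True, truncation=True, return_tensors="pt")
    hidden = encoder(**batch).last_hidden_state                # (batch, seq_len, dim)
    mask = batch["attention_mask"].unsqueeze(-1).float()       # (batch, seq_len, 1)
    return (hidden * mask).sum(dim=1) / mask.sum(dim=1)        # (batch, dim)

relation_features = encode_texts(list(relation_descriptions.values()))
print(relation_features.shape)  # torch.Size([2, 768])
```

Because such features are derived from text rather than from per-relation embedding rows learned on the training graph, a scoring function built on top of them can in principle be applied to relations that never appear during training.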
Related papers
- Inference over Unseen Entities, Relations and Literals on Knowledge Graphs [1.7474352892977463]
Knowledge graph embedding models have been successfully applied in the transductive setting to tackle various challenging tasks.
We propose the attentive byte-pair encoding layer (BytE) to construct a triple embedding from a sequence of byte-pair encoded subword units of entities and relations.
BytE leads to massive feature reuse via weight tying, since it forces a knowledge graph embedding model to learn embeddings for subword units instead of entities and relations directly; a rough sketch of this subword-embedding idea appears after this entry.
arXiv Detail & Related papers (2024-10-09T10:20:54Z)
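As a rough illustration of the subword-unit idea described in the BytE entry above (not the paper's actual attentive architecture), the sketch below byte-pair encodes entity and relation names with the GPT-2 tokenizer, embeds the shared subword vocabulary in a single table, and mean-pools the subword vectors into entity and relation representations that a standard scorer such as DistMult can consume; all names and dimensions are illustrative assumptions.

```python
# Rough sketch of subword-based triple scoring (illustrative only, not BytE itself):
# entities and relations are represented by pooling the embeddings of their
# byte-pair encoded subword units, so a single embedding table is shared
# (weight tying) instead of one embedding row per entity or relation.
import torch
import torch.nn as nn
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("gpt2")        # a byte-pair encoding tokenizer
subword_emb = nn.Embedding(tokenizer.vocab_size, 64)     # shared subword embedding table

def pooled_embedding(name: str) -> torch.Tensor:
    """Mean-pool the subword embeddings of an entity or relation label into one vector."""
    ids = torch.tensor(tokenizer.encode(name))
    return subword_emb(ids).mean(dim=0)

def distmult_score(head: str, relation: str, tail: str) -> torch.Tensor:
    """Score a triple with a simple DistMult-style product of pooled embeddings."""
    h, r, t = (pooled_embedding(x) for x in (head, relation, tail))
    return (h * r * t).sum()

print(distmult_score("Albert Einstein", "educated at", "ETH Zurich"))
```

Weight tying here simply means that every entity and relation reuses the same subword embedding table, so the number of learned parameters no longer grows with the number of entities and relations.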
- Introducing Diminutive Causal Structure into Graph Representation Learning [19.132025125620274]
We introduce a novel method that enables Graph Neural Networks (GNNs) to glean insights from specialized diminutive causal structures.
Our method specifically extracts causal knowledge from the model representation of these diminutive causal structures.
arXiv Detail & Related papers (2024-06-13T00:18:20Z)
- zrLLM: Zero-Shot Relational Learning on Temporal Knowledge Graphs with Large Language Models [33.10218179341504]
We use large language models to generate relation representations for embedding-based temporal knowledge graph forecasting (TKGF) methods.
We show that our approach helps TKGF models to achieve much better performance in forecasting the facts with previously unseen relations.
arXiv Detail & Related papers (2023-11-15T21:25:15Z)
- Learning Complete Topology-Aware Correlations Between Relations for Inductive Link Prediction [121.65152276851619]
We show that semantic correlations between relations are inherently edge-level and entity-independent.
We propose a novel subgraph-based method, namely TACO, to model Topology-Aware COrrelations between relations.
To further exploit the potential of RCN, we propose the Complete Common Neighbor induced subgraph.
arXiv Detail & Related papers (2023-09-20T08:11:58Z)
- Towards Few-shot Inductive Link Prediction on Knowledge Graphs: A Relational Anonymous Walk-guided Neural Process Approach [49.00753238429618]
Few-shot inductive link prediction on knowledge graphs aims to predict missing links for unseen entities with few-shot links observed.
Recent inductive methods utilize the sub-graphs around unseen entities to obtain the semantics and predict links inductively.
We propose a novel relational anonymous walk-guided neural process for few-shot inductive link prediction on knowledge graphs, denoted as RawNP.
arXiv Detail & Related papers (2023-06-26T12:02:32Z)
- Message Intercommunication for Inductive Relation Reasoning [49.731293143079455]
We develop a novel inductive relation reasoning model called MINES.
We introduce a Message Intercommunication mechanism on the Neighbor-Enhanced Subgraph.
Our experiments show that MINES outperforms existing state-of-the-art models.
arXiv Detail & Related papers (2023-05-23T13:51:46Z)
- Knowledge Graph Completion with Counterfactual Augmentation [23.20561746976504]
We introduce a counterfactual question: "would the relation still exist if the neighborhood of entities became different from observation?"
With a carefully designed instantiation of a causal model on the knowledge graph, we generate the counterfactual relations to answer the question.
We incorporate the created counterfactual relations into GNN-based frameworks on KGs to augment their learning of entity pair representations.
arXiv Detail & Related papers (2023-02-25T14:08:15Z)
- Schema-aware Reference as Prompt Improves Data-Efficient Knowledge Graph Construction [57.854498238624366]
We propose a retrieval-augmented approach, which retrieves schema-aware Reference As Prompt (RAP) for data-efficient knowledge graph construction.
RAP can dynamically leverage schema and knowledge inherited from human-annotated and weak-supervised data as a prompt for each sample.
arXiv Detail & Related papers (2022-10-19T16:40:28Z)
- Relational Message Passing for Fully Inductive Knowledge Graph Completion [37.29833710603933]
In knowledge graph completion (KGC), predicting triples involving emerging entities and/or relations, which are unseen when KG embeddings are learned, has become a critical challenge.
Subgraph reasoning with message passing is a promising and popular solution.
We propose a new method named RMPI which uses a novel relational message passing network for fully inductive KGC.
arXiv Detail & Related papers (2022-10-08T10:35:52Z)
- Exploring the Limits of Few-Shot Link Prediction in Knowledge Graphs [49.6661602019124]
We study a spectrum of models derived by generalizing the current state of the art for few-shot link prediction.
We find that a simple zero-shot baseline - which ignores any relation-specific information - achieves surprisingly strong performance.
Experiments on carefully crafted synthetic datasets show that having only a few examples of a relation fundamentally limits models from using fine-grained structural information.
arXiv Detail & Related papers (2021-02-05T21:04:31Z)
- Generative Adversarial Zero-Shot Relational Learning for Knowledge Graphs [96.73259297063619]
We consider a novel formulation, zero-shot learning, to avoid this cumbersome curation.
For newly-added relations, we attempt to learn their semantic features from their text descriptions.
We leverage Generative Adversarial Networks (GANs) to establish the connection between the text and knowledge graph domains; a rough sketch of this idea appears after this entry.
arXiv Detail & Related papers (2020-01-08T01:19:08Z)
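The last entry describes using GANs to connect the text domain and the knowledge graph embedding domain for unseen relations. Below is a rough sketch of that adversarial idea under assumed module sizes and names: a generator maps text-derived relation features to candidate relation embeddings, and a discriminator tries to distinguish them from embeddings of seen relations. It is not the architecture or training recipe from the paper.

```python
# Illustrative sketch of text-to-embedding adversarial learning for unseen relations
# (hypothetical dimensions and modules; not the architecture from the paper).
import torch
import torch.nn as nn

TEXT_DIM, EMB_DIM = 768, 200   # text feature size and KG embedding size (assumed)

generator = nn.Sequential(      # maps a relation's text features to a relation embedding
    nn.Linear(TEXT_DIM, 256), nn.ReLU(), nn.Linear(256, EMB_DIM)
)
discriminator = nn.Sequential(  # distinguishes real from generated relation embeddings
    nn.Linear(EMB_DIM, 128), nn.ReLU(), nn.Linear(128, 1)
)
bce = nn.BCEWithLogitsLoss()
opt_g = torch.optim.Adam(generator.parameters(), lr=1e-4)
opt_d = torch.optim.Adam(discriminator.parameters(), lr=1e-4)

def train_step(text_feats, real_rel_embs):
    """One adversarial step on a batch of (text feature, seen-relation embedding) pairs."""
    fake = generator(text_feats)

    # Discriminator: push real embeddings toward 1 and generated embeddings toward 0.
    d_loss = bce(discriminator(real_rel_embs), torch.ones(len(real_rel_embs), 1)) + \
             bce(discriminator(fake.detach()), torch.zeros(len(fake), 1))
    opt_d.zero_grad()
    d_loss.backward()
    opt_d.step()

    # Generator: fool the discriminator into scoring generated embeddings as real.
    g_loss = bce(discriminator(fake), torch.ones(len(fake), 1))
    opt_g.zero_grad()
    g_loss.backward()
    opt_g.step()
    return d_loss.item(), g_loss.item()

# Toy usage with random tensors standing in for text features and seen-relation embeddings.
print(train_step(torch.randn(8, TEXT_DIM), torch.randn(8, EMB_DIM)))
```

At inference time, only the generator would be needed: the text description of a newly added relation is encoded into features and mapped to an embedding that the link prediction scorer can use directly.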
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information it provides and is not responsible for any consequences.