Better Together: Enhancing Generative Knowledge Graph Completion with
Language Models and Neighborhood Information
- URL: http://arxiv.org/abs/2311.01326v1
- Date: Thu, 2 Nov 2023 15:38:39 GMT
- Authors: Alla Chepurova, Aydar Bulatov, Yuri Kuratov, Mikhail Burtsev
- Abstract summary: Real-world Knowledge Graphs (KGs) often suffer from incompleteness, which limits their potential performance.
Traditional Knowledge Graph Completion (KGC) methods are computationally intensive and impractical for large-scale KGs.
We propose to include node neighborhoods as additional information to improve KGC methods based on language models.
- Score: 5.5577202588167625
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Real-world Knowledge Graphs (KGs) often suffer from incompleteness, which
limits their potential performance. Knowledge Graph Completion (KGC) techniques
aim to address this issue. However, traditional KGC methods are computationally
intensive and impractical for large-scale KGs, as they require learning dense
node embeddings and computing pairwise distances between them. Generative
transformer-based language models (e.g., T5 and recent KGT5) offer a promising
solution as they can predict the tail nodes directly. In this study, we propose
to include node neighborhoods as additional information to improve KGC methods
based on language models. We examine the effects of this neighborhood augmentation and show
that, on both inductive and transductive Wikidata subsets, our method
outperforms KGT5 and conventional KGC approaches. We also provide an extensive
analysis of the impact of neighborhood on model prediction and show its
importance. Furthermore, we point the way to significantly improve KGC through
more effective neighborhood selection.
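The core idea of the method can be sketched in a few lines: a generative model receives the (head, relation, ?) query verbalized as text, with facts from the head entity's neighborhood serialized into the same input, and decodes the tail entity directly instead of scoring all candidates. The template and function names below are an illustrative assumption, not the paper's exact serialization format.

```python
def verbalize_query(head, relation, neighborhood):
    """Serialize a KG link-prediction query into a text prompt.

    `neighborhood` is a list of (relation, entity) pairs describing known
    facts about the head entity. A seq2seq model (e.g., T5/KGT5-style)
    would be trained to generate the tail entity from this string.
    """
    # Join neighborhood facts into a compact context segment.
    context = "; ".join(f"{r}: {e}" for r, e in neighborhood)
    return f"predict tail: {head} | {relation} | context: {context}"


query = verbalize_query(
    "Douglas Adams",
    "educated at",
    [("occupation", "writer"), ("country of citizenship", "United Kingdom")],
)
print(query)
```

Under this framing, neighborhood selection reduces to choosing which (relation, entity) pairs to place in the context segment, which is the lever the abstract points to for further improvement.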
Related papers
- Multi-perspective Improvement of Knowledge Graph Completion with Large Language Models [95.31941227776711]
We propose MPIKGC to compensate for the deficiency of contextualized knowledge and improve KGC by querying large language models (LLMs).
We conducted extensive evaluation of our framework based on four description-based KGC models and four datasets, for both link prediction and triplet classification tasks.
arXiv Detail & Related papers (2024-03-04T12:16:15Z)
- Contextualization Distillation from Large Language Model for Knowledge Graph Completion [51.126166442122546]
We introduce the Contextualization Distillation strategy, a plug-in-and-play approach compatible with both discriminative and generative KGC frameworks.
Our method begins by instructing large language models to transform compact, structural triplets into context-rich segments.
Comprehensive evaluations across diverse datasets and KGC techniques highlight the efficacy and adaptability of our approach.
arXiv Detail & Related papers (2024-01-22T06:47:00Z)
- Graph Condensation: A Survey [49.41718583061147]
The rapid growth of graph data poses significant challenges in storage, transmission, and particularly the training of graph neural networks (GNNs).
To address these challenges, graph condensation (GC) has emerged as an innovative solution.
GC focuses on a compact yet highly representative graph, enabling GNNs trained on it to achieve performance comparable to those trained on the original large graph.
arXiv Detail & Related papers (2024-01-04T14:19:37Z)
- Path-based Explanation for Knowledge Graph Completion [17.541247786437484]
Proper explanations for the results of GNN-based Knowledge Graph Completion models increase model transparency.
Existing practices for explaining KGC tasks rely on instance/subgraph-based approaches.
We propose Power-Link, the first path-based KGC explainer that explores GNN-based models.
arXiv Detail & Related papers (2023-10-12T12:31:23Z)
- Enhancing Text-based Knowledge Graph Completion with Zero-Shot Large Language Models: A Focus on Semantic Enhancement [8.472388165833292]
We introduce a framework termed constrained prompts for KGC (CP-KGC).
This framework designs prompts that adapt to different datasets to enhance semantic richness.
This study extends the performance limits of existing models and promotes further integration of KGC with large language models.
arXiv Detail & Related papers (2023-05-22T10:29:52Z)
- DEGREE: Decomposition Based Explanation For Graph Neural Networks [55.38873296761104]
We propose DEGREE to provide a faithful explanation for GNN predictions.
By decomposing the information generation and aggregation mechanism of GNNs, DEGREE allows tracking the contributions of specific components of the input graph to the final prediction.
We also design a subgraph level interpretation algorithm to reveal complex interactions between graph nodes that are overlooked by previous methods.
arXiv Detail & Related papers (2022-08-19T03:33:45Z)
- GreenKGC: A Lightweight Knowledge Graph Completion Method [32.528770408502396]
GreenKGC aims to discover missing relationships between entities in knowledge graphs.
It consists of three modules: representation learning, feature pruning, and decision learning.
In low dimensions, GreenKGC can outperform SOTA methods on most datasets.
arXiv Detail & Related papers (2022-07-14T10:16:56Z)
- Explainable Sparse Knowledge Graph Completion via High-order Graph Reasoning Network [111.67744771462873]
This paper proposes a novel explainable model for sparse Knowledge Graphs (KGs).
It incorporates high-order reasoning into a graph convolutional network, named HoGRN.
It can not only improve the generalization ability to mitigate the information insufficiency issue but also provide interpretability.
arXiv Detail & Related papers (2022-02-08T11:36:18Z)
- Rethinking Graph Convolutional Networks in Knowledge Graph Completion [83.25075514036183]
Graph convolutional networks (GCNs) have been increasingly popular in knowledge graph completion (KGC).
In this paper, we build upon representative GCN-based KGC models and introduce variants to find which factor of GCNs is critical in KGC.
We propose a simple yet effective framework named LTE-KGE, which equips existing KGE models with linearly transformed entity embeddings.
arXiv Detail & Related papers (2022-02-08T11:36:18Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.