KERMIT: Knowledge Graph Completion of Enhanced Relation Modeling with Inverse Transformation
- URL: http://arxiv.org/abs/2309.14770v2
- Date: Sat, 3 Aug 2024 13:34:24 GMT
- Title: KERMIT: Knowledge Graph Completion of Enhanced Relation Modeling with Inverse Transformation
- Authors: Haotian Li, Bin Yu, Yuliang Wei, Kai Wang, Richard Yi Da Xu, Bailing Wang
- Abstract summary: We use large language models to generate coherent descriptions, bridging the semantic gap between queries and answers.
We also utilize inverse relations to create a symmetric graph, thereby providing augmented training samples for KGC.
Our approach achieves a 4.2% improvement in Hit@1 on WN18RR and a 3.4% improvement in Hit@3 on FB15k-237, demonstrating superior performance.
- Score: 19.31783654838732
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Knowledge graph completion (KGC) revolves around populating missing triples in a knowledge graph using available information. Text-based methods, which depend on textual descriptions of triples, often encounter difficulties when these descriptions lack sufficient information for accurate prediction, an issue inherent to the datasets and not easily resolved through modeling alone. To address this and ensure data consistency, we first use large language models (LLMs) to generate coherent descriptions, bridging the semantic gap between queries and answers. Second, we utilize inverse relations to create a symmetric graph, thereby providing augmented training samples for KGC. Additionally, we employ the label information inherent in knowledge graphs (KGs) to enhance the existing contrastive framework, making it fully supervised. These efforts have led to significant performance improvements on the WN18RR and FB15k-237 datasets. According to standard evaluation metrics, our approach achieves a 4.2% improvement in Hit@1 on WN18RR and a 3.4% improvement in Hit@3 on FB15k-237, demonstrating superior performance.
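Of the three ingredients above, the inverse-relation augmentation is the most mechanical, and a small sketch makes it concrete. The snippet below (a minimal illustration with a hypothetical relation-naming scheme, not the authors' code) shows how every triple (h, r, t) contributes an extra training sample (t, r_inv, h), which is what makes the resulting training graph symmetric:

```python
# Minimal sketch of inverse-relation augmentation for KGC training data.
# The "inverse of <r>" naming scheme is an illustrative assumption, not the
# paper's exact implementation.

def add_inverse_triples(triples):
    """Return the original (head, relation, tail) triples plus one inverse
    triple (tail, inverse-relation, head) for each of them."""
    augmented = list(triples)
    for head, rel, tail in triples:
        augmented.append((tail, f"inverse of {rel}", head))
    return augmented

train = [
    ("hypertension", "hypernym", "cardiovascular_disease"),
    ("oak_tree", "member_meronym", "acorn"),
]
print(add_inverse_triples(train))  # 2 original + 2 inverse triples
```

Each inverse triple gives the model a second view of the same fact, effectively doubling the training signal available for tail prediction.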
Related papers
- MUSE: Integrating Multi-Knowledge for Knowledge Graph Completion [0.0]
Knowledge Graph Completion (KGC) aims to predict the missing part of a (head entity)-[relation]->(tail entity) triplet.
Most existing KGC methods focus on single features (e.g., relation types) or sub-graph aggregation.
We propose MUSE, a knowledge-aware reasoning model that introduces a novel multi-knowledge representation learning mechanism for missing relation prediction.
arXiv Detail & Related papers (2024-09-26T04:48:20Z)
- Enhancing Heterogeneous Knowledge Graph Completion with a Novel GAT-based Approach [3.8357926394952306]
We propose GATH, a novel GAT-based knowledge graph completion method for heterogeneous knowledge graphs.
GATH incorporates two separate attention network modules that work synergistically to predict the missing entities.
Our model improves performance by 5.2% and 5.2% on the FB15K-237 dataset and by 4.5% and 14.6% on the WN18RR dataset.
arXiv Detail & Related papers (2024-08-05T13:28:51Z)
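As a rough illustration of the graph-attention machinery that GAT-based methods like this build on, here is a generic single-head GAT-style layer in NumPy (a sketch of the standard mechanism, not GATH's actual attention modules):

```python
import numpy as np

def gat_layer(H, A, W, a):
    """One single-head GAT-style layer (generic sketch, not GATH itself).
    H: (N, F) node features, A: (N, N) adjacency (1 = edge, incl. self-loops),
    W: (F, F') projection matrix, a: (2*F',) attention vector."""
    Z = H @ W                                   # project node features
    N = Z.shape[0]
    # e[i, j] = LeakyReLU(a^T [z_i || z_j]) for every node pair
    e = np.array([[np.concatenate([Z[i], Z[j]]) @ a for j in range(N)]
                  for i in range(N)])
    e = np.where(e > 0, e, 0.2 * e)             # LeakyReLU
    e = np.where(A > 0, e, -np.inf)             # mask out non-neighbours
    alpha = np.exp(e - e.max(axis=1, keepdims=True))
    alpha = alpha / alpha.sum(axis=1, keepdims=True)  # softmax over neighbours
    return alpha @ Z                            # attention-weighted aggregation

# Toy usage: 3 nodes, 4 input features, 2 output features.
rng = np.random.default_rng(0)
H = rng.normal(size=(3, 4))
A = np.ones((3, 3))                             # fully connected, self-loops
print(gat_layer(H, A, rng.normal(size=(4, 2)), rng.normal(size=4)).shape)  # (3, 2)
```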
- Challenging the Myth of Graph Collaborative Filtering: a Reasoned and Reproducibility-driven Analysis [50.972595036856035]
We present code that successfully replicates results from six popular and recent graph recommendation models.
We compare these graph models with traditional collaborative filtering models that historically performed well in offline evaluations.
By investigating the information flow from users' neighborhoods, we aim to identify which models are influenced by intrinsic features in the dataset structure.
arXiv Detail & Related papers (2023-08-01T09:31:44Z)
- Knowledge Graph Refinement based on Triplet BERT-Networks [0.0]
This paper adopts a transformer-based triplet network, GilBERT, which creates an embedding space that clusters the information about an entity or relation in the knowledge graph.
It creates textual sequences from facts and fine-tunes a triplet network of pre-trained transformer-based language models.
We show that GilBERT achieves results better than or comparable to the state of the art on both refinement tasks considered.
arXiv Detail & Related papers (2022-11-18T19:01:21Z)
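The triplet-network objective behind such refinement models is easy to sketch: an anchor embedding is pulled toward a positive example and pushed away from a negative one. The snippet below shows a generic triplet margin loss over batch embeddings (an illustrative sketch, not GilBERT's exact encoder or hyperparameters):

```python
import torch
import torch.nn.functional as F

def triplet_loss(anchor, positive, negative, margin=1.0):
    """Generic triplet margin loss: keep the anchor at least `margin` closer
    to the positive than to the negative (not GilBERT's exact setup)."""
    d_pos = F.pairwise_distance(anchor, positive)
    d_neg = F.pairwise_distance(anchor, negative)
    return F.relu(d_pos - d_neg + margin).mean()

# Toy usage with random tensors standing in for transformer text embeddings.
a, p, n = (torch.randn(8, 128) for _ in range(3))
print(float(triplet_loss(a, p, n)))
```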
- GraphCoCo: Graph Complementary Contrastive Learning [65.89743197355722]
Graph Contrastive Learning (GCL) has shown promising performance in graph representation learning (GRL) without the supervision of manual annotations.
This paper proposes an effective graph complementary contrastive learning approach named GraphCoCo.
arXiv Detail & Related papers (2022-03-24T02:58:36Z)
- A Simple But Powerful Graph Encoder for Temporal Knowledge Graph Completion [13.047205680129094]
We propose TARGCN, a simple but powerful graph encoder for temporal knowledge graphs (TKGs).
Our model achieves a more than 42% relative improvement on the GDELT dataset compared with the state-of-the-art model.
It outperforms the strongest baseline on the ICEWS05-15 dataset with around 18.5% fewer parameters.
arXiv Detail & Related papers (2021-12-14T23:30:42Z)
- KGE-CL: Contrastive Learning of Knowledge Graph Embeddings [64.67579344758214]
We propose a simple yet efficient contrastive learning framework for knowledge graph embeddings.
It shortens the semantic distance between related entities and entity-relation couples that appear in different triples.
It yields new state-of-the-art results, achieving 51.2% MRR and 46.8% Hits@1 on the WN18RR dataset, and 59.1% MRR and 51.8% Hits@1 on the YAGO3-10 dataset.
arXiv Detail & Related papers (2021-12-09T12:45:33Z)
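The kind of contrastive objective such KG embedding frameworks build on can be sketched with a generic InfoNCE loss, where the embedding of a (head, relation) couple is matched against the embedding of its tail entity (an illustrative sketch, not KGE-CL's exact loss):

```python
import torch
import torch.nn.functional as F

def info_nce(queries, keys, temperature=0.05):
    """Generic InfoNCE: the i-th query should match the i-th key and be
    pushed away from all other keys in the batch (a sketch of the kind of
    contrastive objective KGE-CL builds on, not its exact loss)."""
    q = F.normalize(queries, dim=-1)
    k = F.normalize(keys, dim=-1)
    logits = q @ k.t() / temperature          # (B, B) similarity matrix
    labels = torch.arange(q.size(0))          # positives lie on the diagonal
    return F.cross_entropy(logits, labels)

# E.g., queries = embeddings of (head, relation) couples,
#       keys    = embeddings of the corresponding tail entities.
print(float(info_nce(torch.randn(16, 200), torch.randn(16, 200))))
```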
- Investigating Pretrained Language Models for Graph-to-Text Generation [55.55151069694146]
Graph-to-text generation aims to generate fluent texts from graph-based data.
We present a study across three graph domains: meaning representations, Wikipedia knowledge graphs (KGs) and scientific KGs.
We show that the PLMs BART and T5 achieve new state-of-the-art results and that task-adaptive pretraining strategies improve their performance even further.
arXiv Detail & Related papers (2020-07-16T16:05:34Z)
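In practice, PLM-based graph-to-text boils down to linearising the graph into a string and fine-tuning a seq2seq model on such inputs. The snippet below sketches the inference side with Hugging Face's T5; the "<H>/<R>/<T>" linearisation and the task prefix are common conventions used here as assumptions, and a model actually fine-tuned on the task would be needed for sensible output:

```python
# Sketch of PLM-based graph-to-text: linearise the graph's triples into a
# string and let a seq2seq PLM generate text from it.
from transformers import T5ForConditionalGeneration, T5Tokenizer

tokenizer = T5Tokenizer.from_pretrained("t5-small")
model = T5ForConditionalGeneration.from_pretrained("t5-small")

# One <H>/<R>/<T> segment per triple (illustrative linearisation convention).
graph = ("<H> KERMIT <R> evaluated on <T> WN18RR "
         "<H> KERMIT <R> evaluated on <T> FB15k-237")
inputs = tokenizer("translate Graph to Text: " + graph, return_tensors="pt")
ids = model.generate(**inputs, max_length=64)
print(tokenizer.decode(ids[0], skip_special_tokens=True))
```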
- Heuristic Semi-Supervised Learning for Graph Generation Inspired by Electoral College [80.67842220664231]
We propose a novel pre-processing technique, namely ELectoral COllege (ELCO), which automatically expands new nodes and edges to refine the label similarity within a dense subgraph.
In all tested setups, our method boosts the average score of base models by a large margin of 4.7 points and consistently outperforms the state of the art.
arXiv Detail & Related papers (2020-06-10T14:48:48Z)
- Structure-Augmented Text Representation Learning for Efficient Knowledge Graph Completion [53.31911669146451]
Human-curated knowledge graphs provide critical supportive information to various natural language processing tasks.
These graphs are usually incomplete, which motivates their automatic completion.
Graph embedding approaches, e.g., TransE, learn structured knowledge by representing graph elements as dense embeddings.
Textual encoding approaches, e.g., KG-BERT, resort to the text of graph triples and triple-level contextualized representations.
arXiv Detail & Related papers (2020-04-30T13:50:34Z)
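The TransE scoring just mentioned is simple enough to state concretely: a triple (h, r, t) is considered plausible when h + r lands near t in embedding space. A minimal sketch:

```python
import numpy as np

def transe_score(h, r, t):
    """TransE plausibility score: negative L2 distance ||h + r - t||.
    Higher (closer to zero) means a more plausible triple."""
    return -np.linalg.norm(h + r - t)

rng = np.random.default_rng(0)
h, r = rng.normal(size=50), rng.normal(size=50)
print(transe_score(h, r, h + r))                # near-perfect triple: ~0.0
print(transe_score(h, r, rng.normal(size=50)))  # random tail: much lower
```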