MUSE: Integrating Multi-Knowledge for Knowledge Graph Completion
- URL: http://arxiv.org/abs/2409.17536v1
- Date: Thu, 26 Sep 2024 04:48:20 GMT
- Title: MUSE: Integrating Multi-Knowledge for Knowledge Graph Completion
- Authors: Pengjie Liu
- Abstract summary: Knowledge Graph Completion (KGC) aims to predict the missing [relation] part of the (head entity)--[relation]->(tail entity) triplet.
Most existing KGC methods focus on single features (e.g., relation types) or sub-graph aggregation.
We propose a knowledge-aware reasoning model (MUSE) which designs a novel multi-knowledge representation learning mechanism for missing relation prediction.
- Score: 0.0
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Knowledge Graph Completion (KGC) aims to predict the missing [relation] part of the (head entity)--[relation]->(tail entity) triplet. Most existing KGC methods focus on single features (e.g., relation types) or sub-graph aggregation. However, they do not fully exploit the Knowledge Graph (KG) features and neglect the guidance of external semantic knowledge. To address these shortcomings, we propose a knowledge-aware reasoning model (MUSE), which designs a novel multi-knowledge representation learning mechanism for missing relation prediction. Our model develops a tailored embedding space through three parallel components: 1) Prior Knowledge Learning for enhancing the triplets' semantic representation by fine-tuning BERT; 2) Context Message Passing for enhancing the context messages of the KG; 3) Relational Path Aggregation for enhancing the path representation from the head entity to the tail entity. The experimental results show that MUSE significantly outperforms other baselines on four public datasets, achieving over 5.50% H@1 improvement and 4.20% MRR improvement on the NELL995 dataset. The code and datasets will be released via https://github.com/SUSTech-TP/ADMA2024-MUSE.git.
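As a rough illustration of the three-branch design described above, the sketch below fuses three relation-score signals (a BERT-style prior-knowledge score, a context score, and a path score) with learned weights. It is a minimal sketch, not the released MUSE implementation; all module names, dimensions, and the weighted-sum fusion rule are assumptions.

```python
# Minimal sketch (not the authors' code): fusing three relation scores,
# mirroring the three parallel components described in the abstract.
# All names, dimensions, and the fusion rule are illustrative assumptions.
import torch
import torch.nn as nn

class ToyMultiKnowledgeScorer(nn.Module):
    def __init__(self, num_relations: int, dim: int = 64):
        super().__init__()
        # 1) "Prior knowledge" branch: stands in for a fine-tuned BERT encoding
        #    of the textual triple; here just a learned projection of a feature vector.
        self.prior_head = nn.Linear(dim, num_relations)
        # 2) "Context" branch: stands in for message passing over the KG neighbourhood.
        self.context_head = nn.Linear(dim, num_relations)
        # 3) "Path" branch: stands in for aggregated head-to-tail relational paths.
        self.path_head = nn.Linear(dim, num_relations)
        # Learnable fusion weights over the three branches.
        self.fusion = nn.Parameter(torch.ones(3))

    def forward(self, prior_feat, context_feat, path_feat):
        scores = torch.stack([
            self.prior_head(prior_feat),
            self.context_head(context_feat),
            self.path_head(path_feat),
        ], dim=0)                                  # (3, batch, num_relations)
        w = torch.softmax(self.fusion, dim=0)      # normalise branch weights
        return (w[:, None, None] * scores).sum(0)  # fused relation logits

if __name__ == "__main__":
    model = ToyMultiKnowledgeScorer(num_relations=10)
    feats = [torch.randn(4, 64) for _ in range(3)]    # fake branch features
    logits = model(*feats)                            # (4, 10)
    print(logits.argmax(dim=-1))                      # predicted relation ids
```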
Related papers
- Efficient Relational Context Perception for Knowledge Graph Completion [25.903926643251076]
Knowledge Graphs (KGs) provide a structured representation of knowledge but often suffer from challenges of incompleteness.
Previous knowledge graph embedding models are limited in their ability to capture expressive features.
We propose a Triple Receptance Perception architecture to model sequential information, enabling the learning of dynamic contexts.
arXiv Detail & Related papers (2024-12-31T11:25:58Z)
- A Contextualized BERT model for Knowledge Graph Completion [0.0]
We introduce a contextualized BERT model for Knowledge Graph Completion (KGC).
Our model eliminates the need for entity descriptions and negative triplet sampling, reducing computational demands while improving performance.
Our model outperforms state-of-the-art methods on standard datasets, improving Hit@1 by 5.3% and 4.88% on FB15k-237 and WN18RR respectively.
arXiv Detail & Related papers (2024-12-15T02:03:16Z)
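A generic KG-BERT-style sketch of contextualised triple scoring is shown below for orientation only; it is not this paper's architecture (which avoids negative triplet sampling), and the base model, prompt format, and scoring head are assumptions.

```python
# Generic sketch of BERT-based triple scoring (KG-BERT style), shown only to
# illustrate contextualised triple encoding; not this paper's specific model.
import torch
from transformers import BertModel, BertTokenizer

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
encoder = BertModel.from_pretrained("bert-base-uncased")
score_head = torch.nn.Linear(encoder.config.hidden_size, 1)  # plausibility score (untrained here)

def score_triple(head: str, relation: str, tail: str) -> float:
    text = f"{head} [SEP] {relation.replace('_', ' ')} [SEP] {tail}"
    inputs = tokenizer(text, return_tensors="pt")
    with torch.no_grad():
        cls_vec = encoder(**inputs).last_hidden_state[:, 0]  # [CLS] representation
    return score_head(cls_vec).item()

print(score_triple("Paris", "capital_of", "France"))  # arbitrary until the head is trained
```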
- MUSE: Multi-Knowledge Passing on the Edges, Boosting Knowledge Graph Completion [0.0]
Knowledge Graph Completion aims to predict the missing information in the (head entity)-[relation]-(tail entity) triplet.
We propose MUSE, a knowledge-aware reasoning model to learn a tailored embedding space in three dimensions for missing relation prediction.
Our experimental results show that MUSE significantly outperforms other baselines on four public datasets.
arXiv Detail & Related papers (2024-08-09T18:10:02Z)
- KERMIT: Knowledge Graph Completion of Enhanced Relation Modeling with Inverse Transformation [19.31783654838732]
We use large language models to generate coherent descriptions, bridging the semantic gap between queries and answers.
We also utilize inverse relations to create a symmetric graph, thereby providing augmented training samples for KGC.
Our approach achieves a 4.2% improvement in Hit@1 on WN18RR and a 3.4% improvement in Hit@3 on FB15k-237, demonstrating superior performance.
arXiv Detail & Related papers (2023-09-26T09:03:25Z)
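KERMIT's inverse-relation idea is straightforward to illustrate: each (head, relation, tail) triple contributes an additional reversed training sample. A minimal sketch, assuming a simple naming convention for inverse relations (not KERMIT's implementation):

```python
# Illustrative sketch of inverse-relation augmentation (not KERMIT's code):
# every (head, relation, tail) triple also yields (tail, relation^-1, head).
from typing import List, Tuple

Triple = Tuple[str, str, str]

def add_inverse_triples(triples: List[Triple], inv_suffix: str = "_inverse") -> List[Triple]:
    augmented = list(triples)
    for head, rel, tail in triples:
        augmented.append((tail, rel + inv_suffix, head))  # reversed direction
    return augmented

if __name__ == "__main__":
    train = [("Paris", "capital_of", "France"),
             ("France", "borders", "Spain")]
    for t in add_inverse_triples(train):
        print(t)
    # ('Paris', 'capital_of', 'France')
    # ('France', 'borders', 'Spain')
    # ('France', 'capital_of_inverse', 'Paris')
    # ('Spain', 'borders_inverse', 'France')
```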
- Normalizing Flow-based Neural Process for Few-Shot Knowledge Graph Completion [69.55700751102376]
Few-shot knowledge graph completion (FKGC) aims to predict missing facts for unseen relations with few-shot associated facts.
Existing FKGC methods are based on metric learning or meta-learning, which often suffer from the out-of-distribution and overfitting problems.
In this paper, we propose a normalizing flow-based neural process for few-shot knowledge graph completion (NP-FKGC).
arXiv Detail & Related papers (2023-04-17T11:42:28Z)
- Good Visual Guidance Makes A Better Extractor: Hierarchical Visual Prefix for Multimodal Entity and Relation Extraction [88.6585431949086]
We propose a novel Hierarchical Visual Prefix fusion NeTwork (HVPNeT) for visual-enhanced entity and relation extraction.
We regard the visual representation as a pluggable visual prefix that guides the textual representation toward error-insensitive forecasting decisions.
Experiments on three benchmark datasets demonstrate the effectiveness of our method, which achieves state-of-the-art performance.
arXiv Detail & Related papers (2022-05-07T02:10:55Z)
- From Discrimination to Generation: Knowledge Graph Completion with Generative Transformer [41.69537736842654]
We present GenKGC, an approach that converts knowledge graph completion into a sequence-to-sequence generation task with a pre-trained language model.
We introduce relation-guided demonstration and entity-aware hierarchical decoding for better representation learning and fast inference.
We also release AliopenKG500, a new large-scale Chinese knowledge graph dataset, for research purposes.
arXiv Detail & Related papers (2022-02-04T12:52:32Z)
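The generative formulation used by GenKGC can be sketched as serialising the query (head, relation, ?) into text and letting a sequence-to-sequence model generate the tail entity. The example below shows only that formulation, not GenKGC's relation-guided demonstrations or entity-aware hierarchical decoding; the base model and prompt format are assumptions.

```python
# Rough sketch of the generative KGC formulation (not GenKGC's actual pipeline):
# serialise the query (head, relation, ?) as text and let a seq2seq model
# generate the tail entity. Prompt format and base model are assumptions.
from transformers import AutoModelForSeq2SeqLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("t5-small")
model = AutoModelForSeq2SeqLM.from_pretrained("t5-small")

def generate_tail(head: str, relation: str, max_new_tokens: int = 8) -> str:
    prompt = f"predict tail: {head} | {relation.replace('_', ' ')}"
    inputs = tokenizer(prompt, return_tensors="pt")
    output_ids = model.generate(**inputs, max_new_tokens=max_new_tokens)
    return tokenizer.decode(output_ids[0], skip_special_tokens=True)

print(generate_tail("Barack Obama", "born_in"))  # an untrained t5-small will not answer well;
                                                 # in practice the model is fine-tuned on KG triples.
```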
- KGE-CL: Contrastive Learning of Knowledge Graph Embeddings [64.67579344758214]
We propose a simple yet efficient contrastive learning framework for knowledge graph embeddings.
It shortens the semantic distance between related entities and entity-relation pairs across different triples.
It can yield some new state-of-the-art results, achieving 51.2% MRR, 46.8% Hits@1 on the WN18RR dataset, and 59.1% MRR, 51.8% Hits@1 on the YAGO3-10 dataset.
arXiv Detail & Related papers (2021-12-09T12:45:33Z)
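The contrastive objective behind KGE-CL can be approximated with a standard InfoNCE-style loss over entity-relation couples; the sketch below is a generic illustration, not the paper's exact loss or sampling scheme.

```python
# Generic contrastive-learning sketch for KG embeddings (not KGE-CL's exact loss):
# pull an (entity, relation) couple towards a related entity embedding and
# push it away from in-batch negatives, using an InfoNCE-style objective.
import torch
import torch.nn.functional as F

def info_nce(anchor: torch.Tensor, positive: torch.Tensor, temperature: float = 0.1) -> torch.Tensor:
    """anchor, positive: (batch, dim); other rows in the batch act as negatives."""
    anchor = F.normalize(anchor, dim=-1)
    positive = F.normalize(positive, dim=-1)
    logits = anchor @ positive.t() / temperature   # (batch, batch) similarity matrix
    targets = torch.arange(anchor.size(0))         # i-th anchor matches i-th positive
    return F.cross_entropy(logits, targets)

if __name__ == "__main__":
    batch, dim = 8, 64
    head_plus_rel = torch.randn(batch, dim, requires_grad=True)  # e.g. (head, relation) couple embedding
    tail = torch.randn(batch, dim)                               # related tail-entity embeddings
    loss = info_nce(head_plus_rel, tail)
    loss.backward()
    print(float(loss))
```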
- Learning Intents behind Interactions with Knowledge Graph for Recommendation [93.08709357435991]
Knowledge graph (KG) plays an increasingly important role in recommender systems.
Existing GNN-based models fail to identify user-item relations at the fine-grained level of intents.
We propose a new model, Knowledge Graph-based Intent Network (KGIN).
arXiv Detail & Related papers (2021-02-14T03:21:36Z)
- KACC: A Multi-task Benchmark for Knowledge Abstraction, Concretization and Completion [99.47414073164656]
A comprehensive knowledge graph (KG) contains an instance-level entity graph and an ontology-level concept graph.
The two-view KG provides a testbed for models to "simulate" human abilities in knowledge abstraction, concretization, and completion.
We propose a unified KG benchmark by improving existing benchmarks in terms of dataset scale, task coverage, and difficulty.
arXiv Detail & Related papers (2020-04-28T16:21:57Z)
- Relational Message Passing for Knowledge Graph Completion [78.47976646383222]
We propose a relational message passing method for knowledge graph completion.
It passes relational messages among edges iteratively to aggregate neighborhood information.
Results show our method outperforms state-of-the-art knowledge graph completion methods by a large margin.
arXiv Detail & Related papers (2020-02-17T03:33:41Z)
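The edge-level message-passing idea can be sketched as follows: edges that share an endpoint exchange their relation features over a few iterations. This is a toy illustration, not the paper's implementation; the mean aggregation and one-hot relation features are assumptions.

```python
# Toy sketch of relational message passing (not the paper's implementation):
# edges that share an endpoint exchange messages, so each edge aggregates the
# relation information in its neighbourhood over several iterations.
from collections import defaultdict
from typing import Dict, List, Tuple

Edge = Tuple[str, str, str]  # (head, relation, tail)

def edge_message_passing(edges: List[Edge], num_relations: int, iters: int = 2) -> Dict[Edge, List[float]]:
    rel_index: Dict[str, int] = {}                       # relation name -> one-hot index
    state: Dict[Edge, List[float]] = {}                  # per-edge feature vector
    incident: Dict[str, List[Edge]] = defaultdict(list)  # node -> incident edges
    for h, r, t in edges:
        idx = rel_index.setdefault(r, len(rel_index))
        vec = [0.0] * num_relations
        vec[idx] = 1.0
        state[(h, r, t)] = vec        # initial edge state: one-hot relation type
        incident[h].append((h, r, t))
        incident[t].append((h, r, t))
    for _ in range(iters):
        new_state = {}
        for edge in edges:
            h, _, t = edge
            neighbours = [e for e in incident[h] + incident[t] if e != edge]
            msgs = [state[e] for e in neighbours] + [state[edge]]
            new_state[edge] = [sum(col) / len(msgs) for col in zip(*msgs)]  # mean aggregation
        state = new_state
    return state

if __name__ == "__main__":
    graph = [("A", "friend_of", "B"), ("B", "works_at", "C"), ("A", "lives_in", "D")]
    for edge, vec in edge_message_passing(graph, num_relations=3).items():
        print(edge, [round(x, 2) for x in vec])
```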