Collective Knowledge Graph Completion with Mutual Knowledge Distillation
- URL: http://arxiv.org/abs/2305.15895v1
- Date: Thu, 25 May 2023 09:49:40 GMT
- Title: Collective Knowledge Graph Completion with Mutual Knowledge Distillation
- Authors: Weihang Zhang, Ovidiu Serban, Jiahao Sun, Yi-ke Guo
- Abstract summary: We study the problem of multi-KG completion, where we focus on maximizing the collective knowledge from different KGs.
We propose a novel method called CKGC-CKD that uses relation-aware graph convolutional network encoder models on both individual KGs and a large fused KG.
Experimental results on multilingual datasets have shown that our method outperforms all state-of-the-art models in the KGC task.
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Knowledge graph completion (KGC), the task of predicting missing information
based on the existing relational data inside a knowledge graph (KG), has drawn
significant attention in recent years. However, the predictive power of KGC
methods is often limited by the completeness of the existing knowledge graphs
from different sources and languages. In monolingual and multilingual settings,
KGs are potentially complementary to each other. In this paper, we study the
problem of multi-KG completion, where we focus on maximizing the collective
knowledge from different KGs to alleviate the incompleteness of individual KGs.
Specifically, we propose a novel method called CKGC-CKD that uses
relation-aware graph convolutional network encoder models on both individual
KGs and a large fused KG in which seed alignments between KGs are regarded as
edges for message propagation. An additional mutual knowledge distillation
mechanism is also employed to maximize the knowledge transfer between the
models of "global" fused KG and the "local" individual KGs. Experimental
results on multilingual datasets have shown that our method outperforms all
state-of-the-art models in the KGC task.
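The mutual knowledge distillation mechanism described above, in which the "global" fused-KG model and a "local" individual-KG model each learn from the other's predictions, can be sketched as a symmetric KL term over softened candidate-entity score distributions. This is a minimal illustrative sketch, not the paper's implementation: the function names and temperature are assumptions, and the relation-aware GCN encoders that produce the scores are not reproduced here.

```python
# Hypothetical sketch of mutual knowledge distillation between a "global"
# model (trained on the fused KG) and a "local" model (trained on one
# individual KG). Each model is pulled toward the other's softened
# prediction over candidate entities, so knowledge flows in both directions.
import numpy as np

def softmax(scores, temperature=1.0):
    """Convert raw triple-scoring logits into a distribution over candidates."""
    z = scores / temperature
    z = z - z.max()               # subtract max for numerical stability
    e = np.exp(z)
    return e / e.sum()

def kl_divergence(p, q, eps=1e-12):
    """KL(p || q) between two categorical distributions."""
    return float(np.sum(p * (np.log(p + eps) - np.log(q + eps))))

def mutual_distillation_loss(global_logits, local_logits, temperature=2.0):
    """Symmetric distillation loss: KL in both directions, so each model
    acts as teacher for the other."""
    p_global = softmax(global_logits, temperature)
    p_local = softmax(local_logits, temperature)
    return kl_divergence(p_global, p_local) + kl_divergence(p_local, p_global)

# Toy example: scores for 5 candidate tail entities from each model.
g = np.array([2.0, 0.5, 0.1, -1.0, 0.3])
l = np.array([1.5, 0.8, 0.0, -0.5, 0.2])
loss = mutual_distillation_loss(g, l)
```

In practice this term would be added to each model's own KGC training loss; when the two models agree exactly, the distillation term vanishes.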
Related papers
- Distill-SynthKG: Distilling Knowledge Graph Synthesis Workflow for Improved Coverage and Efficiency [59.6772484292295]
Knowledge graphs (KGs) generated by large language models (LLMs) are increasingly valuable for Retrieval-Augmented Generation (RAG) applications.
Existing KG extraction methods rely on prompt-based approaches, which are inefficient for processing large-scale corpora.
We propose SynthKG, a multi-step, document-level synthesis KG workflow based on LLMs.
We also design a novel graph-based retrieval framework for RAG.
arXiv Detail & Related papers (2024-10-22T00:47:54Z)
- KG-FIT: Knowledge Graph Fine-Tuning Upon Open-World Knowledge [63.19837262782962]
Knowledge Graph Embedding (KGE) techniques are crucial in learning compact representations of entities and relations within a knowledge graph.
This study introduces KG-FIT, which builds a semantically coherent hierarchical structure of entity clusters.
Experiments on the benchmark datasets FB15K-237, YAGO3-10, and PrimeKG demonstrate the superiority of KG-FIT over state-of-the-art pre-trained language model-based methods.
arXiv Detail & Related papers (2024-05-26T03:04:26Z)
- Generate-on-Graph: Treat LLM as both Agent and KG in Incomplete Knowledge Graph Question Answering [87.67177556994525]
We propose a training-free method called Generate-on-Graph (GoG) to generate new factual triples while exploring Knowledge Graphs (KGs).
GoG performs reasoning through a Thinking-Searching-Generating framework, which treats the LLM as both agent and KG in incomplete KG question answering (IKGQA).
arXiv Detail & Related papers (2024-04-23T04:47:22Z)
- FedMKGC: Privacy-Preserving Federated Multilingual Knowledge Graph Completion [21.4302940596294]
Knowledge graph completion (KGC) aims to predict missing facts in knowledge graphs (KGs).
Previous methods that rely on transferring raw data among KGs raise privacy concerns.
We propose a new federated learning framework that implicitly aggregates knowledge from multiple KGs without requiring raw data exchange or entity alignment.
arXiv Detail & Related papers (2023-12-17T08:09:27Z)
- A Survey On Few-shot Knowledge Graph Completion with Structural and Commonsense Knowledge [3.4012007729454807]
Few-shot KG completion (FKGC) requires the strengths of graph representation learning and few-shot learning.
This paper introduces FKGC challenges, commonly used KGs, and CKGs.
We then systematically categorize and summarize existing works in terms of the type of KGs and the methods.
arXiv Detail & Related papers (2023-01-03T16:00:09Z)
- Collaborative Knowledge Graph Fusion by Exploiting the Open Corpus [59.20235923987045]
It is challenging to enrich a Knowledge Graph with newly harvested triples while maintaining the quality of the knowledge representation.
This paper proposes a system to refine a KG using information harvested from an additional corpus.
arXiv Detail & Related papers (2022-06-15T12:16:10Z)
- Towards Robust Knowledge Graph Embedding via Multi-task Reinforcement Learning [44.38215560989223]
Most existing knowledge graph embedding methods assume that all the triple facts in KGs are correct.
This will lead to low-quality and unreliable representations of KGs.
We propose a general multi-task reinforcement learning framework, which can greatly alleviate the noisy data problem.
arXiv Detail & Related papers (2021-11-11T08:51:37Z)
- FedE: Embedding Knowledge Graphs in Federated Setting [21.022513922373207]
Multi-source KGs are common in real-world Knowledge Graph applications.
Because of data privacy and sensitivity, a set of related knowledge graphs cannot complement each other's KGC simply by pooling their data.
We propose a Federated Knowledge Graph Embedding framework FedE, focusing on learning knowledge graph embeddings by aggregating locally-computed updates.
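FedE's central idea, aggregating locally-computed entity embeddings on a server without exchanging raw triples, can be sketched as a masked average over the clients that actually hold each entity. This is a rough sketch under that reading of the abstract; the function and variable names are illustrative and do not reflect FedE's actual API.

```python
# Rough sketch of server-side aggregation in the spirit of FedE: each client
# KG trains entity embeddings locally, and the server averages embeddings
# only over the clients whose KG contains each entity. Raw triples never
# leave the clients.
import numpy as np

def aggregate_entity_embeddings(client_embeddings, client_masks):
    """client_embeddings: list of (num_entities, dim) arrays, one per client.
    client_masks: list of (num_entities,) 0/1 arrays marking which entities
    exist in each client's KG. Returns the element-wise average over the
    clients that hold each entity (zeros where no client holds it)."""
    emb = np.stack(client_embeddings)          # (clients, entities, dim)
    mask = np.stack(client_masks)[..., None]   # (clients, entities, 1)
    counts = mask.sum(axis=0)                  # clients holding each entity
    summed = (emb * mask).sum(axis=0)
    return np.divide(summed, counts, out=np.zeros_like(summed),
                     where=counts > 0)

# Two clients, three entities, 2-d embeddings; entity 2 exists only on client 0.
e0 = np.array([[1.0, 1.0], [2.0, 0.0], [4.0, 4.0]])
e1 = np.array([[3.0, 1.0], [0.0, 2.0], [0.0, 0.0]])
m0 = np.array([1, 1, 1])
m1 = np.array([1, 1, 0])
avg = aggregate_entity_embeddings([e0, e1], [m0, m1])
```

The masks matter: averaging over clients that lack an entity would drag its embedding toward zero, so shared entities are averaged while client-specific ones pass through unchanged.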
arXiv Detail & Related papers (2020-10-24T11:52:05Z)
- Multilingual Knowledge Graph Completion via Ensemble Knowledge Transfer [43.453915033312114]
Predicting missing facts in a knowledge graph (KG) is a crucial task in knowledge base construction and reasoning.
We propose KEnS, a novel framework for embedding learning and ensemble knowledge transfer across a number of language-specific KGs.
Experiments on five real-world language-specific KGs show that KEnS consistently improves state-of-the-art methods on KG completion.
arXiv Detail & Related papers (2020-10-07T04:54:03Z)
- On the Role of Conceptualization in Commonsense Knowledge Graph Construction [59.39512925793171]
Commonsense knowledge graphs (CKGs) like Atomic and ASER are substantially different from conventional KGs.
We introduce conceptualization into CKG construction, viewing entities mentioned in text as instances of specific concepts, or vice versa.
Our methods can effectively identify plausible triples and expand the KG with triples involving both new nodes and new edges of high diversity and novelty.
arXiv Detail & Related papers (2020-03-06T14:35:20Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.