Standing on the Shoulders of Predecessors: Meta-Knowledge Transfer for
Knowledge Graphs
- URL: http://arxiv.org/abs/2110.14170v1
- Date: Wed, 27 Oct 2021 04:57:16 GMT
- Title: Standing on the Shoulders of Predecessors: Meta-Knowledge Transfer for
Knowledge Graphs
- Authors: Mingyang Chen, Wen Zhang, Yushan Zhu, Hongting Zhou, Zonggang Yuan,
Changliang Xu, Huajun Chen
- Abstract summary: We call such knowledge meta-knowledge, and refer to the problem of transferring meta-knowledge from constructed (source) KGs to new (target) KGs as meta-knowledge transfer for knowledge graphs.
MorsE represents the meta-knowledge via Knowledge Graph Embedding and learns the meta-knowledge by Meta-Learning.
MorsE is able to learn and transfer meta-knowledge between KGs effectively, and outperforms existing state-of-the-art models.
- Score: 8.815143812846392
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Knowledge graphs (KGs) have become widespread, and various knowledge graphs
are constructed incessantly to support many in-KG and out-of-KG applications.
During the construction of KGs, although new KGs may contain new entities with
respect to constructed KGs, some entity-independent knowledge can be
transferred from constructed KGs to new KGs. We call such knowledge
meta-knowledge, and refer to the problem of transferring meta-knowledge from
constructed (source) KGs to new (target) KGs to improve the performance of
tasks on target KGs as meta-knowledge transfer for knowledge graphs. However,
there is no available general framework that can tackle meta-knowledge transfer
for both in-KG and out-of-KG tasks uniformly. Therefore, in this paper, we
propose a framework, MorsE, which means conducting Meta-Learning for
Meta-Knowledge Transfer via Knowledge Graph Embedding. MorsE represents the
meta-knowledge via Knowledge Graph Embedding and learns the meta-knowledge by
Meta-Learning. Specifically, MorsE uses an entity initializer and a Graph
Neural Network (GNN) modulator to obtain entity embeddings for a given KG in an
entity-independent way, and it is trained in a meta-learning setting to gain the
ability to produce such embeddings effectively. Experimental results on
meta-knowledge transfer for both in-KG and out-of-KG tasks show that MorsE is
able to learn and transfer meta-knowledge between KGs effectively, and
outperforms existing state-of-the-art models.
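For intuition, the following is a minimal sketch of the entity-independent embedding idea, assuming a mean-pooling initializer and a single simplified message-passing step; the module names, shapes, and GNN form are illustrative assumptions, not the authors' code.

```python
# Sketch (not MorsE's actual implementation): entities receive no learned
# per-entity vectors. An entity initializer derives each embedding from the
# relations of its incident triples, and a GNN modulator refines it.
import torch
import torch.nn as nn

class EntityInitializer(nn.Module):
    """Derive entity embeddings from relation vectors of incident triples only."""
    def __init__(self, num_relations: int, dim: int):
        super().__init__()
        self.rel_head = nn.Embedding(num_relations, dim)  # entity-as-head vectors
        self.rel_tail = nn.Embedding(num_relations, dim)  # entity-as-tail vectors

    def forward(self, triples: torch.Tensor, num_entities: int) -> torch.Tensor:
        h, r, t = triples[:, 0], triples[:, 1], triples[:, 2]
        emb = torch.zeros(num_entities, self.rel_head.embedding_dim)
        cnt = torch.zeros(num_entities, 1)
        emb.index_add_(0, h, self.rel_head(r))    # aggregate head-role vectors
        emb.index_add_(0, t, self.rel_tail(r))    # aggregate tail-role vectors
        ones = torch.ones(len(h), 1)
        cnt.index_add_(0, h, ones)
        cnt.index_add_(0, t, ones)
        return emb / cnt.clamp(min=1)             # mean over incident triples

class GNNModulator(nn.Module):
    """One mean-aggregation message-passing step refining initial embeddings."""
    def __init__(self, dim: int):
        super().__init__()
        self.msg = nn.Linear(2 * dim, dim)

    def forward(self, emb: torch.Tensor, triples: torch.Tensor) -> torch.Tensor:
        h, t = triples[:, 0], triples[:, 2]
        agg = torch.zeros_like(emb)
        cnt = torch.zeros(emb.size(0), 1)
        m = self.msg(torch.cat([emb[h], emb[t]], dim=-1))  # head-to-tail messages
        agg.index_add_(0, t, m)
        cnt.index_add_(0, t, torch.ones(len(t), 1))
        return torch.relu(emb + agg / cnt.clamp(min=1))

# Toy KG: three entities, two relations; rows are (head, relation, tail).
triples = torch.tensor([[0, 0, 1], [1, 1, 2], [2, 0, 0]])
init, gnn = EntityInitializer(num_relations=2, dim=8), GNNModulator(dim=8)
entity_emb = gnn(init(triples, num_entities=3), triples)  # shape (3, 8)
```

Because only relation and GNN parameters are learned, meta-training can sample many source-KG subgraphs as tasks, score held-out triples with a standard KGE decoder (e.g., TransE), and update those shared parameters, so the trained modules can later embed a target KG whose entities were never seen.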
Related papers
- Decoding on Graphs: Faithful and Sound Reasoning on Knowledge Graphs through Generation of Well-Formed Chains [66.55612528039894]
Knowledge Graphs (KGs) can serve as reliable knowledge sources for question answering (QA).
We present DoG (Decoding on Graphs), a novel framework that facilitates a deep synergy between LLMs and KGs.
Experiments across various KGQA tasks with different background KGs demonstrate that DoG achieves superior and robust performance.
arXiv Detail & Related papers (2024-10-24T04:01:40Z)
- KG-FIT: Knowledge Graph Fine-Tuning Upon Open-World Knowledge [63.19837262782962]
Knowledge Graph Embedding (KGE) techniques are crucial in learning compact representations of entities and relations within a knowledge graph.
This study introduces KG-FIT, which builds a semantically coherent hierarchical structure of entity clusters; a minimal clustering sketch follows this entry.
Experiments on the benchmark datasets FB15K-237, YAGO3-10, and PrimeKG demonstrate the superiority of KG-FIT over state-of-the-art pre-trained language model-based methods.
arXiv Detail & Related papers (2024-05-26T03:04:26Z)
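As a rough illustration of the clustering step only (an assumption based on this summary; KG-FIT's actual hierarchy construction and fine-tuning objective are more involved), a coarse-to-fine hierarchy of entity clusters could be built from text-derived entity embeddings:

```python
# Sketch: agglomerative clustering of entity embeddings into a hierarchy.
# The embeddings here are random stand-ins for LLM/text-derived vectors.
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

rng = np.random.default_rng(0)
entity_emb = rng.normal(size=(100, 64))          # 100 entities, 64-dim vectors

tree = linkage(entity_emb, method="average", metric="cosine")  # dendrogram
# Cut the tree at several levels for a coarse-to-fine cluster hierarchy.
hierarchy = {k: fcluster(tree, t=k, criterion="maxclust") for k in (5, 20, 50)}
print({k: len(set(v.tolist())) for k, v in hierarchy.items()})
```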
- Generate-on-Graph: Treat LLM as both Agent and KG in Incomplete Knowledge Graph Question Answering [87.67177556994525]
We propose a training-free method called Generate-on-Graph (GoG) to generate new factual triples while exploring Knowledge Graphs (KGs).
GoG performs reasoning through a Thinking-Searching-Generating framework, which treats the LLM as both Agent and KG in IKGQA; a minimal sketch of this loop follows this entry.
arXiv Detail & Related papers (2024-04-23T04:47:22Z)
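A minimal sketch of such a Thinking-Searching-Generating loop (an assumption reconstructed from this summary, not GoG's actual prompts or code; `llm` is a stand-in callable):

```python
# Sketch: the LLM alternates between thinking, searching the KG, and
# generating candidate facts when the (incomplete) KG has no answer.
def answer(question: str, kg: dict, llm, max_steps: int = 5) -> str:
    context: list[str] = []
    for _ in range(max_steps):
        thought = llm(f"Question: {question}\nKnown: {context}\nNext action?")
        if thought.startswith("SEARCH:"):        # Searching: query the KG
            entity = thought.split(":", 1)[1].strip()
            context += [f"{entity} {r} {t}" for r, t in kg.get(entity, [])]
        elif thought.startswith("GENERATE"):     # Generating: LLM fills KG gaps
            context.append(llm(f"Propose one likely missing fact given: {context}"))
        else:                                    # Thinking produced a final answer
            return thought
    return llm(f"Answer '{question}' using: {context}")
```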
- Knowledge Graphs Meet Multi-Modal Learning: A Comprehensive Survey [61.8716670402084]
This survey focuses on KG-aware research in two principal aspects: KG-driven Multi-Modal (KG4MM) learning, and Multi-Modal Knowledge Graph (MM4KG).
Our review includes two primary task categories: KG-aware multi-modal learning tasks, and intrinsic MMKG tasks.
For most of these tasks, we provide definitions, evaluation benchmarks, and additionally outline essential insights for conducting relevant research.
arXiv Detail & Related papers (2024-02-08T04:04:36Z)
- Unifying Large Language Models and Knowledge Graphs: A Roadmap [61.824618473293725]
Large language models (LLMs) are making new waves in the field of natural language processing and artificial intelligence.
Knowledge Graphs (KGs), such as Wikipedia and Huapu, are structured knowledge models that explicitly store rich factual knowledge.
arXiv Detail & Related papers (2023-06-14T07:15:26Z)
- Joint Pre-training and Local Re-training: Transferable Representation Learning on Multi-source Knowledge Graphs [17.78174810566248]
We pre-train a large teacher KG embedding model over linked multi-source KGs and distill its knowledge to train a student model for a task-specific KG; a minimal distillation sketch follows this entry.
We conduct extensive experiments to demonstrate the effectiveness and efficiency of our framework.
arXiv Detail & Related papers (2023-06-05T08:11:59Z)
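A minimal sketch of the teacher-to-student distillation idea (an assumption in the spirit of this summary, not the authors' exact objective):

```python
# Sketch: the student fits observed triples (hard labels) while also
# matching the frozen teacher's triple scores (soft targets).
import torch
import torch.nn.functional as F

def distill_loss(student_scores: torch.Tensor,
                 teacher_scores: torch.Tensor,
                 labels: torch.Tensor,
                 alpha: float = 0.5) -> torch.Tensor:
    hard = F.binary_cross_entropy_with_logits(student_scores, labels)
    soft = F.mse_loss(student_scores, teacher_scores.detach())
    return alpha * hard + (1 - alpha) * soft
```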
- Collective Knowledge Graph Completion with Mutual Knowledge Distillation [11.922522192224145]
We study the problem of multi-KG completion, where we focus on maximizing the collective knowledge from different KGs.
We propose a novel method called CKGC-CKD that uses relation-aware graph convolutional network encoder models on both individual KGs and a large fused KG.
Experimental results on multilingual datasets have shown that our method outperforms all state-of-the-art models in the KGC task.
arXiv Detail & Related papers (2023-05-25T09:49:40Z)
- Lifelong Embedding Learning and Transfer for Growing Knowledge Graphs [22.88552158340435]
Existing knowledge graph embedding models primarily focus on static KGs.
New facts and previously unseen entities and relations continually emerge, necessitating an embedding model that can quickly learn and transfer new knowledge as a KG grows.
We consider knowledge transfer and retention when learning on growing snapshots of a KG, without having to learn embeddings from scratch.
The proposed model includes a masked KG autoencoder for embedding learning and update, an embedding transfer strategy to inject the learned knowledge into new entity and relation embeddings, and an embedding regularization method to avoid catastrophic forgetting; a minimal regularizer sketch follows this entry.
arXiv Detail & Related papers (2022-11-29T00:43:44Z)
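A minimal sketch of such an embedding-regularization term (an assumption; the paper's exact regularizer may differ): training on a new snapshot penalizes drift of embeddings for previously seen entities.

```python
# Sketch: quadratic penalty keeping old entities' embeddings near the
# values learned on the previous KG snapshot.
import torch

def retention_penalty(emb_new: torch.Tensor, emb_old: torch.Tensor,
                      old_ids: torch.Tensor, weight: float = 0.1) -> torch.Tensor:
    # emb_old holds frozen embeddings from the previous snapshot; only rows
    # for previously seen entities (old_ids) are constrained.
    drift = emb_new[old_ids] - emb_old[old_ids].detach()
    return weight * drift.pow(2).sum(dim=-1).mean()
```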
- Meta-Learning Based Knowledge Extrapolation for Knowledge Graphs in the Federated Setting [43.85991094675398]
We study the knowledge extrapolation problem to embed new components (i.e., entities and relations) that come with emerging knowledge graphs (KGs) in the federated setting.
In this problem, a model trained on an existing KG needs to embed an emerging KG with unseen entities and relations.
We introduce the meta-learning setting, where a set of tasks are sampled on the existing KG to mimic the link prediction task on the emerging KG.
Based on sampled tasks, we meta-train a graph neural network framework that constructs features for unseen components from structural information and outputs embeddings for them; a minimal task-sampling sketch follows this entry.
arXiv Detail & Related papers (2022-05-10T06:27:32Z)
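A minimal sketch of sampling one such meta-task (an assumption based on this summary): take a random entity subset of the existing KG, keep its induced triples, and split them into a support set (for building embeddings) and a query set (to mimic link prediction on an emerging KG).

```python
# Sketch: one meta-task = (support, query) triples over a sampled subgraph.
import random

def sample_task(triples, num_entities, k=100, query_frac=0.2, seed=None):
    rng = random.Random(seed)
    ents = set(rng.sample(range(num_entities), k))  # random entity subset
    induced = [tr for tr in triples if tr[0] in ents and tr[2] in ents]
    rng.shuffle(induced)
    cut = int(len(induced) * query_frac)
    return induced[cut:], induced[:cut]             # (support, query)
```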
- Multilingual Knowledge Graph Completion via Ensemble Knowledge Transfer [43.453915033312114]
Predicting missing facts in a knowledge graph (KG) is a crucial task in knowledge base construction and reasoning.
We propose KEnS, a novel framework for embedding learning and ensemble knowledge transfer across multiple language-specific KGs; a minimal ensemble sketch follows this entry.
Experiments on five real-world language-specific KGs show that KEnS consistently improves state-of-the-art methods on KG completion.
arXiv Detail & Related papers (2020-10-07T04:54:03Z)
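A minimal sketch of the ensemble step at inference time (an assumption, not KEnS's exact combination rule): each language-specific model scores the same aligned candidate set, and the ensemble ranks candidates by a weighted average.

```python
# Sketch: combine per-KG completion scores over a shared candidate space.
import numpy as np

def ensemble_rank(per_kg_scores: list, weights=None) -> np.ndarray:
    # per_kg_scores: one (num_candidates,) array per language-specific model,
    # already mapped into a shared candidate space via entity alignment.
    w = np.ones(len(per_kg_scores)) if weights is None else np.asarray(weights)
    combined = sum(wi * s for wi, s in zip(w, per_kg_scores)) / w.sum()
    return np.argsort(-combined)                    # candidate ids, best first
```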