Meta-Learning Based Knowledge Extrapolation for Knowledge Graphs in the
Federated Setting
- URL: http://arxiv.org/abs/2205.04692v1
- Date: Tue, 10 May 2022 06:27:32 GMT
- Title: Meta-Learning Based Knowledge Extrapolation for Knowledge Graphs in the
Federated Setting
- Authors: Mingyang Chen, Wen Zhang, Zhen Yao, Xiangnan Chen, Mengxiao Ding, Fei
Huang, Huajun Chen
- Abstract summary: We study the knowledge extrapolation problem to embed new components (i.e., entities and relations) that come with emerging knowledge graphs (KGs) in the federated setting.
In this problem, a model trained on an existing KG needs to embed an emerging KG with unseen entities and relations.
We introduce the meta-learning setting, where a set of tasks is sampled on the existing KG to mimic the link prediction task on the emerging KG.
Based on sampled tasks, we meta-train a graph neural network framework that can construct features for unseen components based on structural information and output embeddings for them.
- Score: 43.85991094675398
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: We study the knowledge extrapolation problem to embed new components (i.e.,
entities and relations) that come with emerging knowledge graphs (KGs) in the
federated setting. In this problem, a model trained on an existing KG needs to
embed an emerging KG with unseen entities and relations. To solve this problem,
we introduce the meta-learning setting, where a set of tasks is sampled on the
existing KG to mimic the link prediction task on the emerging KG. Based on
sampled tasks, we meta-train a graph neural network framework that can
construct features for unseen components based on structural information and
output embeddings for them. Experimental results show that our proposed method
can effectively embed unseen components and outperforms models that consider
inductive settings for KGs and baselines that directly use conventional KG
embedding methods.
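To make the training recipe concrete, here is a minimal, illustrative PyTorch sketch of the setting the abstract describes: tasks are sampled from the existing KG, entity embeddings are built purely from the structure of each sampled subgraph (so unseen entities can still be embedded), and a link-prediction loss on held-out query triples drives meta-training. The encoder, task sampler, and TransE-style scorer below are simplified stand-ins, not the authors' actual architecture.

```python
import random
import torch
import torch.nn as nn

class StructuralEncoder(nn.Module):
    """Toy encoder in the spirit of the paper: entity embeddings are built
    only from the relational structure of a sampled subgraph (no per-entity
    parameters), so entities unseen at training time can still be embedded."""
    def __init__(self, num_relations, dim=32):
        super().__init__()
        self.rel_emb = nn.Embedding(num_relations, dim)
        self.update = nn.Linear(dim, dim)

    def forward(self, triples, num_entities):
        dim = self.rel_emb.embedding_dim
        msgs = [[] for _ in range(num_entities)]
        # one round of message passing: every entity aggregates the features
        # of the relations it participates in
        for h, r, t in triples:
            m = self.rel_emb.weight[r]
            msgs[h].append(m)
            msgs[t].append(m)
        rows = [torch.stack(m).mean(0) if m else torch.zeros(dim) for m in msgs]
        return torch.tanh(self.update(torch.stack(rows)))

def score(ent, rel, h, r, t):
    # TransE-style plausibility: smaller ||e_h + r - e_t|| is better
    return -(ent[h] + rel.weight[r] - ent[t]).norm()

def sample_task(kg, k=32):
    """A task mimics link prediction on an emerging KG: a support set used
    to build embeddings and a query set of held-out links to predict."""
    batch = random.sample(kg, k)
    return batch[:k // 2], batch[k // 2:]

# toy "existing KG": 10 entities, 3 relations
kg = [(random.randrange(10), random.randrange(3), random.randrange(10))
      for _ in range(200)]
model = StructuralEncoder(num_relations=3)
opt = torch.optim.Adam(model.parameters(), lr=1e-2)

for step in range(200):
    support, query = sample_task(kg)
    ent = model(support, num_entities=10)   # embeddings from structure only
    loss = torch.stack([
        torch.relu(1.0 - score(ent, model.rel_emb, h, r, t)
                       + score(ent, model.rel_emb, h, r, random.randrange(10)))
        for h, r, t in query]).mean()
    opt.zero_grad(); loss.backward(); opt.step()
```

At test time, the same encoder is run over the emerging KG's triples, producing embeddings for entities and relations it never saw during meta-training.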
Related papers
- Distill-SynthKG: Distilling Knowledge Graph Synthesis Workflow for Improved Coverage and Efficiency [59.6772484292295]
Knowledge graphs (KGs) generated by large language models (LLMs) are increasingly valuable for Retrieval-Augmented Generation (RAG) applications.
Existing KG extraction methods rely on prompt-based approaches, which are inefficient for processing large-scale corpora.
We propose SynthKG, a multi-step, document-level synthesis KG workflow based on LLMs.
We also design a novel graph-based retrieval framework for RAG.
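As a rough illustration of what a document-level, LLM-driven KG synthesis workflow can look like, the sketch below chunks a document, prompts an LLM for triples per chunk, and merges the results. `call_llm`, the prompt, and the output format are hypothetical placeholders, not the SynthKG pipeline.

```python
from typing import List, Tuple

Triple = Tuple[str, str, str]  # (subject, relation, object)

def call_llm(prompt: str) -> str:
    """Hypothetical placeholder for an LLM call; replace with a real client."""
    raise NotImplementedError

def chunk_document(text: str, max_words: int = 200) -> List[str]:
    words = text.split()
    return [" ".join(words[i:i + max_words])
            for i in range(0, len(words), max_words)]

def extract_triples(chunk: str) -> List[Triple]:
    prompt = ("Extract (subject, relation, object) triples from the passage.\n"
              "Return one triple per line as: subject | relation | object\n\n"
              + chunk)
    triples = []
    for line in call_llm(prompt).splitlines():
        parts = [p.strip() for p in line.split("|")]
        if len(parts) == 3:
            triples.append(tuple(parts))
    return triples

def synthesize_kg(document: str) -> List[Triple]:
    kg: List[Triple] = []
    for chunk in chunk_document(document):
        # deduplicate while merging chunk-level triples into one document KG
        kg.extend(t for t in extract_triples(chunk) if t not in kg)
    return kg
```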
arXiv Detail & Related papers (2024-10-22T00:47:54Z)
- KG-FIT: Knowledge Graph Fine-Tuning Upon Open-World Knowledge [63.19837262782962]
Knowledge Graph Embedding (KGE) techniques are crucial in learning compact representations of entities and relations within a knowledge graph.
This study introduces KG-FIT, which builds a semantically coherent hierarchical structure of entity clusters.
Experiments on the benchmark datasets FB15K-237, YAGO3-10, and PrimeKG demonstrate the superiority of KG-FIT over state-of-the-art pre-trained language model-based methods.
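One plausible way to obtain such a hierarchy of entity clusters is agglomerative clustering over entity text embeddings, as in the hedged sketch below; the embedding source, linkage settings, and the subsequent fine-tuning objective are assumptions, not KG-FIT's implementation.

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

# assume each entity already has a semantic text embedding (e.g. from an LM);
# random vectors stand in for them here
entity_names = ["Paris", "Berlin", "France", "Germany", "Euro", "Dollar"]
embeddings = np.random.rand(len(entity_names), 64)

# agglomerative clustering yields a hierarchy (dendrogram) over entities
Z = linkage(embeddings, method="average", metric="cosine")

# cutting the dendrogram at different depths gives coarse-to-fine clusters;
# a KGE model could then be fine-tuned so entities stay near their clusters
for n_clusters in (2, 3):
    labels = fcluster(Z, t=n_clusters, criterion="maxclust")
    groups = {}
    for name, lab in zip(entity_names, labels):
        groups.setdefault(lab, []).append(name)
    print(n_clusters, "clusters:", list(groups.values()))
```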
arXiv Detail & Related papers (2024-05-26T03:04:26Z)
- Few-Shot Inductive Learning on Temporal Knowledge Graphs using Concept-Aware Information [31.10140298420744]
We propose a few-shot out-of-graph (OOG) link prediction task for temporal knowledge graphs (TKGs).
We predict the missing entities from the links concerning unseen entities by employing a meta-learning framework.
Our model achieves superior performance on all three datasets.
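The episode structure behind few-shot OOG link prediction can be sketched as follows: facts mentioning an unseen entity are split into a K-shot support set and a query set. The sampling policy and toy data are illustrative assumptions, not the paper's code.

```python
import random

def build_oog_episode(quadruples, unseen_entity, k_shot=3):
    """Return (support, query) facts that mention `unseen_entity`."""
    facts = [q for q in quadruples
             if q[0] == unseen_entity or q[2] == unseen_entity]
    random.shuffle(facts)
    return facts[:k_shot], facts[k_shot:]

# toy temporal KG: (head, relation, tail, timestamp)
tkg = [("e1", "visits", "e2", 0), ("e3", "visits", "e2", 1),
       ("e9", "meets", "e1", 2), ("e9", "visits", "e4", 2),
       ("e9", "meets", "e3", 3), ("e2", "meets", "e9", 4)]

support, query = build_oog_episode(tkg, unseen_entity="e9", k_shot=2)
# a meta-learned model would encode `support` (plus concept information)
# to produce an embedding for e9, then rank candidate entities for each
# query fact with its missing entity masked out
print("support:", support)
print("query:", query)
```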
arXiv Detail & Related papers (2022-11-15T14:23:07Z)
- Disconnected Emerging Knowledge Graph Oriented Inductive Link Prediction [0.0]
We propose a novel model entitled DEKG-ILP (Disconnected Emerging Knowledge Graph Oriented Inductive Link Prediction).
The module CLRM is developed to extract global relation-based semantic features that are shared between original KGs and DEKGs.
The module GSM is proposed to extract the local subgraph topological information around each link in KGs.
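Below is a hedged sketch of the kind of local structure such a subgraph module operates on: the k-hop enclosing subgraph around a candidate link, built from the intersection of the head's and tail's neighbourhoods. This follows a common GraIL-style construction and is not the DEKG-ILP code.

```python
from collections import defaultdict

def k_hop_neighbours(triples, seed, k):
    adj = defaultdict(set)
    for h, r, t in triples:
        adj[h].add(t); adj[t].add(h)
    frontier, seen = {seed}, {seed}
    for _ in range(k):
        frontier = {n for v in frontier for n in adj[v]} - seen
        seen |= frontier
    return seen

def enclosing_subgraph(triples, head, tail, k=2):
    # keep only nodes close to both endpoints, plus the endpoints themselves
    nodes = k_hop_neighbours(triples, head, k) & k_hop_neighbours(triples, tail, k)
    nodes |= {head, tail}
    return [(h, r, t) for h, r, t in triples if h in nodes and t in nodes]

kg = [("a", "r1", "b"), ("b", "r2", "c"), ("c", "r1", "d"), ("a", "r3", "d")]
print(enclosing_subgraph(kg, "a", "c"))
```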
arXiv Detail & Related papers (2022-09-03T10:58:24Z)
- BertNet: Harvesting Knowledge Graphs with Arbitrary Relations from Pretrained Language Models [65.51390418485207]
We propose a new approach of harvesting massive KGs of arbitrary relations from pretrained LMs.
With minimal input of a relation definition, the approach efficiently searches in the vast entity pair space to extract diverse accurate knowledge.
We deploy the approach to harvest KGs of over 400 new relations from different LMs.
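As a loose illustration of the underlying idea, the sketch below scores candidate entity pairs for one relation by prompting a masked language model. The prompt, candidate pool, and 0.1 acceptance threshold are assumptions for the example, not BertNet's actual search procedure.

```python
from transformers import pipeline

fill_mask = pipeline("fill-mask", model="bert-base-uncased")

def pair_score(country: str, city: str) -> float:
    """Probability the LM assigns to `city` filling the masked slot."""
    out = fill_mask(f"the capital of {country} is [MASK].", targets=[city])
    return out[0]["score"]

candidates = [("france", "paris"), ("france", "berlin"), ("germany", "berlin")]
harvested = [(city, "capital_of", country)
             for country, city in candidates if pair_score(country, city) > 0.1]
print(harvested)  # e.g. pairs the LM supports, such as (paris, capital_of, france)
```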
arXiv Detail & Related papers (2022-06-28T19:46:29Z)
- Standing on the Shoulders of Predecessors: Meta-Knowledge Transfer for Knowledge Graphs [8.815143812846392]
We call such knowledge meta-knowledge, and refer to the problem of transferring meta-knowledge from constructed (source) KGs to new (target) KGs as meta-knowledge transfer for knowledge graphs.
MorsE represents the meta-knowledge via Knowledge Graph Embedding and learns the meta-knowledge by Meta-Learning.
MorsE is able to learn and transfer meta-knowledge between KGs effectively, and outperforms existing state-of-the-art models.
arXiv Detail & Related papers (2021-10-27T04:57:16Z)
- Learning Intents behind Interactions with Knowledge Graph for Recommendation [93.08709357435991]
Knowledge graph (KG) plays an increasingly important role in recommender systems.
Existing GNN-based models fail to identify user-item relation at a fine-grained level of intents.
We propose a new model, Knowledge Graph-based Intent Network (KGIN).
arXiv Detail & Related papers (2021-02-14T03:21:36Z)
- PPKE: Knowledge Representation Learning by Path-based Pre-training [43.41597219004598]
We propose a Path-based Pre-training model to learn Knowledge Embeddings, called PPKE.
Our model achieves state-of-the-art results on several benchmark datasets for link prediction and relation prediction tasks.
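A minimal sketch of the kind of input a path-based pre-training model could consume: relation paths sampled by random walks over the KG and serialized as alternating entity/relation token sequences. The walk policy and sequence format are assumptions rather than PPKE's.

```python
import random
from collections import defaultdict

def sample_paths(triples, num_paths=5, max_hops=3, seed=0):
    random.seed(seed)
    out_edges = defaultdict(list)
    for h, r, t in triples:
        out_edges[h].append((r, t))
    entities = list(out_edges)
    paths = []
    for _ in range(num_paths):
        node = random.choice(entities)
        path = [node]
        for _ in range(max_hops):
            if not out_edges[node]:
                break
            r, node = random.choice(out_edges[node])
            path += [r, node]          # alternate relation / entity tokens
        paths.append(path)
    return paths

kg = [("alice", "works_at", "acme"), ("acme", "based_in", "berlin"),
      ("berlin", "capital_of", "germany"), ("bob", "works_at", "acme")]
for p in sample_paths(kg):
    print(" -> ".join(p))
```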
arXiv Detail & Related papers (2020-12-07T10:29:30Z)
- Relational Learning Analysis of Social Politics using Knowledge Graph Embedding [11.978556412301975]
This paper presents a novel credibility domain-based KG Embedding framework.
It involves capturing a fusion of data obtained from heterogeneous resources into a formal KG representation depicted by a domain.
The framework also embodies a credibility module to ensure data quality and trustworthiness.
arXiv Detail & Related papers (2020-06-02T14:10:28Z)
- Generative Adversarial Zero-Shot Relational Learning for Knowledge Graphs [96.73259297063619]
We consider a novel formulation, zero-shot learning, to free KG completion from the cumbersome curation of training instances for newly-added relations.
For newly-added relations, we attempt to learn their semantic features from their text descriptions.
We leverage Generative Adversarial Networks (GANs) to establish the connection between the text and knowledge graph domains.
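The core zero-shot mechanism can be sketched as a small GAN in which a generator maps a relation's text-description features to a relation embedding and a discriminator separates generated embeddings from ones trained on seen relations. Dimensions, losses, and data below are illustrative only, not the paper's implementation.

```python
import torch
import torch.nn as nn

text_dim, rel_dim = 50, 16
G = nn.Sequential(nn.Linear(text_dim, 64), nn.ReLU(), nn.Linear(64, rel_dim))
D = nn.Sequential(nn.Linear(rel_dim, 64), nn.ReLU(), nn.Linear(64, 1))
opt_g = torch.optim.Adam(G.parameters(), lr=1e-3)
opt_d = torch.optim.Adam(D.parameters(), lr=1e-3)
bce = nn.BCEWithLogitsLoss()

for step in range(100):
    text = torch.rand(8, text_dim)   # stand-in features of relation descriptions
    real = torch.randn(8, rel_dim)   # stand-in KG-trained embeddings of seen relations
    fake = G(text)
    # discriminator: real embeddings -> 1, generated -> 0
    d_loss = (bce(D(real), torch.ones(8, 1))
              + bce(D(fake.detach()), torch.zeros(8, 1)))
    opt_d.zero_grad(); d_loss.backward(); opt_d.step()
    # generator: fool the discriminator
    g_loss = bce(D(fake), torch.ones(8, 1))
    opt_g.zero_grad(); g_loss.backward(); opt_g.step()

# at test time, an unseen relation's embedding is G(features of its description)
```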
arXiv Detail & Related papers (2020-01-08T01:19:08Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.