A Survey On Few-shot Knowledge Graph Completion with Structural and Commonsense Knowledge
- URL: http://arxiv.org/abs/2301.01172v1
- Date: Tue, 3 Jan 2023 16:00:09 GMT
- Title: A Survey On Few-shot Knowledge Graph Completion with Structural and Commonsense Knowledge
- Authors: Haodi Ma, Daisy Zhe Wang
- Abstract summary: Few-shot KG completion (FKGC) requires the strengths of graph representation learning and few-shot learning.
This paper introduces FKGC challenges, commonly used KGs, and CKGs.
We then systematically categorize and summarize existing works in terms of the type of KGs and the methods.
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Knowledge graphs (KG) have served as the key component of various natural
language processing applications. Commonsense knowledge graphs (CKG) are a
special type of KG, where entities and relations are composed of free-form
text. However, previous works in KG completion and CKG completion suffer from
long-tail relations and newly-added relations that do not have many known
triples for training. In light of this, few-shot KG completion (FKGC), which
requires the strengths of graph representation learning and few-shot learning,
has been proposed to address the problem of limited annotated data. In this
paper, we comprehensively survey previous attempts on such tasks in the form of
a series of methods and applications. Specifically, we first introduce FKGC
challenges, commonly used KGs, and CKGs. Then we systematically categorize and
summarize existing works in terms of the type of KGs and the methods. Finally,
we present applications of FKGC models on prediction tasks in different areas
and share our thoughts on future research directions of FKGC.
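To make the task concrete, the following is a minimal, illustrative sketch (not taken from the survey) of a metric-based few-shot completion episode: a handful of support triples define a new relation, and candidate tails for a query head are ranked against a prototype built from those supports. The entity embeddings are random placeholders standing in for any pretrained KG encoder, so the printed ranking only shows the episode structure.

```python
# Illustrative sketch (not from the survey): a metric-based view of few-shot
# KG completion. Given K support triples for a new relation, build a relation
# "prototype" from head/tail embedding offsets and rank candidate tails for a
# query head by similarity to that prototype.
import numpy as np

rng = np.random.default_rng(0)
dim = 64
entities = ["paris", "france", "tokyo", "japan", "berlin", "germany", "oslo"]
emb = {e: rng.normal(size=dim) for e in entities}  # placeholder entity embeddings

# K=2 support triples for an unseen relation, e.g. (city, capital_of, country)
support = [("paris", "france"), ("tokyo", "japan")]

# Prototype = mean offset between tail and head embeddings in the support set
prototype = np.mean([emb[t] - emb[h] for h, t in support], axis=0)

def score(head, tail):
    """Cosine similarity between the query offset and the relation prototype."""
    offset = emb[tail] - emb[head]
    return offset @ prototype / (np.linalg.norm(offset) * np.linalg.norm(prototype))

# Rank candidate tails for the query head "berlin"
candidates = ["germany", "japan", "oslo"]
ranked = sorted(candidates, key=lambda t: score("berlin", t), reverse=True)
print(ranked)  # with random embeddings the order is meaningless; the structure is the point
```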
Related papers
- Ontology-grounded Automatic Knowledge Graph Construction by LLM under Wikidata schema
We propose an ontology-grounded approach to Knowledge Graph (KG) construction using Large Language Models (LLMs) on a knowledge base.
We ground generation of KG with the authored ontology based on extracted relations to ensure consistency and interpretability.
Our work presents a promising direction for a scalable KG construction pipeline with minimal human intervention that yields high-quality, human-interpretable KGs.
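As a rough illustration of what grounding generation with an authored ontology can look like, here is a hedged sketch in which candidate triples (as an LLM might emit them) are kept only if their relation and argument types satisfy hypothetical domain/range constraints. The ontology, type map, and candidate list are invented for the example and are not the paper's actual interface.

```python
# Hedged sketch of ontology-grounded filtering: candidate triples proposed by an
# LLM are kept only if the relation exists in the authored ontology and the
# subject/object types satisfy its domain/range constraints.
from typing import Dict, List, Tuple

Triple = Tuple[str, str, str]

# Hypothetical ontology: relation -> (expected subject type, expected object type)
ONTOLOGY = {
    "capital_of": ("City", "Country"),
    "author_of": ("Person", "Book"),
}
ENTITY_TYPES: Dict[str, str] = {
    "Paris": "City", "France": "Country", "Jane Austen": "Person", "Emma": "Book",
}

def ground_to_ontology(candidates: List[Triple]) -> List[Triple]:
    """Drop candidate triples that violate the ontology's domain/range constraints."""
    grounded = []
    for subj, rel, obj in candidates:
        if rel not in ONTOLOGY:
            continue  # relation not authored in the ontology
        domain, range_ = ONTOLOGY[rel]
        if ENTITY_TYPES.get(subj) == domain and ENTITY_TYPES.get(obj) == range_:
            grounded.append((subj, rel, obj))
    return grounded

# Candidates as an LLM might emit them (one is type-inconsistent, one off-ontology)
candidates = [("Paris", "capital_of", "France"),
              ("Emma", "capital_of", "France"),
              ("Jane Austen", "born_in", "England")]
print(ground_to_ontology(candidates))  # [('Paris', 'capital_of', 'France')]
```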
arXiv Detail & Related papers (2024-12-30T13:36:05Z)
- A Survey on Knowledge Graph Structure and Knowledge Graph Embeddings
This paper provides, to the authors' knowledge, the first comprehensive survey exploring the relationships between Knowledge Graph Embedding Models and graph structure established in the literature.
It is the hope of the authors that this work will inspire further studies in this area, and contribute to a more holistic understanding of KGs, KGEMs, and the link prediction task.
arXiv Detail & Related papers (2024-12-13T12:30:09Z)
- Distill-SynthKG: Distilling Knowledge Graph Synthesis Workflow for Improved Coverage and Efficiency
Knowledge graphs (KGs) generated by large language models (LLMs) are increasingly valuable for Retrieval-Augmented Generation (RAG) applications.
Existing KG extraction methods rely on prompt-based approaches, which are inefficient for processing large-scale corpora.
We propose SynthKG, a multi-step, document-level KG synthesis workflow based on LLMs.
We also design a novel graph-based retrieval framework for RAG.
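The retrieval side can be pictured with a small, assumption-level sketch: entities mentioned in a question seed a breadth-first expansion over the synthesized triples, and everything within k hops is returned as context for the generator. This is a generic k-hop retriever, not Distill-SynthKG's actual framework.

```python
# Minimal sketch of graph-based retrieval for RAG: starting from entities
# mentioned in the question, collect all triples within k hops and hand them
# to the generator as context.
from collections import defaultdict, deque

triples = [
    ("Marie Curie", "born_in", "Warsaw"),
    ("Marie Curie", "awarded", "Nobel Prize in Physics"),
    ("Warsaw", "capital_of", "Poland"),
    ("Poland", "member_of", "European Union"),
]

# Undirected adjacency over triples for neighborhood expansion
adj = defaultdict(list)
for h, r, t in triples:
    adj[h].append((h, r, t))
    adj[t].append((h, r, t))

def retrieve_subgraph(seed_entities, k=2):
    """Breadth-first expansion: return triples reachable within k hops of the seeds."""
    seen_entities, seen_triples = set(seed_entities), set()
    frontier = deque((e, 0) for e in seed_entities)
    while frontier:
        entity, depth = frontier.popleft()
        if depth == k:
            continue
        for h, r, t in adj[entity]:
            seen_triples.add((h, r, t))
            for nxt in (h, t):
                if nxt not in seen_entities:
                    seen_entities.add(nxt)
                    frontier.append((nxt, depth + 1))
    return sorted(seen_triples)

# The seed entities would normally come from an entity linker over the question
print(retrieve_subgraph(["Marie Curie"], k=2))
```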
arXiv Detail & Related papers (2024-10-22T00:47:54Z)
- Knowledge Graph Completion using Structural and Textual Embeddings
We propose a relation prediction model that harnesses both textual and structural information within Knowledge Graphs.
Our approach integrates walk-based embeddings with language model embeddings to effectively represent nodes.
We demonstrate that our model achieves competitive results in the relation prediction task when evaluated on a widely used dataset.
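A hedged sketch of the general recipe (with random placeholder vectors rather than the paper's trained models): each node is represented by the concatenation of a walk-based structural embedding and a language-model text embedding, and a simple classifier over (head, tail) features predicts the relation.

```python
# Toy illustration: combine structural (walk-based) and textual (LM) node
# embeddings by concatenation, then predict relations from (head, tail) features.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)
nodes = ["paris", "france", "tokyo", "japan", "hamlet", "shakespeare"]
walk_emb = {n: rng.normal(size=32) for n in nodes}  # stand-in for random-walk vectors
text_emb = {n: rng.normal(size=32) for n in nodes}  # stand-in for LM text vectors

def node_vec(n):
    return np.concatenate([walk_emb[n], text_emb[n]])

def pair_features(h, t):
    return np.concatenate([node_vec(h), node_vec(t)])

train = [("paris", "capital_of", "france"), ("tokyo", "capital_of", "japan"),
         ("hamlet", "written_by", "shakespeare")]
X = np.stack([pair_features(h, t) for h, _, t in train])
y = [r for _, r, _ in train]

clf = LogisticRegression(max_iter=1000).fit(X, y)
print(clf.predict([pair_features("paris", "france")]))
```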
arXiv Detail & Related papers (2024-04-24T21:04:14Z)
- Multi-perspective Improvement of Knowledge Graph Completion with Large Language Models
We propose MPIKGC to compensate for the deficiency of contextualized knowledge and improve KGC by querying large language models (LLMs).
We conducted extensive evaluation of our framework based on four description-based KGC models and four datasets, for both link prediction and triplet classification tasks.
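The querying step can be illustrated with a minimal, hypothetical sketch: an LLM is prompted for a short entity description, and the generated text is appended to the description that a text-based KGC model would encode. The `query_llm` function below is a placeholder for any chat-completion client, not MPIKGC's actual prompts or code.

```python
def query_llm(prompt: str) -> str:
    """Placeholder for a real LLM call (e.g. an API client); returns canned text here."""
    return "A city in France, known as the country's capital and largest city."

def enrich_entity(name: str, existing_description: str) -> str:
    prompt = (
        f"Give a one-sentence encyclopedic description of the entity '{name}' "
        "that would help a model predict its relations in a knowledge graph."
    )
    generated = query_llm(prompt)
    # The enriched text is what a description-based KGC model would encode.
    return f"{name}: {existing_description} {generated}".strip()

print(enrich_entity("Paris", "Capital of France."))
```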
arXiv Detail & Related papers (2024-03-04T12:16:15Z)
- KG-GPT: A General Framework for Reasoning on Knowledge Graphs Using Large Language Models
We propose KG-GPT, a framework leveraging large language models for tasks employing knowledge graphs.
KG-GPT comprises three steps: Sentence Segmentation, Graph Retrieval, and Inference, which partition the input sentences, retrieve relevant graph components, and derive logical conclusions, respectively.
We evaluate KG-GPT using KG-based fact verification and KGQA benchmarks, with the model showing competitive and robust performance, even outperforming several fully-supervised models.
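A simplified, assumption-level sketch of those three stages on a toy fact-verification example follows; the segmentation and retrieval heuristics and the `call_llm` placeholder are invented for illustration and are much cruder than KG-GPT's actual components.

```python
KG = [("Barack Obama", "born_in", "Honolulu"), ("Honolulu", "located_in", "Hawaii")]

def segment(claim: str) -> list:
    # Stage 1 (Sentence Segmentation): split a claim into simpler sub-claims.
    return [part.strip() for part in claim.split(" and ")]

def retrieve(sub_claim: str) -> list:
    # Stage 2 (Graph Retrieval): keep triples whose entities appear in the sub-claim.
    return [t for t in KG if t[0] in sub_claim or t[2] in sub_claim]

def call_llm(prompt: str) -> str:
    return "SUPPORTED"  # placeholder for a real LLM judgment

def infer(claim: str) -> str:
    # Stage 3 (Inference): ask the LLM to judge the claim against retrieved evidence.
    evidence = [t for part in segment(claim) for t in retrieve(part)]
    prompt = f"Claim: {claim}\nEvidence triples: {evidence}\nAnswer SUPPORTED or REFUTED."
    return call_llm(prompt)

print(infer("Barack Obama was born in Honolulu and Honolulu is located in Hawaii"))
```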
arXiv Detail & Related papers (2023-10-17T12:51:35Z)
- Collective Knowledge Graph Completion with Mutual Knowledge Distillation
We study the problem of multi-KG completion, where we focus on maximizing the collective knowledge from different KGs.
We propose a novel method called CKGC-CKD that uses relation-aware graph convolutional network encoder models on both individual KGs and a large fused KG.
Experimental results on multilingual datasets have shown that our method outperforms all state-of-the-art models in the KGC task.
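The mutual-distillation idea can be sketched as follows, under the simplifying assumption of toy DistMult-style scorers in place of the paper's relation-aware GCN encoders: each model is trained on gold tails while a KL term pulls its candidate-tail distribution toward the other model's.

```python
import torch
import torch.nn.functional as F

num_entities, dim = 100, 32

class Scorer(torch.nn.Module):
    """Toy DistMult-style scorer standing in for a relation-aware GCN encoder."""
    def __init__(self):
        super().__init__()
        self.ent = torch.nn.Embedding(num_entities, dim)
        self.rel = torch.nn.Embedding(10, dim)

    def forward(self, h, r):
        # Scores of every entity as the tail of (h, r, ?)
        return (self.ent(h) * self.rel(r)) @ self.ent.weight.T

individual, fused = Scorer(), Scorer()
h = torch.tensor([3, 7])    # head entities of a training batch
r = torch.tensor([1, 4])    # relations
t = torch.tensor([12, 55])  # gold tails

def mutual_step(model_a, model_b, alpha=0.5):
    """Cross-entropy on gold tails plus KL toward the other model's distribution."""
    logits_a = model_a(h, r)
    with torch.no_grad():
        logits_b = model_b(h, r)
    ce = F.cross_entropy(logits_a, t)
    kd = F.kl_div(F.log_softmax(logits_a, dim=-1),
                  F.softmax(logits_b, dim=-1), reduction="batchmean")
    return ce + alpha * kd

loss = mutual_step(individual, fused) + mutual_step(fused, individual)
print(float(loss))
```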
arXiv Detail & Related papers (2023-05-25T09:49:40Z)
- Normalizing Flow-based Neural Process for Few-Shot Knowledge Graph Completion
Few-shot knowledge graph completion (FKGC) aims to predict missing facts for unseen relations with few-shot associated facts.
Existing FKGC methods are based on metric learning or meta-learning, which often suffer from the out-of-distribution and overfitting problems.
In this paper, we propose a normalizing flow-based neural process for few-shot knowledge graph completion (NP-FKGC).
arXiv Detail & Related papers (2023-04-17T11:42:28Z)
- Reasoning over Multi-view Knowledge Graphs
ROMA is a novel framework for answering logical queries over multi-view KGs.
It scales up to KGs of large sizes (e.g., millions of facts) and fine-granular views.
It generalizes to query structures and KG views that are unobserved during training.
arXiv Detail & Related papers (2022-09-27T21:32:20Z)
- Multilingual Knowledge Graph Completion via Ensemble Knowledge Transfer
Predicting missing facts in a knowledge graph (KG) is a crucial task in knowledge base construction and reasoning.
We propose KEnS, a novel framework for embedding learning and ensemble knowledge transfer across a number of language-specific KGs.
Experiments on five real-world language-specific KGs show that KEnS consistently improves state-of-the-art methods on KG completion.
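A hedged sketch of the ensemble step, reduced to a plain score average over aligned candidates (KEnS's actual alignment and transfer machinery is more involved):

```python
import numpy as np

candidates = ["Tokyo", "Kyoto", "Osaka"]

# Placeholder per-language scores for the query (Japan, capital, ?)
scores_by_language = {
    "en": np.array([0.9, 0.3, 0.2]),
    "ja": np.array([0.8, 0.5, 0.4]),
    "fr": np.array([0.7, 0.6, 0.1]),
}

# Combine predictions across language-specific KG models before ranking
ensemble = np.mean(np.stack(list(scores_by_language.values())), axis=0)
ranking = [candidates[i] for i in np.argsort(-ensemble)]
print(ranking)  # ['Tokyo', 'Kyoto', 'Osaka']
```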
arXiv Detail & Related papers (2020-10-07T04:54:03Z)
- Generative Adversarial Zero-Shot Relational Learning for Knowledge Graphs
We consider a novel formulation, zero-shot learning, to avoid this cumbersome curation.
For newly-added relations, we attempt to learn their semantic features from their text descriptions.
We leverage Generative Adversarial Networks (GANs) to establish the connection between the text and knowledge graph domains.
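A compressed, assumption-level sketch of that connection: a generator maps a relation's text-description embedding plus noise to a relation embedding, a discriminator is trained to distinguish it from embeddings of seen relations, and the generated embedding scores triples for unseen relations. The dimensions, data, and DistMult-style scorer below are illustrative choices, not the paper's architecture.

```python
import torch
import torch.nn as nn

text_dim, noise_dim, rel_dim, ent_dim = 64, 16, 32, 32

generator = nn.Sequential(nn.Linear(text_dim + noise_dim, 64), nn.ReLU(), nn.Linear(64, rel_dim))
discriminator = nn.Sequential(nn.Linear(rel_dim, 64), nn.ReLU(), nn.Linear(64, 1))

opt_g = torch.optim.Adam(generator.parameters(), lr=1e-3)
opt_d = torch.optim.Adam(discriminator.parameters(), lr=1e-3)
bce = nn.BCEWithLogitsLoss()

real_rel = torch.randn(8, rel_dim)  # placeholder embeddings of seen relations
text = torch.randn(8, text_dim)     # placeholder text-description embeddings

for _ in range(100):  # adversarial training, heavily abridged
    noise = torch.randn(8, noise_dim)
    fake_rel = generator(torch.cat([text, noise], dim=-1))

    # Discriminator: real relation embeddings vs. generated ones
    d_loss = bce(discriminator(real_rel), torch.ones(8, 1)) + \
             bce(discriminator(fake_rel.detach()), torch.zeros(8, 1))
    opt_d.zero_grad(); d_loss.backward(); opt_d.step()

    # Generator: fool the discriminator
    g_loss = bce(discriminator(fake_rel), torch.ones(8, 1))
    opt_g.zero_grad(); g_loss.backward(); opt_g.step()

# Zero-shot use: generate an embedding for an unseen relation from its description
unseen_text = torch.randn(1, text_dim)
unseen_rel = generator(torch.cat([unseen_text, torch.randn(1, noise_dim)], dim=-1))
head, tail = torch.randn(1, ent_dim), torch.randn(1, ent_dim)
score = (head * unseen_rel * tail).sum()  # DistMult-style triple score
print(float(score))
```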
arXiv Detail & Related papers (2020-01-08T01:19:08Z)