Knowledge Is Flat: A Seq2Seq Generative Framework for Various Knowledge Graph Completion
- URL: http://arxiv.org/abs/2209.07299v2
- Date: Fri, 16 Sep 2022 08:15:55 GMT
- Title: Knowledge Is Flat: A Seq2Seq Generative Framework for Various Knowledge Graph Completion
- Authors: Chen Chen, Yufei Wang, Bing Li and Kwok-Yan Lam
- Abstract summary: KG-S2S is a Seq2Seq generative framework that can tackle different verbalizable graph structures.
We show that KG-S2S outperforms many competitive baselines.
- Score: 18.581223721903147
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Knowledge Graph Completion (KGC) has been recently extended to multiple
knowledge graph (KG) structures, initiating new research directions, e.g.
static KGC, temporal KGC and few-shot KGC. Previous works often design KGC
models closely coupled with specific graph structures, which inevitably results
in two drawbacks: 1) structure-specific KGC models are mutually incompatible;
2) existing KGC methods are not adaptable to emerging KGs. In this paper, we
propose KG-S2S, a Seq2Seq generative framework that can tackle different
verbalizable graph structures by unifying the representation of KG facts into
"flat" text, regardless of their original form. To remedy the KG structure
information loss from the "flat" text, we further improve the input
representations of entities and relations, and the inference algorithm in
KG-S2S. Experiments on five benchmarks show that KG-S2S outperforms many
competitive baselines, setting new state-of-the-art performance. Finally, we
analyze KG-S2S's performance on different relations and its non-entity
generations.
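The central idea above, that static, temporal, and few-shot KG facts can all be verbalized into one "flat" text-to-text format and completed by a single Seq2Seq model, can be illustrated with a short sketch. The sketch below is an assumption-laden illustration rather than the authors' code: the "|" separators, the [MASK] placeholder, and the example facts are made up for demonstration, and the paper's actual entity/relation representations and inference algorithm are more refined than this.

```python
# Minimal sketch of the "flat text" idea behind KG-S2S (illustrative only).
# Facts from static and temporal KGs are verbalized into the same
# (source, target) text format, so one Seq2Seq model can complete either.
# The separators and the [MASK] placeholder are assumptions, not the
# paper's exact templates.

from dataclasses import dataclass
from typing import Optional, Tuple


@dataclass
class KGFact:
    head: str
    relation: str
    tail: str
    timestamp: Optional[str] = None  # set only for temporal KG facts


def flatten_fact(fact: KGFact, predict_tail: bool = True) -> Tuple[str, str]:
    """Verbalize a KG fact as a (source, target) text pair for a Seq2Seq model."""
    time_part = f" | time: {fact.timestamp}" if fact.timestamp else ""
    if predict_tail:
        source = f"{fact.head} | {fact.relation} | [MASK]{time_part}"
        target = fact.tail
    else:
        source = f"[MASK] | {fact.relation} | {fact.tail}{time_part}"
        target = fact.head
    return source, target


# Static and temporal facts reduce to the same flat format:
static_fact = KGFact("Barack Obama", "place of birth", "Honolulu")
temporal_fact = KGFact("Barack Obama", "position held",
                       "President of the United States", timestamp="2009")

for fact in (static_fact, temporal_fact):
    source, target = flatten_fact(fact)
    print(source, "->", target)
```

In such a setup, the (source, target) pairs would typically be fed to a pretrained encoder-decoder (e.g., T5 or BART); the abstract's improvements to entity and relation input representations and to the inference algorithm would be layered on top of this flat format.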
Related papers
- Logical Reasoning with Relation Network for Inductive Knowledge Graph Completion [9.815135283458808]
We propose a novel iNfOmax RelAtion Network, namely NORAN, for inductive KG completion.
Our framework substantially outperforms the state-of-the-art KGC methods.
arXiv Detail & Related papers (2024-06-03T09:30:43Z)
- Collective Knowledge Graph Completion with Mutual Knowledge Distillation [11.922522192224145]
We study the problem of multi-KG completion, where we focus on maximizing the collective knowledge from different KGs.
We propose a novel method called CKGC-CKD that uses relation-aware graph convolutional network encoder models on both individual KGs and a large fused KG.
Experimental results on multilingual datasets have shown that our method outperforms all state-of-the-art models in the KGC task.
arXiv Detail & Related papers (2023-05-25T09:49:40Z)
- A Survey On Few-shot Knowledge Graph Completion with Structural and Commonsense Knowledge [3.4012007729454807]
Few-shot KG completion (FKGC) requires the strengths of graph representation learning and few-shot learning.
This paper introduces FKGC challenges, commonly used KGs, and CKGs.
We then systematically categorize and summarize existing works in terms of the type of KGs and the methods.
arXiv Detail & Related papers (2023-01-03T16:00:09Z)
- Joint Multilingual Knowledge Graph Completion and Alignment [22.87219447169727]
We propose a novel model for jointly completing and aligning knowledge graphs.
The proposed model combines two components that jointly accomplish KG completion and alignment.
We also propose a structural inconsistency reduction mechanism to incorporate information from the completion into the alignment component.
arXiv Detail & Related papers (2022-10-17T10:25:10Z)
- Reasoning over Multi-view Knowledge Graphs [59.99051368907095]
ROMA is a novel framework for answering logical queries over multi-view KGs.
It scales up to KGs of large sizes (e.g., millions of facts) and fine-granular views.
It generalizes to query structures and KG views that are unobserved during training.
arXiv Detail & Related papers (2022-09-27T21:32:20Z)
- Disconnected Emerging Knowledge Graph Oriented Inductive Link Prediction [0.0]
We propose a novel model entitled DEKG-ILP (Disconnected Emerging Knowledge Graph Oriented Inductive Link Prediction).
The module CLRM is developed to extract global relation-based semantic features that are shared between original KGs and DEKGs.
The module GSM is proposed to extract the local subgraph topological information around each link in KGs.
arXiv Detail & Related papers (2022-09-03T10:58:24Z)
- Explainable Sparse Knowledge Graph Completion via High-order Graph Reasoning Network [111.67744771462873]
This paper proposes a novel explainable model for sparse Knowledge Graphs (KGs).
It incorporates high-order reasoning into a graph convolutional network, namely HoGRN.
It can not only improve the generalization ability to mitigate the information insufficiency issue but also provide interpretability.
arXiv Detail & Related papers (2022-07-14T10:16:56Z)
- ExpressivE: A Spatio-Functional Embedding For Knowledge Graph Completion [78.8942067357231]
ExpressivE embeds pairs of entities as points and relations as hyper-parallelograms in the virtual triple space.
We show that ExpressivE is competitive with state-of-the-art KGEs and even significantly outperforms them on WN18RR.
arXiv Detail & Related papers (2022-06-08T23:34:39Z)
- Rethinking Graph Convolutional Networks in Knowledge Graph Completion [83.25075514036183]
Graph convolutional networks (GCNs) have been increasingly popular in knowledge graph completion (KGC).
In this paper, we build upon representative GCN-based KGC models and introduce variants to find which factor of GCNs is critical in KGC.
We propose a simple yet effective framework named LTE-KGE, which equips existing KGE models with linearly transformed entity embeddings.
arXiv Detail & Related papers (2022-02-08T11:36:18Z)
- Temporal Knowledge Graph Reasoning Based on Evolutional Representation Learning [59.004025528223025]
The key to predicting future facts is to thoroughly understand the historical facts.
A TKG is actually a sequence of KGs corresponding to different timestamps.
We propose a novel Recurrent Evolution network based on Graph Convolution Network (GCN).
arXiv Detail & Related papers (2021-04-21T05:12:21Z)
- Toward Subgraph-Guided Knowledge Graph Question Generation with Graph Neural Networks [53.58077686470096]
Knowledge graph (KG) question generation (QG) aims to generate natural language questions from KGs and target answers.
In this work, we focus on a more realistic setting where we aim to generate questions from a KG subgraph and target answers.
arXiv Detail & Related papers (2020-04-13T15:43:22Z)
This list is automatically generated from the titles and abstracts of the papers in this site.