Separate-and-Aggregate: A Transformer-based Patch Refinement Model for
Knowledge Graph Completion
- URL: http://arxiv.org/abs/2307.05627v1
- Date: Tue, 11 Jul 2023 06:27:13 GMT
- Title: Separate-and-Aggregate: A Transformer-based Patch Refinement Model for
Knowledge Graph Completion
- Authors: Chen Chen, Yufei Wang, Yang Zhang, Quan Z. Sheng, and Kwok-Yan Lam
- Abstract summary: We propose a novel Transformer-based Patch Refinement Model (PatReFormer) for Knowledge Graph completion.
We conduct experiments on four popular KGC benchmarks, WN18RR, FB15k-237, YAGO37 and DB100K.
The experimental results show significant performance improvements over existing KGC methods on standard KGC evaluation metrics.
- Score: 28.79628925695775
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Knowledge graph completion (KGC) is the task of inferring missing
facts from a given knowledge graph (KG). Previous KGC methods typically represent
knowledge graph entities and relations as trainable continuous embeddings and
fuse the embeddings of the entity $h$ (or $t$) and relation $r$ into hidden
representations of query $(h, r, ?)$ (or $(?, r, t)$) to approximate the
missing entities. To achieve this, they either use shallow linear
transformations or deep convolutional modules. However, the linear
transformations suffer from the expressiveness issue while the deep
convolutional modules introduce unnecessary inductive bias, which could
potentially degrade the model performance. Thus, we propose a novel
Transformer-based Patch Refinement Model (PatReFormer) for KGC. PatReFormer
first segments the embedding into a sequence of patches and then employs
cross-attention modules to allow bi-directional embedding feature interaction
between the entities and relations, leading to a better understanding of the
underlying KG. We conduct experiments on four popular KGC benchmarks, WN18RR,
FB15k-237, YAGO37 and DB100K. The experimental results show significant
performance improvements over existing KGC methods on standard KGC evaluation
metrics, e.g., MRR and H@n. Our analysis first verifies the effectiveness of
our model design choices in PatReFormer. We then find that PatReFormer can
better capture KG information with large relation embedding dimensions.
Finally, we demonstrate that the strength of PatReFormer lies in handling
complex relation types, compared to other KGC models.
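To make the patch-refinement idea above concrete, here is a minimal PyTorch sketch of segmenting entity and relation embeddings into patch sequences and refining them with bi-directional cross-attention. All module names, dimensions, and the final fusion step are illustrative assumptions, not the authors' implementation.

```python
# Minimal sketch of patch segmentation + bi-directional cross-attention.
# Dimensions, projections, and the fusion step are assumptions.
import torch
import torch.nn as nn

class PatchCrossAttention(nn.Module):
    def __init__(self, embed_dim=400, num_patches=8, attn_dim=64, num_heads=4):
        super().__init__()
        assert embed_dim % num_patches == 0
        self.num_patches = num_patches
        self.patch_dim = embed_dim // num_patches
        self.proj = nn.Linear(self.patch_dim, attn_dim)
        # Cross-attention in both directions gives the "bi-directional
        # embedding feature interaction" described in the abstract.
        self.ent_to_rel = nn.MultiheadAttention(attn_dim, num_heads, batch_first=True)
        self.rel_to_ent = nn.MultiheadAttention(attn_dim, num_heads, batch_first=True)

    def segment(self, emb):
        # (batch, embed_dim) -> (batch, num_patches, patch_dim)
        return emb.view(-1, self.num_patches, self.patch_dim)

    def forward(self, ent_emb, rel_emb):
        e = self.proj(self.segment(ent_emb))  # entity patch sequence
        r = self.proj(self.segment(rel_emb))  # relation patch sequence
        e_ref, _ = self.ent_to_rel(e, r, r)   # entity patches attend to relation
        r_ref, _ = self.rel_to_ent(r, e, e)   # relation patches attend to entity
        # Fuse refined patches into one hidden representation for (h, r, ?).
        return torch.cat([e_ref, r_ref], dim=1).flatten(1)

model = PatchCrossAttention()
h, r = torch.randn(2, 400), torch.randn(2, 400)
print(model(h, r).shape)  # torch.Size([2, 1024]) = 2 * 8 patches * 64 dims
```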
Related papers
- Knowledge Graph Embeddings: A Comprehensive Survey on Capturing Relation Properties [5.651919225343915]
Knowledge Graph Embedding (KGE) techniques play a pivotal role in transforming symbolic Knowledge Graphs into numerical representations.
This paper addresses the complex mapping properties inherent in relations, such as one-to-one, one-to-many, many-to-one, and many-to-many mappings.
We explore innovative ideas such as integrating multimodal information into KGE, enhancing relation pattern modeling with rules, and developing models to capture relation characteristics in dynamic KGE settings.
arXiv Detail & Related papers (2024-10-16T08:54:52Z)
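As a toy illustration of the relation mapping properties discussed in the survey above, the following hypothetical helper buckets each relation as 1-1, 1-N, N-1, or N-N using the common average-degree heuristic; the threshold and naming are our assumptions, not the survey's method.

```python
# Hypothetical helper: classify relations by mapping property using
# average heads-per-tail and tails-per-head (threshold is an assumption).
from collections import defaultdict

def mapping_properties(triples, threshold=1.5):
    heads, tails = defaultdict(set), defaultdict(set)
    for h, r, t in triples:
        heads[(r, t)].add(h)  # distinct heads for each (relation, tail)
        tails[(r, h)].add(t)  # distinct tails for each (relation, head)
    kinds = {}
    for r in {rel for _, rel, _ in triples}:
        hpt = [len(s) for (rr, _), s in heads.items() if rr == r]
        tph = [len(s) for (rr, _), s in tails.items() if rr == r]
        head_card = "1" if sum(hpt) / len(hpt) < threshold else "N"
        tail_card = "1" if sum(tph) / len(tph) < threshold else "N"
        kinds[r] = f"{head_card}-{tail_card}"
    return kinds

triples = [("paris", "capital_of", "france"),
           ("bob", "born_in", "paris"), ("carol", "born_in", "paris")]
print(mapping_properties(triples))  # capital_of -> '1-1', born_in -> 'N-1'
```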
- Complex Logical Query Answering by Calibrating Knowledge Graph Completion Models [7.051174443949839]
A complex logical query answering task involves finding answer entities for complex logical queries over incomplete knowledge graphs.
Previous research has explored the use of pre-trained knowledge graph completion (KGC) models, which can predict the missing facts in KGs.
We propose a method for calibrating KGC models, namely CKGC, which enables KGC models to adapt to answering complex logical queries.
arXiv Detail & Related papers (2024-09-30T06:51:50Z)
- Contextualization Distillation from Large Language Model for Knowledge Graph Completion [51.126166442122546]
We introduce the Contextualization Distillation strategy, a plug-in-and-play approach compatible with both discriminative and generative KGC frameworks.
Our method begins by instructing large language models to transform compact, structural triplets into context-rich segments.
Comprehensive evaluations across diverse datasets and KGC techniques highlight the efficacy and adaptability of our approach.
arXiv Detail & Related papers (2024-01-28T08:56:49Z)
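A toy illustration of the distillation step summarized above: verbalize a compact triple and ask an LLM for a context-rich passage. The prompt wording and the `llm` callable are hypothetical placeholders, not the paper's actual prompts.

```python
# Hypothetical sketch of turning a structural triple into a context-rich
# passage via an LLM; the prompt and the llm() callable are placeholders.
def contextualize(triple, llm):
    h, r, t = triple
    prompt = (f"Write a short, factual paragraph that explains the triple "
              f"({h}, {r}, {t}) in natural language with extra context.")
    return llm(prompt)  # the generated passage is used to train the KGC model

# Example with a stub standing in for a real LLM client:
passage = contextualize(("alan_turing", "born_in", "london"), lambda p: "...")
```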
- A Comprehensive Study on Knowledge Graph Embedding over Relational Patterns Based on Rule Learning [49.09125100268454]
Knowledge Graph Embedding (KGE) has proven to be an effective approach to the Knowledge Graph Completion (KGC) task.
Relational patterns are an important factor in the performance of KGE models.
We introduce a training-free method to enhance KGE models' performance over various relational patterns.
arXiv Detail & Related papers (2023-08-15T17:30:57Z)
- Pre-training Transformers for Knowledge Graph Completion [81.4078733132239]
We introduce a novel inductive KG representation model (iHT) for learning transferable representations of knowledge graphs.
iHT consists of an entity encoder (e.g., BERT) and a neighbor-aware relational scoring function, both parameterized by Transformers.
Our approach achieves new state-of-the-art results on matched evaluations, with a relative improvement of more than 25% in mean reciprocal rank over previous SOTA models.
arXiv Detail & Related papers (2023-03-28T02:10:37Z)
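A rough sketch of the two-part design summarized above, assuming a pretrained text encoder (e.g., BERT) has already produced the head-entity vector and a small Transformer acts as the relational scorer; the pooling, sizes, and wiring are our guesses, not the paper's architecture.

```python
# Rough sketch: encoded entity vector + Transformer-based relational scorer.
# All architectural choices here are assumptions for illustration.
import torch
import torch.nn as nn

class RelationalScorer(nn.Module):
    def __init__(self, dim=768, num_relations=237):
        super().__init__()
        self.rel = nn.Embedding(num_relations, dim)
        layer = nn.TransformerEncoderLayer(dim, nhead=8, batch_first=True)
        self.mixer = nn.TransformerEncoder(layer, num_layers=2)

    def forward(self, ent_vec, rel_id, cand_embs):
        # Mix the encoded head entity with its relation embedding...
        seq = torch.stack([ent_vec, self.rel(rel_id)], dim=1)
        query = self.mixer(seq).mean(dim=1)
        # ...then score every candidate tail entity by dot product.
        return query @ cand_embs.T

scorer = RelationalScorer()
scores = scorer(torch.randn(2, 768), torch.tensor([3, 7]), torch.randn(50, 768))
print(scores.shape)  # torch.Size([2, 50]): one score per candidate tail
```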
- KGxBoard: Explainable and Interactive Leaderboard for Evaluation of Knowledge Graph Completion Models [76.01814380927507]
KGxBoard is an interactive framework for performing fine-grained evaluation on meaningful subsets of the data.
In our experiments, we highlight findings uncovered with KGxBoard that would have been impossible to detect with standard averaged single-score metrics.
arXiv Detail & Related papers (2022-08-23T15:11:45Z)
- Explainable Sparse Knowledge Graph Completion via High-order Graph Reasoning Network [111.67744771462873]
This paper proposes a novel explainable model for sparse Knowledge Graphs (KGs).
It incorporates high-order reasoning into a graph convolutional network, namely HoGRN.
It can not only improve the generalization ability to mitigate the information insufficiency issue but also provide interpretability.
arXiv Detail & Related papers (2022-07-14T10:16:56Z)
- Sequence-to-Sequence Knowledge Graph Completion and Question Answering [8.207403859762044]
We show that an off-the-shelf encoder-decoder Transformer model can serve as a scalable and versatile KGE model.
We achieve this by posing KG link prediction as a sequence-to-sequence task and exchanging the triple scoring approach taken by prior KGE methods for autoregressive decoding.
arXiv Detail & Related papers (2022-03-19T13:01:49Z)
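The following sketch shows link prediction posed as sequence-to-sequence generation, as described above; the checkpoint (`t5-small`) and the query verbalization format are our assumptions, not the paper's setup.

```python
# Sketch: verbalize the query (h, r, ?) and decode the tail entity
# autoregressively. Checkpoint and prompt format are assumptions.
from transformers import T5ForConditionalGeneration, T5Tokenizer

tok = T5Tokenizer.from_pretrained("t5-small")
model = T5ForConditionalGeneration.from_pretrained("t5-small")

query = "predict tail: barack obama | born in"   # (h, r, ?) as text
ids = tok(query, return_tensors="pt").input_ids
out = model.generate(ids, max_new_tokens=8)      # autoregressive decoding
print(tok.decode(out[0], skip_special_tokens=True))
```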
- Rethinking Graph Convolutional Networks in Knowledge Graph Completion [83.25075514036183]
Graph convolutional networks (GCNs) have become increasingly popular in knowledge graph completion (KGC).
In this paper, we build upon representative GCN-based KGC models and introduce variants to find which factor of GCNs is critical in KGC.
We propose a simple yet effective framework named LTE-KGE, which equips existing KGE models with linearly transformed entity embeddings.
arXiv Detail & Related papers (2022-02-08T11:36:18Z)
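A minimal sketch of the LTE idea summarized above: wrap an existing KGE scorer with a learned linear transformation of entity embeddings. DistMult is our choice of base scorer for illustration, and all sizes here are assumptions.

```python
# Sketch of LTE-KGE: linearly transform entity embeddings before feeding
# them to a standard KGE scorer (DistMult chosen here for illustration).
import torch
import torch.nn as nn

class LTEDistMult(nn.Module):
    def __init__(self, num_entities, num_relations, dim=200):
        super().__init__()
        self.ent = nn.Embedding(num_entities, dim)
        self.rel = nn.Embedding(num_relations, dim)
        self.lte = nn.Linear(dim, dim)  # the linear entity transformation

    def score(self, h, r, t):
        eh = self.lte(self.ent(h))      # transformed head embedding
        et = self.lte(self.ent(t))      # transformed tail embedding
        return (eh * self.rel(r) * et).sum(-1)  # DistMult triple score

model = LTEDistMult(num_entities=100, num_relations=10)
print(model.score(torch.tensor([0]), torch.tensor([1]), torch.tensor([2])))
```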
- Knowledge Generation -- Variational Bayes on Knowledge Graphs [0.685316573653194]
This thesis is a proof of concept for the potential of Variational Auto-Encoders (VAEs) for representing real-world Knowledge Graphs.
Inspired by successful approaches to graph generation, we evaluate the capabilities of our model, the Relational Graph Variational Auto-Encoder (RGVAE).
The RGVAE is first evaluated on link prediction; the mean reciprocal rank (MRR) scores on the FB15K-237 and WN18RR datasets are compared.
We investigate the latent space in a twofold experiment: first, linear interpolation between the latent representations of two triples, then the exploration of each latent dimension.
arXiv Detail & Related papers (2021-01-21T21:23:17Z)
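Several summaries above, as well as the main abstract, evaluate with mean reciprocal rank (MRR) and Hits@k (H@n). For reference, here is the standard computation of both metrics from the ranks assigned to the gold entities; this is a generic sketch, not any single paper's evaluation code.

```python
# Generic MRR and Hits@k from the ranks assigned to the gold entities.
def mrr_hits(ranks, k=10):
    mrr = sum(1.0 / r for r in ranks) / len(ranks)
    hits = sum(r <= k for r in ranks) / len(ranks)
    return mrr, hits

print(mrr_hits([1, 3, 12, 2]))  # (0.479..., 0.75): 3 of 4 gold ranks <= 10
```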