Inductively Representing Out-of-Knowledge-Graph Entities by Optimal
Estimation Under Translational Assumptions
- URL: http://arxiv.org/abs/2009.12765v1
- Date: Sun, 27 Sep 2020 07:12:18 GMT
- Title: Inductively Representing Out-of-Knowledge-Graph Entities by Optimal
Estimation Under Translational Assumptions
- Authors: Damai Dai, Hua Zheng, Fuli Luo, Pengcheng Yang, Baobao Chang, Zhifang
Sui
- Abstract summary: We propose a simple and effective method that inductively represents OOKG entities by their optimal estimation under translational assumptions.
Experimental results show that our method outperforms the state-of-the-art methods with higher efficiency on two KGC tasks with OOKG entities.
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Conventional Knowledge Graph Completion (KGC) assumes that all test entities
appear during training. However, in real-world scenarios, Knowledge Graphs (KG)
evolve fast with out-of-knowledge-graph (OOKG) entities added frequently, and
we need to represent these entities efficiently. Most existing Knowledge Graph
Embedding (KGE) methods cannot represent OOKG entities without costly
retraining on the whole KG. To enhance efficiency, we propose a simple and
effective method that inductively represents OOKG entities by their optimal
estimation under translational assumptions. Given pretrained embeddings of the
in-knowledge-graph (IKG) entities, our method needs no additional learning.
Experimental results show that our method outperforms the state-of-the-art
methods with higher efficiency on two KGC tasks with OOKG entities.
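The idea of "optimal estimation under translational assumptions" can be sketched concretely: under a TransE-style model, embeddings satisfy h + r ≈ t, so an OOKG entity linked to IKG entities by auxiliary triples can be estimated in closed form (the least-squares estimate is the mean over per-triple estimates), with no additional training. The sketch below is illustrative only, with made-up entity/relation names and toy 2-d embeddings; the paper's exact estimator may differ.

```python
import numpy as np

# Hypothetical pretrained TransE-style embeddings for in-KG (IKG)
# entities and relations (toy 2-d vectors for illustration).
entity_emb = {"e1": np.array([1.0, 2.0]), "e2": np.array([3.0, 0.0])}
relation_emb = {"r1": np.array([0.5, -0.5])}

def estimate_ookg_embedding(aux_triples, entity_emb, relation_emb):
    """Estimate an OOKG entity's embedding from auxiliary triples that
    connect it to IKG entities, assuming the translational relation
    h + r ~ t.  Each triple yields one estimate:
      - OOKG entity as head of (?, r, t): estimate is t - r
      - OOKG entity as tail of (h, r, ?): estimate is h + r
    Averaging the estimates gives the least-squares solution.
    A `None` slot marks where the OOKG entity sits in each triple.
    """
    estimates = []
    for h, r, t in aux_triples:
        if h is None:       # OOKG entity is the head
            estimates.append(entity_emb[t] - relation_emb[r])
        elif t is None:     # OOKG entity is the tail
            estimates.append(entity_emb[h] + relation_emb[r])
    return np.mean(estimates, axis=0)

# New entity seen only at test time, linked by two auxiliary triples:
# it is the head of (?, r1, e1) and the tail of (e2, r1, ?).
aux = [(None, "r1", "e1"), ("e2", "r1", None)]
new_emb = estimate_ookg_embedding(aux, entity_emb, relation_emb)
# Per-triple estimates are [0.5, 2.5] and [3.5, -0.5]; mean is [2.0, 1.0].
```

Because the estimate is a closed-form average over the neighbors' embeddings, representing a new entity costs one pass over its auxiliary triples, which is the source of the efficiency gain the abstract claims over full retraining.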
Related papers
- One Subgraph for All: Efficient Reasoning on Opening Subgraphs for Inductive Knowledge Graph Completion [12.644979036930383]
Knowledge Graph Completion (KGC) has garnered massive research interest recently.
Most existing methods are designed following a transductive setting where all entities are observed during training.
Inductive KGC, which aims to deduce missing links among unseen entities, has become a new trend.
arXiv Detail & Related papers (2024-04-24T11:12:08Z) - Contextualization Distillation from Large Language Model for Knowledge
Graph Completion [51.126166442122546]
We introduce the Contextualization Distillation strategy, a plug-and-play approach compatible with both discriminative and generative KGC frameworks.
Our method begins by instructing large language models to transform compact, structural triplets into context-rich segments.
Comprehensive evaluations across diverse datasets and KGC techniques highlight the efficacy and adaptability of our approach.
arXiv Detail & Related papers (2024-01-28T08:56:49Z) - Unifying Structure and Language Semantic for Efficient Contrastive
Knowledge Graph Completion with Structured Entity Anchors [0.3913403111891026]
The goal of knowledge graph completion (KGC) is to predict missing links in a KG using trained facts that are already known.
We propose a novel method to effectively unify structure information and language semantics without losing the power of inductive reasoning.
arXiv Detail & Related papers (2023-11-07T11:17:55Z) - SimTeG: A Frustratingly Simple Approach Improves Textual Graph Learning [131.04781590452308]
We present SimTeG, a frustratingly Simple approach for Textual Graph learning.
We first perform supervised parameter-efficient fine-tuning (PEFT) on a pre-trained LM on the downstream task.
We then generate node embeddings using the last hidden states of the fine-tuned LM.
arXiv Detail & Related papers (2023-08-03T07:00:04Z) - Text-Augmented Open Knowledge Graph Completion via Pre-Trained Language
Models [53.09723678623779]
We propose TAGREAL to automatically generate quality query prompts and retrieve support information from large text corpora.
The results show that TAGREAL achieves state-of-the-art performance on two benchmark datasets.
We find that TAGREAL has superb performance even with limited training data, outperforming existing embedding-based, graph-based, and PLM-based methods.
arXiv Detail & Related papers (2023-05-24T22:09:35Z) - Explainable Sparse Knowledge Graph Completion via High-order Graph
Reasoning Network [111.67744771462873]
This paper proposes a novel explainable model for sparse Knowledge Graphs (KGs).
It integrates high-order reasoning into a graph convolutional network, namely HoGRN.
It can not only improve the generalization ability to mitigate the information insufficiency issue but also provide interpretability.
arXiv Detail & Related papers (2022-07-14T10:16:56Z) - Highly Efficient Knowledge Graph Embedding Learning with Orthogonal
Procrustes Analysis [10.154836127889487]
Knowledge Graph Embeddings (KGEs) have been intensively explored in recent years due to their promise for a wide range of applications.
This paper proposes a simple yet effective KGE framework which can reduce the training time and carbon footprint by orders of magnitude.
arXiv Detail & Related papers (2021-04-10T03:55:45Z) - Inductive Learning on Commonsense Knowledge Graph Completion [89.72388313527296]
Commonsense knowledge graph (CKG) is a special type of knowledge graph (KG) where entities are composed of free-form text.
We propose to study the inductive learning setting for CKG completion where unseen entities may be present at test time.
InductivE significantly outperforms state-of-the-art baselines in both standard and inductive settings on ATOMIC and ConceptNet benchmarks.
arXiv Detail & Related papers (2020-09-19T16:10:26Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.