CL4KGE: A Curriculum Learning Method for Knowledge Graph Embedding
- URL: http://arxiv.org/abs/2408.14840v2
- Date: Mon, 9 Sep 2024 06:57:22 GMT
- Title: CL4KGE: A Curriculum Learning Method for Knowledge Graph Embedding
- Authors: Yang Liu, Chuan Zhou, Peng Zhang, Yanan Cao, Yongchao Liu, Zhao Li, Hongyang Chen
- Abstract summary: We define a metric Z-counts to measure the difficulty of training each triple in knowledge graphs.
Based on this metric, we propose CL4KGE, an efficient Curriculum Learning based training strategy.
- Score: 36.47838597326351
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Knowledge graph embedding (KGE) constitutes a foundational task, directed towards learning representations for entities and relations within knowledge graphs (KGs), with the objective of crafting representations comprehensive enough to approximate the logical and symbolic interconnections among entities. In this paper, we define a metric Z-counts to measure the difficulty of training each triple (head entity, relation, tail entity) in KGs with theoretical analysis. Based on this metric, we propose CL4KGE, an efficient Curriculum Learning based training strategy for KGE. This method includes a difficulty measurer and a training scheduler that aid in the training of KGE models. Our approach possesses the flexibility to act as a plugin within a wide range of KGE models, with the added advantage of adaptability to the majority of KGs in existence. The proposed method has been evaluated on popular KGE models, and the results demonstrate that it enhances state-of-the-art methods. The use of Z-counts as a metric has enabled the identification of challenging triples in KGs, which helps in devising effective training strategies.
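The abstract describes CL4KGE as a plug-in built from two parts: a difficulty measurer (Z-counts) and a training scheduler. The sketch below illustrates that structure only; the exact Z-counts definition is not given in the abstract, so the difficulty function here is a placeholder proxy, and the easy-to-hard schedule is a generic curriculum rather than the paper's scheduler.

```python
import random

def placeholder_difficulty(triple, adjacency):
    """Stand-in for Z-counts: fewer shared neighbors between head and tail
    is treated as harder. This is an illustrative proxy, not the paper's metric."""
    head, _, tail = triple
    support = len(adjacency.get(head, set()) & adjacency.get(tail, set()))
    return 1.0 / (1.0 + support)  # low support -> high difficulty

def curriculum_batches(triples, adjacency, num_stages=4, batch_size=128):
    """Yield mini-batches from easy to hard (a 'baby steps' style schedule)."""
    ranked = sorted(triples, key=lambda t: placeholder_difficulty(t, adjacency))
    stage_size = max(1, len(ranked) // num_stages)
    available = []
    for stage in range(num_stages):
        end = len(ranked) if stage == num_stages - 1 else (stage + 1) * stage_size
        available.extend(ranked[stage * stage_size:end])
        random.shuffle(available)
        for i in range(0, len(available), batch_size):
            yield available[i:i + batch_size]

# Usage sketch: feed the batches to any KGE model's training step, e.g.
# for batch in curriculum_batches(train_triples, adjacency):
#     model.train_step(batch)
```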
Related papers
- KG-FIT: Knowledge Graph Fine-Tuning Upon Open-World Knowledge [63.19837262782962]
Knowledge Graph Embedding (KGE) techniques are crucial in learning compact representations of entities and relations within a knowledge graph.
This study introduces KG-FIT, which builds a semantically coherent hierarchical structure of entity clusters.
Experiments on the benchmark datasets FB15K-237, YAGO3-10, and PrimeKG demonstrate the superiority of KG-FIT over state-of-the-art pre-trained language model-based methods.
arXiv Detail & Related papers (2024-05-26T03:04:26Z)
- KICGPT: Large Language Model with Knowledge in Context for Knowledge Graph Completion [27.405080941584533]
We propose KICGPT, a framework that integrates a large language model and a triple-based KGC retriever.
It alleviates the long-tail problem without incurring additional training overhead.
Empirical results on benchmark datasets demonstrate the effectiveness of KICGPT with smaller training overhead and no finetuning.
arXiv Detail & Related papers (2024-02-04T08:01:07Z)
- Contextualization Distillation from Large Language Model for Knowledge Graph Completion [51.126166442122546]
We introduce the Contextualization Distillation strategy, a plug-in-and-play approach compatible with both discriminative and generative KGC frameworks.
Our method begins by instructing large language models to transform compact, structural triplets into context-rich segments.
Comprehensive evaluations across diverse datasets and KGC techniques highlight the efficacy and adaptability of our approach.
arXiv Detail & Related papers (2024-01-28T08:56:49Z)
- Knowledge Graph Embedding: An Overview [42.16033541753744]
We make a comprehensive overview of the current state of research in Knowledge Graph completion.
We focus on two main branches of KG embedding (KGE) design: 1) distance-based methods and 2) semantic matching-based methods.
Next, we delve into CompoundE and CompoundE3D, which draw inspiration from 2D and 3D affine operations.
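For reference, the two design branches named in this overview are easiest to see through their standard scoring functions, sketched below: TransE as the classic distance-based score and DistMult as a semantic-matching (trilinear) score. CompoundE's 2D/3D affine compositions are not reproduced here.

```python
import numpy as np

def transe_score(h, r, t, p=1):
    """Distance-based score (TransE): plausible triples give small ||h + r - t||."""
    return -np.linalg.norm(h + r - t, ord=p)

def distmult_score(h, r, t):
    """Semantic-matching score (DistMult): trilinear product <h, r, t>."""
    return float(np.sum(h * r * t))

# Tiny usage example with random 50-dimensional embeddings.
rng = np.random.default_rng(0)
h, r, t = (rng.standard_normal(50) for _ in range(3))
print(transe_score(h, r, t), distmult_score(h, r, t))
```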
arXiv Detail & Related papers (2023-09-21T21:52:42Z)
- A Comprehensive Study on Knowledge Graph Embedding over Relational Patterns Based on Rule Learning [49.09125100268454]
Knowledge Graph Embedding (KGE) has proven to be an effective approach to solving the Knowledge Graph Completion (KGC) task.
Relational patterns are an important factor in the performance of KGE models.
We introduce a training-free method to enhance KGE models' performance over various relational patterns.
arXiv Detail & Related papers (2023-08-15T17:30:57Z)
- Schema First! Learn Versatile Knowledge Graph Embeddings by Capturing Semantics with MASCHInE [3.174882428337821]
Knowledge graph embedding models (KGEMs) have gained considerable traction in recent years.
In this work, we design protographs -- small, modified versions of a KG that leverage RDF/S information.
The learnt protograph-based embeddings are meant to encapsulate the semantics of a KG, and can be leveraged in learning KGEs that, in turn, also better capture semantics.
arXiv Detail & Related papers (2023-06-06T13:22:54Z)
- Knowledge Graph Contrastive Learning Based on Relation-Symmetrical Structure [36.507635518425744]
We propose a knowledge graph contrastive learning framework based on relation-symmetrical structure, KGE-SymCL.
Our framework mines symmetrical structure information in KGs to enhance the discriminative ability of KGE models.
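As a rough illustration of the contrastive part (the summary does not give the exact KGE-SymCL objective), the sketch below computes an InfoNCE-style loss in which each anchor entity is paired with a positive counterpart; treating entities linked through symmetrical relations as such pairs is an assumption made here for illustration only.

```python
import numpy as np

def info_nce_loss(anchors, positives, temperature=0.1):
    """InfoNCE-style contrastive loss over L2-normalized embeddings.

    anchors[i] and positives[i] form a positive pair (e.g. entities linked
    through a symmetrical relation -- an assumption, not the paper's rule);
    all other rows in the batch act as negatives.
    """
    a = anchors / np.linalg.norm(anchors, axis=1, keepdims=True)
    p = positives / np.linalg.norm(positives, axis=1, keepdims=True)
    logits = a @ p.T / temperature                  # pairwise similarities
    logits -= logits.max(axis=1, keepdims=True)     # numerical stability
    log_probs = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    return -np.mean(np.diag(log_probs))             # positives on the diagonal

# Usage sketch with random embeddings for 8 anchor/positive entity pairs.
rng = np.random.default_rng(1)
anchors, positives = rng.standard_normal((8, 32)), rng.standard_normal((8, 32))
print(info_nce_loss(anchors, positives))
```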
arXiv Detail & Related papers (2022-11-19T16:30:29Z)
- ExpressivE: A Spatio-Functional Embedding For Knowledge Graph Completion [78.8942067357231]
ExpressivE embeds pairs of entities as points and relations as hyper-parallelograms in the virtual triple space.
We show that ExpressivE is competitive with state-of-the-art KGEs and even significantly outperforms them on WN18RR.
arXiv Detail & Related papers (2022-06-08T23:34:39Z)
- RelWalk A Latent Variable Model Approach to Knowledge Graph Embedding [50.010601631982425]
This paper extends the random walk model (Arora et al., 2016a) of word embeddings to Knowledge Graph Embeddings (KGEs).
We derive a scoring function that evaluates the strength of a relation R between two entities h (head) and t (tail).
We propose a learning objective motivated by the theoretical analysis to learn KGEs from a given knowledge graph.
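The summary mentions a derived scoring function and a learning objective without stating their closed forms; the snippet below is only a generic stand-in showing how any triple-scoring function, including one like RelWalk's, can be trained with a margin-based objective over negative samples.

```python
import numpy as np

def margin_ranking_loss(score_fn, pos_triples, neg_triples, margin=1.0):
    """Generic margin-based KGE objective: true triples should outscore corrupted ones.

    `score_fn(h, r, t)` may be any triple-scoring function; RelWalk's derived
    score is not reproduced here, so a DistMult-style placeholder is used below.
    """
    losses = [
        max(0.0, margin - score_fn(*pos) + score_fn(*neg))
        for pos, neg in zip(pos_triples, neg_triples)
    ]
    return float(np.mean(losses))

# Usage sketch with random embeddings and a placeholder trilinear score.
rng = np.random.default_rng(2)
emb = {name: rng.standard_normal(16) for name in ["h", "r", "t", "t_neg"]}
score = lambda h, r, t: float((emb[h] * emb[r]) @ emb[t])  # illustration only
print(margin_ranking_loss(score, [("h", "r", "t")], [("h", "r", "t_neg")]))
```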
arXiv Detail & Related papers (2021-01-25T13:31:29Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of this list (including all information) and is not responsible for any consequences of its use.