LightCAKE: A Lightweight Framework for Context-Aware Knowledge Graph Embedding
- URL: http://arxiv.org/abs/2102.10826v1
- Date: Mon, 22 Feb 2021 08:23:22 GMT
- Title: LightCAKE: A Lightweight Framework for Context-Aware Knowledge Graph Embedding
- Authors: Zhiyuan Ning, Ziyue Qiao, Hao Dong, Yi Du, Yuanchun Zhou
- Abstract summary: We propose a lightweight framework named LightCAKE for context-aware KGE.
LightCAKE uses an iterative aggregation strategy to integrate multi-hop context information into the entity/relation embeddings.
Experiments on public benchmarks demonstrate the efficiency and effectiveness of our framework.
- Score: 3.7497588668920048
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: For knowledge graphs, knowledge graph embedding (KGE) models learn to project
the symbolic entities and relations into a low-dimensional continuous vector
space based on the observed triplets. However, existing KGE models cannot make a proper trade-off between graph context and model complexity, which leaves them far from satisfactory. In this paper, we propose a lightweight framework named LightCAKE for context-aware KGE. LightCAKE uses an iterative aggregation strategy to integrate multi-hop context information into the entity/relation embeddings, and it explicitly models the graph context without introducing any trainable parameters other than the embeddings themselves. Moreover,
extensive experiments on public benchmarks demonstrate the efficiency and
effectiveness of our framework.
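To make the idea of parameter-free, multi-hop context aggregation concrete, here is a minimal sketch in NumPy. It is not the authors' implementation: the DistMult-style composition, the softmax weighting, and all names are illustrative assumptions about how context could be folded into the embeddings without any extra trainable parameters.

import numpy as np

def softmax(x):
    x = x - x.max()
    e = np.exp(x)
    return e / e.sum()

def aggregate_context(ent_emb, rel_emb, neighbors, hops=2):
    # ent_emb: (n_ent, d) entity embeddings; rel_emb: (n_rel, d) relation embeddings.
    # neighbors: dict mapping entity id -> list of (relation id, neighbor entity id).
    # hops: number of aggregation iterations, i.e. how far context propagates.
    e = ent_emb.copy()
    for _ in range(hops):
        new_e = e.copy()
        for v, ctx in neighbors.items():
            if not ctx:
                continue
            # Compose each (relation, neighbor) pair using only the embeddings.
            msgs = np.stack([rel_emb[r] * e[u] for r, u in ctx])
            # Weight messages by their compatibility with the center entity;
            # nothing here is trainable beyond the embeddings themselves.
            weights = softmax(msgs @ e[v])
            new_e[v] = e[v] + weights @ msgs
        e = new_e
    return e

# Toy usage: 3 entities, 2 relations, embedding dimension 4.
rng = np.random.default_rng(0)
E = rng.normal(size=(3, 4))
R = rng.normal(size=(2, 4))
ctx = {0: [(0, 1), (1, 2)], 1: [(0, 0)], 2: []}
print(aggregate_context(E, R, ctx).shape)  # (3, 4)

Iterating the aggregation is what lets multi-hop neighbors influence an entity's final embedding, since each pass folds in the already context-enriched embeddings of its direct neighbors.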
Related papers
- Contextualization Distillation from Large Language Model for Knowledge Graph Completion [51.126166442122546]
We introduce the Contextualization Distillation strategy, a plug-and-play approach compatible with both discriminative and generative KGC frameworks.
Our method begins by instructing large language models to transform compact, structural triplets into context-rich segments.
Comprehensive evaluations across diverse datasets and KGC techniques highlight the efficacy and adaptability of our approach.
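As a rough illustration of that first step, the snippet below builds a prompt that asks an LLM to expand a bare triplet into a descriptive passage. The wording, the function name, and the example triplet are assumptions for illustration, not the prompt used in the paper.

def contextualize_prompt(head, relation, tail):
    # Ask an LLM to turn a compact, structural triplet into a context-rich segment.
    return (
        f"Given the knowledge graph triplet ({head}, {relation}, {tail}), "
        f"write a short factual paragraph describing how '{head}' is connected "
        f"to '{tail}' through the relation '{relation}'."
    )

print(contextualize_prompt("Marie Curie", "award_received", "Nobel Prize in Physics"))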
arXiv Detail & Related papers (2024-01-28T08:56:49Z)
- A Comprehensive Study on Knowledge Graph Embedding over Relational Patterns Based on Rule Learning [49.09125100268454]
Knowledge Graph Embedding (KGE) has proven to be an effective approach to solving the Knowledge Graph Completion (KGC) task.
Relational patterns are an important factor in the performance of KGE models.
We introduce a training-free method to enhance KGE models' performance over various relational patterns.
arXiv Detail & Related papers (2023-08-15T17:30:57Z)
- CausE: Towards Causal Knowledge Graph Embedding [13.016173217017597]
Knowledge graph embedding (KGE) focuses on representing the entities and relations of a knowledge graph (KG) in continuous vector spaces.
We build a new paradigm of KGE in the context of causality and embedding disentanglement.
We propose a Causality-enhanced knowledge graph Embedding (CausE) framework.
arXiv Detail & Related papers (2023-07-21T14:25:39Z)
- Improving Knowledge Graph Entity Alignment with Graph Augmentation [11.1094009195297]
Entity alignment (EA) which links equivalent entities across different knowledge graphs (KGs) plays a crucial role in knowledge fusion.
In recent years, graph neural networks (GNNs) have been successfully applied in many embedding-based EA methods.
We propose graph augmentation to create two graph views for margin-based alignment learning and contrastive entity representation learning.
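For intuition only, a two-view setup like this is often obtained by randomly perturbing the KG's edge set. The sketch below uses plain edge dropout and is an assumption about the general recipe, not the augmentation proposed in the paper.

import random

def edge_dropout(edges, keep_prob=0.8, seed=None):
    # Keep each triplet independently with probability keep_prob.
    rng = random.Random(seed)
    return [e for e in edges if rng.random() < keep_prob]

edges = [(0, "r1", 1), (1, "r2", 2), (2, "r1", 0), (0, "r2", 2)]
view_a = edge_dropout(edges, seed=1)  # first augmented view
view_b = edge_dropout(edges, seed=2)  # second augmented view
# The two perturbed views of the same KG can then drive margin-based alignment
# learning and contrastive entity representation learning.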
arXiv Detail & Related papers (2023-04-28T01:22:47Z)
- Knowledge Graph Contrastive Learning Based on Relation-Symmetrical Structure [36.507635518425744]
We propose a knowledge graph contrastive learning framework based on relation-symmetrical structure, KGE-SymCL.
Our framework mines symmetrical structure information in KGs to enhance the discriminative ability of KGE models.
arXiv Detail & Related papers (2022-11-19T16:30:29Z)
- KRACL: Contrastive Learning with Graph Context Modeling for Sparse Knowledge Graph Completion [37.92814873958519]
Knowledge Graph Embeddings (KGE) aim to map entities and relations to low-dimensional spaces and have become the de facto standard for knowledge graph completion.
Most existing KGE methods suffer from the sparsity challenge, where it is harder to predict entities that appear less frequently in knowledge graphs.
We propose a novel framework to alleviate the widespread sparsity in KGs with graph context and contrastive learning.
arXiv Detail & Related papers (2022-08-16T09:17:40Z)
- ExpressivE: A Spatio-Functional Embedding For Knowledge Graph Completion [78.8942067357231]
ExpressivE embeds pairs of entities as points and relations as hyper-parallelograms in the virtual triple space.
We show that ExpressivE is competitive with state-of-the-art KGEs and even significantly outperforms them on WN18RR.
arXiv Detail & Related papers (2022-06-08T23:34:39Z)
- RelWalk A Latent Variable Model Approach to Knowledge Graph Embedding [50.010601631982425]
This paper extends the random walk model (Arora et al., 2016a) of word embeddings to Knowledge Graph Embeddings (KGEs).
We derive a scoring function that evaluates the strength of a relation R between two entities h (head) and t (tail).
We propose a learning objective motivated by the theoretical analysis to learn KGEs from a given knowledge graph.
arXiv Detail & Related papers (2021-01-25T13:31:29Z)
- DisenE: Disentangling Knowledge Graph Embeddings [33.169388832519]
DisenE is an end-to-end framework to learn disentangled knowledge graph embeddings.
We introduce an attention-based mechanism that enables the model to explicitly focus on relevant components of entity embeddings according to a given relation.
arXiv Detail & Related papers (2020-10-28T03:45:19Z)
- SEEK: Segmented Embedding of Knowledge Graphs [77.5307592941209]
We propose a lightweight modeling framework that can achieve highly competitive relational expressiveness without increasing the model complexity.
Our framework focuses on the design of scoring functions and highlights two critical characteristics: 1) facilitating sufficient feature interactions; 2) preserving both symmetry and antisymmetry properties of relations.
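To see why the second characteristic matters, the snippet below contrasts a real-valued bilinear-diagonal score, which is necessarily symmetric in head and tail, with a complex-valued score that can also express antisymmetric relations. This is a generic illustration of the symmetry/antisymmetry issue, not SEEK's own scoring function.

import numpy as np

rng = np.random.default_rng(1)
d = 4
h, r, t = rng.normal(size=d), rng.normal(size=d), rng.normal(size=d)

def distmult(h, r, t):
    # Symmetric: swapping h and t cannot change the score.
    return np.sum(h * r * t)

hc = rng.normal(size=d) + 1j * rng.normal(size=d)
rc = rng.normal(size=d) + 1j * rng.normal(size=d)
tc = rng.normal(size=d) + 1j * rng.normal(size=d)

def complex_score(h, r, t):
    # Conjugating the tail breaks head/tail symmetry, so antisymmetric
    # relations become expressible.
    return np.real(np.sum(h * r * np.conj(t)))

print(np.isclose(distmult(h, r, t), distmult(t, r, h)))                  # True
print(np.isclose(complex_score(hc, rc, tc), complex_score(tc, rc, hc)))  # almost surely False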
arXiv Detail & Related papers (2020-05-02T15:15:50Z)
- Tensor Graph Convolutional Networks for Multi-relational and Robust Learning [74.05478502080658]
This paper introduces a tensor-graph convolutional network (TGCN) for scalable semi-supervised learning (SSL) from data associated with a collection of graphs that are represented by a tensor.
The proposed architecture achieves markedly improved performance relative to standard GCNs, copes with state-of-the-art adversarial attacks, and leads to remarkable SSL performance over protein-to-protein interaction networks.
arXiv Detail & Related papers (2020-03-15T02:33:21Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences.