Knowledge Graph Contrastive Learning Based on Relation-Symmetrical
Structure
- URL: http://arxiv.org/abs/2211.10738v4
- Date: Tue, 13 Jun 2023 06:05:24 GMT
- Title: Knowledge Graph Contrastive Learning Based on Relation-Symmetrical
Structure
- Authors: Ke Liang, Yue Liu, Sihang Zhou, Wenxuan Tu, Yi Wen, Xihong Yang,
Xiangjun Dong, Xinwang Liu
- Abstract summary: We propose a knowledge graph contrastive learning framework based on relation-symmetrical structure, KGE-SymCL.
Our framework mines symmetrical structure information in KGs to enhance the discriminative ability of KGE models.
- Score: 36.507635518425744
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Knowledge graph embedding (KGE) aims at learning powerful representations to
benefit various artificial intelligence applications. Meanwhile, contrastive
learning has been widely leveraged in graph learning as an effective mechanism
to enhance the discriminative capacity of the learned representations. However,
the complex structures of KG make it hard to construct appropriate contrastive
pairs. Only a few attempts have integrated contrastive learning strategies with
KGE. However, most of them rely on language models (e.g., BERT) for contrastive
pair construction instead of fully mining the information underlying the graph
structure, which hinders expressive ability. Surprisingly, we find that the
entities within a relational symmetrical structure are usually similar and
correlated. To this end, we propose a knowledge graph contrastive learning
framework based on relation-symmetrical structure, KGE-SymCL, which mines
symmetrical structure information in KGs to enhance the discriminative ability
of KGE models. Concretely, a plug-and-play approach is proposed by taking
entities in the relation-symmetrical positions as positive pairs. Besides, a
self-supervised alignment loss is designed to pull together positive pairs.
Experimental results on link prediction and entity classification datasets
demonstrate that our KGE-SymCL can be easily adopted to various KGE models for
performance improvements. Moreover, extensive experiments show that our model
could outperform other state-of-the-art baselines.
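The abstract's core idea (take entities in relation-symmetrical positions as positive pairs, then pull them together with a self-supervised alignment loss) can be sketched as follows. This is a minimal illustration, not the paper's implementation: the symmetry heuristic `find_symmetric_positives` and the InfoNCE-style alignment loss are assumptions made here for exposition.

```python
import numpy as np

def find_symmetric_positives(triples):
    # Hypothetical heuristic: if both (h, r, t) and (t, r, h) appear,
    # treat h and t as occupying relation-symmetrical positions.
    triple_set = set(triples)
    pairs = []
    for h, r, t in triples:
        if (t, r, h) in triple_set and h < t:
            pairs.append((h, t))
    return pairs

def alignment_loss(emb, pairs, temperature=0.5):
    # InfoNCE-style alignment: pull each positive pair together while
    # contrasting against all other entities in the embedding table.
    z = emb / np.linalg.norm(emb, axis=1, keepdims=True)
    sim = z @ z.T / temperature
    loss = 0.0
    for i, j in pairs:
        # Denominator excludes the trivial self-similarity term.
        denom = np.exp(sim[i]).sum() - np.exp(sim[i, i])
        loss += np.log(denom) - sim[i, j]
    return loss / max(len(pairs), 1)
```

Because the positives are derived purely from the graph structure, a loss of this shape can be bolted onto an existing KGE objective without a language model, matching the plug-and-play framing in the abstract.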
Related papers
- CL4KGE: A Curriculum Learning Method for Knowledge Graph Embedding [36.47838597326351]
We define a metric Z-counts to measure the difficulty of training each triple in knowledge graphs.
Based on this metric, we propose CL4KGE, an efficient Curriculum Learning based training strategy.
arXiv Detail & Related papers (2024-08-27T07:51:26Z) - Graph-level Protein Representation Learning by Structure Knowledge
Refinement [50.775264276189695]
This paper focuses on learning representation on the whole graph level in an unsupervised manner.
We propose a novel framework called Structure Knowledge Refinement (SKR) which uses data structure to determine the probability of whether a pair is positive or negative.
arXiv Detail & Related papers (2024-01-05T09:05:33Z) - A Comprehensive Study on Knowledge Graph Embedding over Relational
Patterns Based on Rule Learning [49.09125100268454]
Knowledge Graph Embedding (KGE) has proven to be an effective approach to solving the Knowledge Graph Completion (KGC) task.
Relational patterns are an important factor in the performance of KGE models.
We introduce a training-free method to enhance KGE models' performance over various relational patterns.
arXiv Detail & Related papers (2023-08-15T17:30:57Z) - Improving Knowledge Graph Entity Alignment with Graph Augmentation [11.1094009195297]
Entity alignment (EA) which links equivalent entities across different knowledge graphs (KGs) plays a crucial role in knowledge fusion.
In recent years, graph neural networks (GNNs) have been successfully applied in many embedding-based EA methods.
We propose graph augmentation to create two graph views for margin-based alignment learning and contrastive entity representation learning.
arXiv Detail & Related papers (2023-04-28T01:22:47Z) - LightCAKE: A Lightweight Framework for Context-Aware Knowledge Graph
Embedding [3.7497588668920048]
We propose a lightweight framework named LightCAKE for context-aware KGE.
LightCAKE uses an iterative aggregation strategy to integrate the context information in multi-hop into the entity/relation embeddings.
Experiments on public benchmarks demonstrate the efficiency and effectiveness of our framework.
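The LightCAKE blurb above mentions an iterative aggregation strategy that folds multi-hop context into entity embeddings. A rough sketch of such a scheme, assuming a simple mean-pooling aggregator (an assumption here, not LightCAKE's actual operator):

```python
import numpy as np

def aggregate_context(emb, neighbors, hops=2, alpha=0.5):
    # Each round mixes an entity's embedding with the mean of its
    # neighbors' embeddings; repeating for `hops` rounds propagates
    # context from multi-hop neighborhoods.
    out = emb.copy()
    for _ in range(hops):
        new = out.copy()
        for e, nbrs in neighbors.items():
            if nbrs:
                ctx = out[list(nbrs)].mean(axis=0)
                new[e] = (1 - alpha) * out[e] + alpha * ctx
        out = new
    return out
```

Because the aggregation is a fixed iteration rather than a learned multi-layer GNN, a scheme of this shape stays lightweight, which is the trade-off the framework's name suggests.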
arXiv Detail & Related papers (2021-02-22T08:23:22Z) - RelWalk: A Latent Variable Model Approach to Knowledge Graph Embedding [50.010601631982425]
This paper extends the random walk model (Arora et al., 2016a) of word embeddings to Knowledge Graph Embeddings (KGEs).
We derive a scoring function that evaluates the strength of a relation R between two entities h (head) and t (tail).
We propose a learning objective motivated by the theoretical analysis to learn KGEs from a given knowledge graph.
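The RelWalk blurb above describes a scoring function for a relation R between a head h and tail t, plus a training objective. RelWalk's actual latent-variable derivation is not given here; a minimal translational sketch in the TransE style (an assumption for illustration, not RelWalk's function) shows the general pattern such scoring functions follow:

```python
import numpy as np

def translational_score(h, r, t):
    # TransE-style scoring: a relation is modeled as a translation,
    # so h + r should land near t. Higher score = stronger relation.
    return -float(np.linalg.norm(h + r - t))

def margin_loss(pos_score, neg_score, margin=1.0):
    # Margin-based ranking: push the true triple's score above a
    # corrupted triple's score by at least `margin`.
    return max(0.0, margin - pos_score + neg_score)
```

A KGE is then learned by minimizing the margin loss over observed triples and their corruptions, which is the role the blurb's "learning objective" plays.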
arXiv Detail & Related papers (2021-01-25T13:31:29Z) - Motif Learning in Knowledge Graphs Using Trajectories Of Differential
Equations [14.279419014064047]
Knowledge Graph Embeddings (KGEs) have shown promising performance on link prediction tasks.
Many KGEs use flat geometry, which renders them incapable of preserving complex structures.
We propose a neuro-differential KGE that embeds nodes of a KG on the trajectories of Ordinary Differential Equations (ODEs).
arXiv Detail & Related papers (2020-10-13T20:53:17Z) - Tensor Graph Convolutional Networks for Multi-relational and Robust
Learning [74.05478502080658]
This paper introduces a tensor-graph convolutional network (TGCN) for scalable semi-supervised learning (SSL) from data associated with a collection of graphs, that are represented by a tensor.
The proposed architecture achieves markedly improved performance relative to standard GCNs, copes with state-of-the-art adversarial attacks, and leads to remarkable SSL performance over protein-to-protein interaction networks.
arXiv Detail & Related papers (2020-03-15T02:33:21Z) - Embedding Graph Auto-Encoder for Graph Clustering [90.8576971748142]
Graph auto-encoder (GAE) models are based on semi-supervised graph convolution networks (GCNs).
We design a specific GAE-based model for graph clustering to be consistent with the theory, namely Embedding Graph Auto-Encoder (EGAE).
EGAE consists of one encoder and dual decoders.
arXiv Detail & Related papers (2020-02-20T09:53:28Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences arising from its use.