Knowledge Graph Embeddings: A Comprehensive Survey on Capturing Relation Properties
- URL: http://arxiv.org/abs/2410.14733v1
- Date: Wed, 16 Oct 2024 08:54:52 GMT
- Title: Knowledge Graph Embeddings: A Comprehensive Survey on Capturing Relation Properties
- Authors: Guanglin Niu
- Abstract summary: Knowledge Graph Embedding (KGE) techniques play a pivotal role in transforming symbolic Knowledge Graphs into numerical representations.
This paper addresses the complex mapping properties inherent in relations, such as one-to-one, one-to-many, many-to-one, and many-to-many mappings.
We explore innovative ideas such as integrating multimodal information into KGE, enhancing relation pattern modeling with rules, and developing models to capture relation characteristics in dynamic KGE settings.
- Score: 5.651919225343915
- License:
- Abstract: Knowledge Graph Embedding (KGE) techniques play a pivotal role in transforming symbolic Knowledge Graphs (KGs) into numerical representations, thereby enhancing various deep learning models for knowledge-augmented applications. Unlike entities, relations in KGs are the carriers of semantic meaning, and their accurate modeling is crucial for the performance of KGE models. Firstly, we address the complex mapping properties inherent in relations, such as one-to-one, one-to-many, many-to-one, and many-to-many mappings. We provide a comprehensive summary of relation-aware mapping-based models, models that utilize specific representation spaces, tensor decomposition-based models, and neural network-based models. Next, focusing on capturing various relation patterns like symmetry, asymmetry, inversion, and composition, we review models that employ modified tensor decomposition, those based on modified relation-aware mappings, and those that leverage rotation operations. Subsequently, considering the implicit hierarchical relations among entities, we introduce models that incorporate auxiliary information, models based on hyperbolic spaces, and those that utilize the polar coordinate system. Finally, in response to more complex scenarios such as sparse and dynamic KGs, this paper discusses potential future research directions. We explore innovative ideas such as integrating multimodal information into KGE, enhancing relation pattern modeling with rules, and developing models to capture relation characteristics in dynamic KGE settings.
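To make the rotation-based view of relation patterns concrete, here is a minimal NumPy sketch in the spirit of rotation models such as RotatE; the score is the usual distance form and the function and variable names are ours, not code from any surveyed paper. It shows how composition, inversion, and symmetry reduce to simple constraints on the relation phases.

```python
import numpy as np

def rotate_score(h, r_phase, t):
    """RotatE-style score: rotate the head embedding h by the relation phases
    and compare with the tail t. h, t: complex vectors; r_phase: real phases."""
    r = np.exp(1j * r_phase)           # unit-modulus complex rotation
    return -np.linalg.norm(h * r - t)  # closer to 0 = more plausible triple

rng = np.random.default_rng(0)
d = 8
h = rng.normal(size=d) + 1j * rng.normal(size=d)

# Composition: if r3 is "r1 followed by r2", its phases are simply p1 + p2.
p1, p2 = rng.uniform(-np.pi, np.pi, size=(2, d))
p3 = p1 + p2
t = h * np.exp(1j * p1) * np.exp(1j * p2)
print(rotate_score(h, p3, t))          # ~0: (h, r3, t) holds exactly

# Inversion: the inverse relation negates the phases (complex conjugate).
t1 = h * np.exp(1j * p1)
print(rotate_score(t1, -p1, h))        # ~0: (t, r1^-1, h) holds

# Symmetry: a relation with phases in {0, pi} is its own inverse.
p_sym = rng.choice([0.0, np.pi], size=d)
t2 = h * np.exp(1j * p_sym)
print(rotate_score(t2, p_sym, h))      # ~0: (t, r, h) also holds
```

Because every relation is a unit-modulus rotation, inverse and composed relations come for free; richer mapping properties such as one-to-many and many-to-many typically require the relation-aware mappings or tensor-decomposition models also covered in the survey.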
Related papers
- Relational Learning in Pre-Trained Models: A Theory from Hypergraph Recovery Perspective [60.64922606733441]
We introduce a mathematical model that formalizes relational learning as hypergraph recovery to study pre-training of Foundation Models (FMs).
In our framework, the world is represented as a hypergraph, with data abstracted as random samples from hyperedges. We theoretically examine the feasibility of a Pre-Trained Model (PTM) to recover this hypergraph and analyze the data efficiency in a minimax near-optimal style.
arXiv Detail & Related papers (2024-06-17T06:20:39Z) - Block-Diagonal Orthogonal Relation and Matrix Entity for Knowledge Graph Embedding [5.463034010805521]
Knowledge Graph Embedding (KGE) aims to learn low-dimensional representations of entities and relations for predicting missing facts.
We introduce OrthogonalE, a novel KGE model employing matrices for entities and block-diagonal orthogonal matrices for relations.
The experimental results indicate that our new KGE model, OrthogonalE, is both general and flexible, significantly outperforming state-of-the-art KGE models.
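A hedged sketch of the idea described above: relations as block-diagonal orthogonal matrices assembled from 2x2 rotation blocks, and entities as small matrices. The distance-style score and all names below are our assumptions for illustration, not necessarily the exact OrthogonalE formulation.

```python
import numpy as np
from scipy.linalg import block_diag

def block_diag_orthogonal(thetas):
    """Build a block-diagonal orthogonal relation matrix from 2x2 rotation blocks."""
    blocks = [np.array([[np.cos(t), -np.sin(t)],
                        [np.sin(t),  np.cos(t)]]) for t in thetas]
    return block_diag(*blocks)

def score(head, rel, tail):
    # Distance-style score (assumed form): transform the head entity matrix by
    # the relation matrix and compare with the tail entity matrix.
    return -np.linalg.norm(rel @ head - tail)

rng = np.random.default_rng(1)
d, k = 6, 3                        # entity matrices are d x k, relation is d x d
head = rng.normal(size=(d, k))
rel = block_diag_orthogonal(rng.uniform(-np.pi, np.pi, size=d // 2))
tail = rel @ head                  # a triple that holds exactly under this relation

assert np.allclose(rel.T @ rel, np.eye(d))   # the relation matrix is orthogonal
print(score(head, rel, tail))                # ~0 for a true triple
```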
arXiv Detail & Related papers (2024-01-11T15:13:00Z) - A Comprehensive Study on Knowledge Graph Embedding over Relational Patterns Based on Rule Learning [49.09125100268454]
Knowledge Graph Embedding (KGE) has proven to be an effective approach to solving the Knowledge Graph Completion (KGC) task.
Relational patterns are an important factor in the performance of KGE models.
We introduce a training-free method to enhance KGE models' performance over various relational patterns.
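As a loose illustration of what a training-free, pattern-aware enhancement can look like (a generic sketch, not the cited paper's method), known relation patterns can be used to combine scores from an already-trained KGE model at inference time; `score_fn`, `inverse_of`, and `alpha` are hypothetical names and parameters.

```python
import numpy as np

def pattern_aware_score(score_fn, h, r, t, symmetric=False, inverse_of=None, alpha=0.5):
    """Blend a trained KGE score with scores implied by known relation patterns
    at inference time, without retraining (illustrative only)."""
    scores = [score_fn(h, r, t)]
    if symmetric:                     # pattern rule: (h, r, t) -> (t, r, h)
        scores.append(score_fn(t, r, h))
    if inverse_of is not None:        # pattern rule: (h, r, t) -> (t, r_inv, h)
        scores.append(score_fn(t, inverse_of, h))
    direct, implied = scores[0], scores[1:]
    return direct if not implied else (1 - alpha) * direct + alpha * float(np.mean(implied))

# Toy usage with a TransE-style score over dummy embeddings.
emb = {"h": np.array([0.1, 0.2]), "t": np.array([0.3, 0.1]),
       "r": np.array([0.2, -0.1]), "r_inv": np.array([-0.2, 0.1])}
transe = lambda h, r, t: -np.linalg.norm(emb[h] + emb[r] - emb[t])
print(pattern_aware_score(transe, "h", "r", "t", inverse_of="r_inv"))
```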
arXiv Detail & Related papers (2023-08-15T17:30:57Z) - TCL: Transformer-based Dynamic Graph Modelling via Contrastive Learning [87.38675639186405]
We propose a novel graph neural network approach, called TCL, which deals with the dynamically-evolving graph in a continuous-time fashion.
To the best of our knowledge, this is the first attempt to apply contrastive learning to representation learning on dynamic graphs.
arXiv Detail & Related papers (2021-05-17T15:33:25Z) - Unified Graph Structured Models for Video Understanding [93.72081456202672]
We propose a message passing graph neural network that explicitly models spatio-temporal relations.
We show how our method is able to more effectively model relationships between relevant entities in the scene.
arXiv Detail & Related papers (2021-03-29T14:37:35Z) - Improving the Reconstruction of Disentangled Representation Learners via Multi-Stage Modeling [54.94763543386523]
Current autoencoder-based disentangled representation learning methods achieve disentanglement by penalizing the (aggregate) posterior to encourage statistical independence of the latent factors.
We present a novel multi-stage modeling approach where the disentangled factors are first learned using a penalty-based disentangled representation learning method.
Then, the low-quality reconstruction is improved with another deep generative model that is trained to model the missing correlated latent variables.
arXiv Detail & Related papers (2020-10-25T18:51:15Z) - Knowledge Graph Embeddings in Geometric Algebras [14.269860621624392]
We introduce a novel geometric algebra-based KG embedding framework, GeomE.
Our framework subsumes several state-of-the-art KG embedding approaches and is advantageous in its ability to model various key relation patterns.
Experimental results on multiple benchmark knowledge graphs show that the proposed approach outperforms existing state-of-the-art models for link prediction.
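For intuition about the geometric-algebra machinery, below is a small sketch of the geometric product in the 2-D algebra Cl(2,0) on the basis (1, e1, e2, e12); the connection drawn to complex-valued models is illustrative, the names are ours, and GeomE's exact scoring function may differ.

```python
import numpy as np

def geometric_product(a, b):
    """Geometric product of multivectors in Cl(2,0), components ordered (1, e1, e2, e12)."""
    a0, a1, a2, a3 = a
    b0, b1, b2, b3 = b
    return np.stack([
        a0*b0 + a1*b1 + a2*b2 - a3*b3,   # scalar part
        a0*b1 + a1*b0 - a2*b3 + a3*b2,   # e1 part
        a0*b2 + a2*b0 + a1*b3 - a3*b1,   # e2 part
        a0*b3 + a3*b0 + a1*b2 - a2*b1,   # e12 part (bivector)
    ])

# The even-grade part (1, e12) multiplies exactly like complex numbers
# (e12 * e12 = -1), which is one way a geometric-algebra framework can subsume
# complex-valued embedding models such as ComplEx or RotatE.
x = np.array([3.0, 0.0, 0.0, 2.0])      # 3 + 2*e12  ~  3 + 2i
y = np.array([1.0, 0.0, 0.0, 4.0])      # 1 + 4*e12  ~  1 + 4i
print(geometric_product(x, y))           # [-5, 0, 0, 14]  ~  (3+2i)(1+4i) = -5 + 14i
```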
arXiv Detail & Related papers (2020-10-02T13:36:24Z) - AutoETER: Automated Entity Type Representation for Knowledge Graph Embedding [40.900070190077024]
We develop a novel Knowledge Graph Embedding (KGE) framework with Automated Entity TypE Representation (AutoETER).
Our approach can model and infer all the relation patterns and complex relations.
Experiments on four datasets demonstrate the superior performance of our model compared to state-of-the-art baselines on link prediction tasks.
arXiv Detail & Related papers (2020-09-25T04:27:35Z) - DensE: An Enhanced Non-commutative Representation for Knowledge Graph Embedding with Adaptive Semantic Hierarchy [4.607120217372668]
We develop a novel knowledge graph embedding method, named DensE, to provide an improved modeling scheme for the complex composition patterns of relations.
Our method decomposes each relation into an SO(3) group-based rotation operator and a scaling operator in three-dimensional (3-D) Euclidean space.
Experimental results on multiple benchmark knowledge graphs show that DensE outperforms the current state-of-the-art models for missing link prediction.
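A minimal sketch of the decomposition described above, assuming a distance-style score: each relation acts on 3-D entity components as an SO(3) rotation followed by a scaling. The scoring form and all names are our assumptions, not DensE's exact objective.

```python
import numpy as np
from scipy.spatial.transform import Rotation

def dense_style_score(h, axis, angle, scale, t):
    """Relation = SO(3) rotation (axis-angle) plus a scaling, applied to the
    3-D blocks of the head entity; distance to the tail gives the score."""
    R = Rotation.from_rotvec(angle * axis / np.linalg.norm(axis))
    return -np.linalg.norm(scale * R.apply(h) - t)

rng = np.random.default_rng(2)
h = rng.normal(size=(4, 3))            # entity embedding as 4 blocks of 3-D vectors
axis, angle, scale = rng.normal(size=3), 0.7, 1.3
t = scale * Rotation.from_rotvec(angle * axis / np.linalg.norm(axis)).apply(h)
print(dense_style_score(h, axis, angle, scale, t))   # ~0 for a true triple

# Non-commutativity: composing two rotations in different orders generally differs,
# which lets the model distinguish "r1 then r2" from "r2 then r1".
r1 = Rotation.from_rotvec([0.5, 0.0, 0.0])
r2 = Rotation.from_rotvec([0.0, 0.5, 0.0])
print(np.allclose((r1 * r2).apply(h), (r2 * r1).apply(h)))   # False
```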
arXiv Detail & Related papers (2020-08-11T06:45:50Z) - Tensor Graph Convolutional Networks for Multi-relational and Robust Learning [74.05478502080658]
This paper introduces a tensor-graph convolutional network (TGCN) for scalable semi-supervised learning (SSL) from data associated with a collection of graphs that are represented by a tensor.
The proposed architecture achieves markedly improved performance relative to standard GCNs, copes with state-of-the-art adversarial attacks, and leads to remarkable SSL performance over protein-to-protein interaction networks.
arXiv Detail & Related papers (2020-03-15T02:33:21Z)