Rot-Pro: Modeling Transitivity by Projection in Knowledge Graph
Embedding
- URL: http://arxiv.org/abs/2110.14450v1
- Date: Wed, 27 Oct 2021 14:13:40 GMT
- Title: Rot-Pro: Modeling Transitivity by Projection in Knowledge Graph
Embedding
- Authors: Tengwei Song, Jie Luo, Lei Huang
- Abstract summary: Knowledge graph embedding models learn the representations of entities and relations in the knowledge graphs for predicting missing links (relations) between entities.
We show that transitivity, a very common relation pattern, is still not fully supported by existing models.
We propose the Rot-Pro model which combines the projection and rotation together.
Experimental results show that the proposed Rot-Pro model effectively learns the transitivity pattern and achieves state-of-the-art results on the link prediction task.
- Score: 4.9271170227460255
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Knowledge graph embedding models learn the representations of entities and
relations in the knowledge graphs for predicting missing links (relations)
between entities. Their effectiveness is deeply affected by their ability to
model and infer different relation patterns such as symmetry, asymmetry,
inversion, composition, and transitivity. Although existing models can already
model many of these relation patterns, transitivity, a very common relation
pattern, is still not fully supported. In this paper, we first
theoretically show that the transitive relations can be modeled with
projections. We then propose the Rot-Pro model which combines the projection
and relational rotation together. We prove that Rot-Pro can infer all the above
relation patterns. Experimental results show that the proposed Rot-Pro model
effectively learns the transitivity pattern and achieves state-of-the-art
results on the link prediction task on datasets containing transitive
relations.
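The abstract's core idea, composing an idempotent projection with a relational rotation so that repeated applications of a transitive relation collapse into a single step, can be sketched as follows. This is a minimal NumPy illustration based only on the abstract; the `project`/`rotate` functions and the 0/1 `mask` parameterization are expository simplifications, not the paper's exact formulation.

```python
import numpy as np

def project(x, mask):
    """Idempotent element-wise projection: keeps some coordinates, zeroes
    the rest. Applying it twice gives the same result, which is what lets
    projections absorb repeated relation hops and model transitivity."""
    return x * mask

def rotate(x, theta):
    """Element-wise rotation of complex-valued embeddings (RotatE-style)."""
    return x * np.exp(1j * theta)

def score(head, tail, theta, mask):
    """Distance-based score: closer to 0 means a more plausible triple."""
    return -np.linalg.norm(rotate(project(head, mask), theta) - project(tail, mask))

rng = np.random.default_rng(0)
d = 8
h = rng.normal(size=d) + 1j * rng.normal(size=d)
mask = (rng.random(d) > 0.5).astype(float)   # 0/1 entries, so mask * mask == mask
theta = rng.uniform(0, 2 * np.pi, size=d)
t = rotate(project(h, mask), theta)          # construct a tail satisfying the relation
print(score(h, t, theta, mask))              # ~0.0 for a satisfying triple
```

The key property exercised here is idempotence (`project(project(x)) == project(x)`): however many times a transitive relation is chained, the projection part contributes only once, which is the intuition behind the paper's claim that transitive relations can be modeled with projections.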
Related papers
- Relational Learning in Pre-Trained Models: A Theory from Hypergraph Recovery Perspective [60.64922606733441]
We introduce a mathematical model that formalizes relational learning as hypergraph recovery to study pre-training of Foundation Models (FMs).
In our framework, the world is represented as a hypergraph, with data abstracted as random samples from hyperedges. We theoretically examine the feasibility of a Pre-Trained Model (PTM) to recover this hypergraph and analyze the data efficiency in a minimax near-optimal style.
arXiv Detail & Related papers (2024-06-17T06:20:39Z)
- Cyclic Directed Probabilistic Graphical Model: A Proposal Based on Structured Outcomes [0.0]
We describe a probabilistic graphical model - probabilistic relation network - that allows the direct capture of directional cyclic dependencies.
This model does not violate the probability axioms, and it supports learning from observed data.
Notably, it supports probabilistic inference, making it a prospective tool in data analysis and in expert and decision-making applications.
arXiv Detail & Related papers (2023-10-25T10:19:03Z)
- Learning Complete Topology-Aware Correlations Between Relations for Inductive Link Prediction [121.65152276851619]
We show that semantic correlations between relations are inherently edge-level and entity-independent.
We propose a novel subgraph-based method, namely TACO, to model Topology-Aware COrrelations between relations.
To further exploit the potential of RCN, we propose Complete Common Neighbor induced subgraph.
arXiv Detail & Related papers (2023-09-20T08:11:58Z)
- Link Prediction with Attention Applied on Multiple Knowledge Graph Embedding Models [7.620967781722715]
Knowledge graph embeddings map nodes into a vector space to predict new links, scoring them according to geometric criteria.
No single model can learn all patterns equally well.
In this paper, we combine the query representations from several models into a unified one to incorporate patterns that are independently captured by each model.
arXiv Detail & Related papers (2023-02-13T10:07:26Z)
- On Neural Architecture Inductive Biases for Relational Tasks [76.18938462270503]
We introduce a simple architecture based on similarity-distribution scores, which we name Compositional Network generalization (CoRelNet).
We find that simple architectural choices can outperform existing models in out-of-distribution generalizations.
arXiv Detail & Related papers (2022-06-09T16:24:01Z)
- A Simple yet Effective Relation Information Guided Approach for Few-Shot Relation Extraction [22.60428265210431]
Few-Shot Relation Extraction aims at predicting the relation for a pair of entities in a sentence by training with a few labelled examples in each relation.
Some recent works have introduced relation information to assist model learning based on Prototype Network.
We argue that relation information can be introduced more explicitly and effectively into the model.
arXiv Detail & Related papers (2022-05-19T13:03:01Z)
- STaR: Knowledge Graph Embedding by Scaling, Translation and Rotation [20.297699026433065]
The bilinear method is mainstream in Knowledge Graph Embedding (KGE), aiming to learn low-dimensional representations for entities and relations.
Previous works have mainly identified six important patterns, such as non-commutativity.
We propose a corresponding bilinear model, Scaling Translation and Rotation (STaR), consisting of the above two parts.
arXiv Detail & Related papers (2022-02-15T02:06:22Z)
- Unified Graph Structured Models for Video Understanding [93.72081456202672]
We propose a message passing graph neural network that explicitly models relational-temporal relations.
We show how our method is able to more effectively model relationships between relevant entities in the scene.
arXiv Detail & Related papers (2021-03-29T14:37:35Z)
- Learning Relation Prototype from Unlabeled Texts for Long-tail Relation Extraction [84.64435075778988]
We propose a general approach to learn relation prototypes from unlabeled texts.
We learn relation prototypes as an implicit factor between entities.
We conduct experiments on two publicly available datasets: New York Times and Google Distant Supervision.
arXiv Detail & Related papers (2020-11-27T06:21:12Z)
- Relation of the Relations: A New Paradigm of the Relation Extraction Problem [52.21210549224131]
We propose a new paradigm of Relation Extraction (RE) that considers as a whole the predictions of all relations in the same context.
We develop a data-driven approach that does not require hand-crafted rules but learns by itself the relation of relations (RoR) using Graph Neural Networks and a relation matrix transformer.
Experiments show that our model outperforms the state-of-the-art approaches by +1.12% on the ACE05 dataset and +2.55% on SemEval 2018 Task 7.2.
arXiv Detail & Related papers (2020-06-05T22:25:27Z)
- LineaRE: Simple but Powerful Knowledge Graph Embedding for Link Prediction [7.0294164380111015]
We propose a novel embedding model, namely LineaRE, which is capable of modeling four connectivity patterns and four mapping properties.
Experimental results on multiple widely used real-world datasets show that the proposed LineaRE model significantly outperforms existing state-of-the-art models for link prediction tasks.
arXiv Detail & Related papers (2020-04-21T14:19:43Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences.