TransEdge: Translating Relation-contextualized Embeddings for Knowledge
Graphs
- URL: http://arxiv.org/abs/2004.13579v1
- Date: Wed, 22 Apr 2020 03:00:45 GMT
- Title: TransEdge: Translating Relation-contextualized Embeddings for Knowledge
Graphs
- Authors: Zequn Sun, Jiacheng Huang, Wei Hu, Muhao Chen, Lingbing Guo, Yuzhong
Qu
- Abstract summary: Learning knowledge graph embeddings has received increasing attention in recent years.
Most embedding models in the literature interpret relations as linear or bilinear mapping functions to operate on entity embeddings.
We propose a novel edge-centric embedding model TransEdge, which contextualizes relation representations in terms of specific head-tail entity pairs.
- Score: 25.484805501929365
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Learning knowledge graph (KG) embeddings has received increasing attention in
recent years. Most embedding models in the literature interpret relations as linear
or bilinear mapping functions to operate on entity embeddings. However, we find
that such relation-level modeling cannot capture the diverse relational
structures of KGs well. In this paper, we propose a novel edge-centric
embedding model TransEdge, which contextualizes relation representations in
terms of specific head-tail entity pairs. We refer to such contextualized
representations of a relation as edge embeddings and interpret them as
translations between entity embeddings. TransEdge achieves promising
performance on different prediction tasks. Our experiments on benchmark
datasets indicate that it obtains the state-of-the-art results on
embedding-based entity alignment. We also show that TransEdge is complementary
to conventional entity alignment methods. Moreover, it shows very competitive
performance on link prediction.
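As a rough illustration of the edge-centric idea, the sketch below contextualizes a relation embedding with head/tail context vectors through a small compression network and treats the result as a translation between entity embeddings. The helper names, the two-layer perceptron, the dimensions, and the random parameters are illustrative assumptions, not the paper's exact formulation.
```python
import numpy as np

rng = np.random.default_rng(0)
dim = 64

def mlp(x, W1, b1, W2, b2):
    """Two-layer perceptron used here as a stand-in context-compression operator."""
    h = np.tanh(x @ W1 + b1)
    return np.tanh(h @ W2 + b2)

# Illustrative parameters (in practice these would be learned).
W1, b1 = rng.normal(size=(3 * dim, dim)), np.zeros(dim)
W2, b2 = rng.normal(size=(dim, dim)), np.zeros(dim)

def edge_embedding(h_ctx, t_ctx, r):
    """Contextualize relation r with head/tail context vectors (hypothetical compression)."""
    return mlp(np.concatenate([h_ctx, r, t_ctx]), W1, b1, W2, b2)

def transedge_score(h, t, h_ctx, t_ctx, r):
    """Edge-centric translation: the contextualized relation translates head to tail.
    Lower is better (smaller translation error)."""
    psi = edge_embedding(h_ctx, t_ctx, r)
    return np.linalg.norm(h + psi - t)

# Toy usage with random embeddings for one triple (h, r, t).
h, t, r = rng.normal(size=dim), rng.normal(size=dim), rng.normal(size=dim)
h_ctx, t_ctx = rng.normal(size=dim), rng.normal(size=dim)
print(transedge_score(h, t, h_ctx, t_ctx, r))
```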
Related papers
- Inference over Unseen Entities, Relations and Literals on Knowledge Graphs [1.7474352892977463]
Knowledge graph embedding models have been successfully applied in the transductive setting to tackle various challenging tasks.
We propose the attentive byte-pair encoding layer (BytE) to construct a triple embedding from a sequence of byte-pair encoded subword units of entities and relations.
BytE leads to massive feature reuse via weight tying, since it forces a knowledge graph embedding model to learn embeddings for subword units instead of entities and relations directly.
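A minimal sketch of the subword-composition idea behind BytE is shown below: entity and relation names are split into subword units, each unit has its own embedding, and the composed vectors are fed to an ordinary translational scorer. The greedy longest-match tokenizer, the toy vocabulary, and mean pooling are stand-ins; the actual layer uses learned byte-pair merges and attention.
```python
import numpy as np

rng = np.random.default_rng(0)
dim = 32

# Hypothetical subword vocabulary; a real BytE layer would use learned byte-pair merges.
subwords = ["Trans", "Edge", "knows", "born", "_in", "Berlin", "Ein", "stein"]
subword_emb = {s: rng.normal(size=dim) for s in subwords}

def tokenize(name):
    """Greedy longest-match split into known subword units (stand-in for BPE)."""
    units, i = [], 0
    while i < len(name):
        match = next((s for s in sorted(subwords, key=len, reverse=True)
                      if name.startswith(s, i)), None)
        if match is None:
            i += 1            # skip unknown characters in this toy example
        else:
            units.append(match)
            i += len(match)
    return units

def embed(name):
    """Compose an entity/relation embedding from its subword embeddings (weight tying)."""
    units = tokenize(name)
    return np.mean([subword_emb[u] for u in units], axis=0) if units else np.zeros(dim)

# The composed embeddings can then be scored by any base KGE model, e.g. a TransE-style score:
h, r, t = embed("Einstein"), embed("born_in"), embed("Berlin")
print(np.linalg.norm(h + r - t))
```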
arXiv Detail & Related papers (2024-10-09T10:20:54Z) - Location Sensitive Embedding for Knowledge Graph Reasoning [0.0]
A key challenge in translational distance models is their inability to effectively differentiate between 'head' and 'tail' entities in graphs.
To address this problem, a novel location-sensitive embedding (LSE) method has been developed.
LSE innovatively modifies the head entity using relation-specific mappings, conceptualizing relations as linear transformations rather than mere translations.
Experiments conducted on four large-scale KG datasets for link prediction show that LSEd either outperforms or is competitive with state-of-the-art related works.
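One plausible reading of the relation-as-linear-transformation idea is sketched below with random parameters; the specific map-plus-offset form and the L2 distance are assumptions, not the paper's exact score function.
```python
import numpy as np

rng = np.random.default_rng(1)
dim = 16

# Hypothetical relation-specific parameters: a linear map and an offset per relation.
M_r = rng.normal(size=(dim, dim)) / np.sqrt(dim)
b_r = rng.normal(size=dim)

def lse_score(h, t):
    """Transform the head with a relation-specific linear map (plus offset)
    and measure how close it lands to the tail; lower is better."""
    return np.linalg.norm(h @ M_r + b_r - t)

h, t = rng.normal(size=dim), rng.normal(size=dim)
print(lse_score(h, t))
```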
arXiv Detail & Related papers (2023-12-01T22:35:19Z) - Learning Complete Topology-Aware Correlations Between Relations for Inductive Link Prediction [121.65152276851619]
We show that semantic correlations between relations are inherently edge-level and entity-independent.
We propose a novel subgraph-based method, namely TACO, to model Topology-Aware COrrelations between relations.
To further exploit the potential of the relational correlation network (RCN) in TACO, we propose a Complete Common Neighbor induced subgraph.
arXiv Detail & Related papers (2023-09-20T08:11:58Z) - You Only Transfer What You Share: Intersection-Induced Graph Transfer
Learning for Link Prediction [79.15394378571132]
We investigate a previously overlooked phenomenon: in many cases, a densely connected, complementary graph can be found for the original graph.
The denser graph may share nodes with the original graph, which offers a natural bridge for transferring selective, meaningful knowledge.
We identify this setting as Graph Intersection-induced Transfer Learning (GITL), which is motivated by practical applications in e-commerce or academic co-authorship predictions.
arXiv Detail & Related papers (2023-02-27T22:56:06Z) - TransHER: Translating Knowledge Graph Embedding with Hyper-Ellipsoidal
Restriction [14.636054717485207]
We propose a novel score function TransHER for knowledge graph embedding.
Our model first maps entities onto two separate hyper-ellipsoids and then conducts a relation-specific translation on one of them.
Experimental results show that TransHER can achieve state-of-the-art performance and generalize to datasets in different domains and scales.
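The sketch below illustrates the hyper-ellipsoidal restriction in a hedged way: entities are rescaled onto separate hyper-ellipsoids and a relation-specific translation is applied on the head side. The diagonal ellipsoid parameterization and the L2 distance are simplifying assumptions, not the exact TransHER formulation.
```python
import numpy as np

rng = np.random.default_rng(2)
dim = 16

# Hypothetical ellipsoid axes for the head and tail spaces (learned in the actual model).
axes_head = np.abs(rng.normal(size=dim)) + 0.5
axes_tail = np.abs(rng.normal(size=dim)) + 0.5

def project_to_ellipsoid(x, axes):
    """Rescale x so it lies on the hyper-ellipsoid sum((x_i / a_i)^2) = 1."""
    return x / np.linalg.norm(x / axes)

def transher_score(h, r, t):
    """Restrict h and t to separate hyper-ellipsoids, then apply a
    relation-specific translation on the head side and compare with the tail."""
    h_e = project_to_ellipsoid(h, axes_head)
    t_e = project_to_ellipsoid(t, axes_tail)
    return np.linalg.norm(h_e + r - t_e)

h, r, t = rng.normal(size=dim), rng.normal(size=dim), rng.normal(size=dim)
print(transher_score(h, r, t))
```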
arXiv Detail & Related papers (2022-04-27T22:49:27Z) - TranS: Transition-based Knowledge Graph Embedding with Synthetic
Relation Representation [14.759663752868487]
We propose a novel transition-based method, TranS, for knowledge graph embedding.
The single relation vector used in traditional scoring patterns is replaced with a synthetic relation representation, which addresses the limitations of such single-vector modeling effectively and efficiently.
Experiments on a large knowledge graph dataset, ogbl-wikikg2, show that our model achieves state-of-the-art results.
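A hypothetical sketch of what replacing a single relation vector with a synthetic representation could look like; the additive combination of r with head/tail auxiliary vectors is an assumption for illustration, not the exact TranS score.
```python
import numpy as np

rng = np.random.default_rng(3)
dim = 16

def trans_score(h, t, r, r_h, r_t):
    """Hypothetical transition-style score: the single relation vector is replaced
    by a synthetic representation combining r with head/tail auxiliary vectors."""
    synthetic_r = r + r_h + r_t
    return np.linalg.norm(h + synthetic_r - t)

h, t = rng.normal(size=dim), rng.normal(size=dim)
r, r_h, r_t = rng.normal(size=dim), rng.normal(size=dim), rng.normal(size=dim)
print(trans_score(h, t, r, r_h, r_t))
```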
arXiv Detail & Related papers (2022-04-18T16:55:25Z) - Entailment Graph Learning with Textual Entailment and Soft Transitivity [69.91691115264132]
We propose a two-stage method, Entailment Graph with Textual Entailment and Transitivity (EGT2).
EGT2 learns local entailment relations by recognizing possible textual entailment between template sentences formed by CCG-parsed predicates.
Based on the generated local graph, EGT2 then uses three novel soft transitivity constraints to consider the logical transitivity in entailment structures.
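The soft-transitivity idea can be illustrated with a toy penalty: if predicate a entails b and b entails c with high scores, then a should also entail c. The product-based constraint below is only one illustrative form, not EGT2's actual constraints.
```python
import numpy as np

# Toy local entailment scores between predicates (row entails column), in [0, 1].
preds = ["buy", "own", "have"]
W = np.array([[1.0, 0.9, 0.2],
              [0.0, 1.0, 0.8],
              [0.1, 0.0, 1.0]])

def soft_transitivity_penalty(W):
    """Illustrative soft constraint: penalize violations of W[a, c] >= W[a, b] * W[b, c]
    over all triples of distinct predicates."""
    n = W.shape[0]
    penalty = 0.0
    for a in range(n):
        for b in range(n):
            for c in range(n):
                if len({a, b, c}) == 3:
                    penalty += max(0.0, W[a, b] * W[b, c] - W[a, c])
    return penalty

print(soft_transitivity_penalty(W))
```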
arXiv Detail & Related papers (2022-04-07T08:33:06Z) - Transformer-based Dual Relation Graph for Multi-label Image Recognition [56.12543717723385]
We propose a novel Transformer-based Dual Relation learning framework.
We explore two aspects of correlation, i.e., structural relation graph and semantic relation graph.
Our approach achieves new state-of-the-art on two popular multi-label recognition benchmarks.
arXiv Detail & Related papers (2021-10-10T07:14:52Z) - RatE: Relation-Adaptive Translating Embedding for Knowledge Graph
Completion [51.64061146389754]
We propose a relation-adaptive translation function built upon a novel weighted product in complex space.
We then present our Relation-adaptive translating Embedding (RatE) approach to score each graph triple.
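A hedged sketch of a relation-adaptive weighted product in complex space: the four cross terms of a complex multiplication are recombined with per-relation weights, and the result is treated as a translation of the head toward the tail. The weight layout and the norm-based score are assumptions for illustration, not RatE's exact definition.
```python
import numpy as np

rng = np.random.default_rng(4)
dim = 16  # complex dimension

def weighted_complex_product(x, y, w):
    """Hypothetical relation-adaptive weighted product: the four cross terms of a
    complex multiplication are recombined with relation-specific weights w[0..3]."""
    real = w[0] * (x.real * y.real) + w[1] * (x.imag * y.imag)
    imag = w[2] * (x.real * y.imag) + w[3] * (x.imag * y.real)
    return real + 1j * imag

def rate_score(h, r, t, w):
    """Relation-adaptive translation in complex space; lower means a better triple."""
    translated = weighted_complex_product(h, r, w)
    return np.linalg.norm(translated - t)

h = rng.normal(size=dim) + 1j * rng.normal(size=dim)
r = rng.normal(size=dim) + 1j * rng.normal(size=dim)
t = rng.normal(size=dim) + 1j * rng.normal(size=dim)
w = rng.normal(size=4)          # per-relation weights (learned in the actual model)
print(rate_score(h, r, t, w))
```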
arXiv Detail & Related papers (2020-10-10T01:30:30Z) - TransINT: Embedding Implication Rules in Knowledge Graphs with
Isomorphic Intersections of Linear Subspaces [10.79392871079383]
We propose TransINT, a novel embedding method for Knowledge Graphs.
TransINT maps entities (tied by a relation) to continuous sets of vectors that are inclusion-ordered isomorphically to relation implications.
On a benchmark dataset, we outperform the best existing state-of-the-art rule integration embedding methods with significant margins in link prediction and triple classification.
arXiv Detail & Related papers (2020-07-01T06:45:27Z) - Relational Message Passing for Knowledge Graph Completion [78.47976646383222]
We propose a relational message passing method for knowledge graph completion.
It passes relational messages among edges iteratively to aggregate neighborhood information.
Results show our method outperforms state-of-the-art knowledge graph completion methods by a large margin.
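A minimal sketch of edge-to-edge message passing on a toy graph: each edge starts from its relation embedding and repeatedly aggregates the states of edges sharing an endpoint. The mean aggregation and tanh combine step are placeholders, not the paper's exact update rule.
```python
import numpy as np

rng = np.random.default_rng(5)
dim = 8

# Toy KG: edges as (head, relation, tail); relations carry the learnable vectors.
edges = [("a", "r1", "b"), ("b", "r2", "c"), ("a", "r3", "c")]
rel_emb = {r: rng.normal(size=dim) for _, r, _ in edges}

# Start each edge's state from its relation embedding.
state = {i: rel_emb[r].copy() for i, (_, r, _) in enumerate(edges)}

def neighbors(i):
    """Edges adjacent to edge i, i.e. sharing at least one endpoint with it."""
    h_i, _, t_i = edges[i]
    return [j for j, (h, _, t) in enumerate(edges)
            if j != i and {h, t} & {h_i, t_i}]

# Iterative edge-to-edge message passing (a minimal sketch of the idea).
for _ in range(2):
    new_state = {}
    for i in state:
        msgs = [state[j] for j in neighbors(i)]
        agg = np.mean(msgs, axis=0) if msgs else np.zeros(dim)
        new_state[i] = np.tanh(state[i] + agg)   # simple combine step
    state = new_state

print(state[0])
```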
arXiv Detail & Related papers (2020-02-17T03:33:41Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.