TranS: Transition-based Knowledge Graph Embedding with Synthetic
Relation Representation
- URL: http://arxiv.org/abs/2204.08401v1
- Date: Mon, 18 Apr 2022 16:55:25 GMT
- Title: TranS: Transition-based Knowledge Graph Embedding with Synthetic
Relation Representation
- Authors: Xuanyu Zhang, Qing Yang and Dongliang Xu
- Abstract summary: We propose a novel transition-based method, TranS, for knowledge graph embedding.
The single relation vector in traditional scoring patterns is replaced with synthetic relation representation, which can solve these issues effectively and efficiently.
Experiments on a large knowledge graph dataset, ogbl-wikikg2, show that our model achieves state-of-the-art results.
- Score: 14.759663752868487
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Knowledge graph embedding (KGE) aims to learn continuous vectors of relations
and entities in a knowledge graph. Recently, transition-based KGE methods have
achieved promising performance, where a single relation vector learns to
translate the head entity to the tail entity. However, this scoring pattern is not
suitable for complex scenarios where the same entity pair has different
relations. Previous models usually focus on the improvement of entity
representation for 1-to-N, N-to-1 and N-to-N relations, but ignore the single
relation vector. In this paper, we propose a novel transition-based method,
TranS, for knowledge graph embedding. The single relation vector in traditional
scoring patterns is replaced with synthetic relation representation, which can
solve these issues effectively and efficiently. Experiments on a large
knowledge graph dataset, ogbl-wikikg2, show that our model achieves
state-of-the-art results.
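The transition-based scoring pattern described in the abstract can be sketched as follows. The first function is the classic TransE-style score; the "synthetic relation" composition in the second function is an illustrative assumption only (the paper's exact formulation is not given here), included to show how a relation representation can depend on the entity pair:

```python
# Minimal sketch of transition-based scoring. The synthetic-relation
# composition below is a hypothetical illustration, not the exact TranS model.
import numpy as np

rng = np.random.default_rng(0)
dim = 8

# Toy embeddings for a head entity h, tail entity t, and relation r.
h = rng.normal(size=dim)
t = rng.normal(size=dim)
r = rng.normal(size=dim)

def transe_score(h, r, t):
    # Classic transition-based score: h + r should land near t,
    # so a smaller distance (larger score) means a more plausible triple.
    return -np.linalg.norm(h + r - t, ord=1)

# Hypothetical synthetic relation: compose the shared relation vector with
# entity-conditioned components, so the same entity pair can be linked by
# different effective translations (assumed composition, for illustration).
r_head = rng.normal(size=dim)  # component interacting with the head
r_tail = rng.normal(size=dim)  # component interacting with the tail

def synthetic_score(h, r, r_head, r_tail, t):
    synthetic_r = r + r_head * h - r_tail * t  # assumed composition rule
    return -np.linalg.norm(h + synthetic_r - t, ord=1)

print(transe_score(h, r, t))
print(synthetic_score(h, r, r_head, r_tail, t))
```

With a fixed relation vector, two distinct relations between the same (h, t) pair would necessarily get the same score; a pair-dependent synthetic relation removes that limitation.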
Related papers
- A Condensed Transition Graph Framework for Zero-shot Link Prediction
with Large Language Models [22.089751438495956]
We introduce a Condensed Transition Graph Framework for Zero-Shot Link Prediction (CTLP).
CTLP encodes all the paths' information in linear time complexity to predict unseen relations between entities.
Our proposed CTLP method achieves state-of-the-art performance on three standard ZSLP datasets.
arXiv Detail & Related papers (2024-02-16T16:02:33Z)
- zrLLM: Zero-Shot Relational Learning on Temporal Knowledge Graphs with Large Language Models [33.10218179341504]
We use large language models to generate relation representations for embedding-based TKGF methods.
We show that our approach helps TKGF models to achieve much better performance in forecasting the facts with previously unseen relations.
arXiv Detail & Related papers (2023-11-15T21:25:15Z)
- You Only Transfer What You Share: Intersection-Induced Graph Transfer Learning for Link Prediction [79.15394378571132]
We investigate a previously overlooked phenomenon: in many cases, a densely connected, complementary graph can be found for the original graph.
The denser graph may share nodes with the original graph, which offers a natural bridge for transferring selective, meaningful knowledge.
We identify this setting as Graph Intersection-induced Transfer Learning (GITL), which is motivated by practical applications in e-commerce or academic co-authorship predictions.
arXiv Detail & Related papers (2023-02-27T22:56:06Z)
- Efficient Relation-aware Neighborhood Aggregation in Graph Neural Networks via Tensor Decomposition [4.041834517339835]
We propose a novel knowledge graph encoder that incorporates tensor decomposition within the aggregation function of the Relational Graph Convolutional Network (R-GCN).
Our model enhances the representation of neighboring entities by employing projection matrices of a low-rank tensor defined by relation types.
We adopt a training strategy inspired by contrastive learning to relieve the training limitations of the 1-k-k encoder method when handling vast graphs.
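The idea of building relation-specific projection matrices from a low-rank tensor can be sketched as below. The CP-style factorization, the variable names, and the averaging aggregation are assumptions for illustration; the paper's exact decomposition may differ:

```python
# Hypothetical sketch: relation-aware aggregation with low-rank,
# tensor-decomposed projection matrices (illustrative assumption).
import numpy as np

rng = np.random.default_rng(1)
dim, num_relations, rank = 6, 4, 2

# CP-style low-rank core: each relation's projection matrix is a weighted
# sum of `rank` shared rank-1 factors, so parameters grow with `rank`,
# not with num_relations * dim * dim.
U = rng.normal(size=(rank, dim))            # shared left factors
V = rng.normal(size=(rank, dim))            # shared right factors
C = rng.normal(size=(num_relations, rank))  # per-relation mixing weights

def relation_matrix(r):
    # W_r = sum_k C[r, k] * outer(U[k], V[k])
    return np.einsum("k,ki,kj->ij", C[r], U, V)

def aggregate(neighbors):
    # R-GCN-style neighborhood aggregation: project each neighbor
    # embedding with its relation-specific matrix, then average.
    msgs = [relation_matrix(r) @ e for r, e in neighbors]
    return np.mean(msgs, axis=0)

neighbors = [(0, rng.normal(size=dim)), (2, rng.normal(size=dim))]
print(aggregate(neighbors).shape)  # (6,)
```

Sharing the rank-1 factors across relations is what lets rare relations borrow statistical strength from frequent ones.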
arXiv Detail & Related papers (2022-12-11T19:07:34Z)
- TransHER: Translating Knowledge Graph Embedding with Hyper-Ellipsoidal Restriction [14.636054717485207]
We propose a novel score function TransHER for knowledge graph embedding.
Our model first maps entities onto two separate hyper-ellipsoids and then conducts a relation-specific translation on one of them.
Experimental results show that TransHER can achieve state-of-the-art performance and generalize to datasets in different domains and scales.
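The "map onto a hyper-ellipsoid, then translate" scheme can be sketched as follows. The projection rule (rescaling a vector so it satisfies the ellipsoid equation) and all names here are illustrative assumptions, not TransHER's exact formulation:

```python
# Hypothetical sketch of ellipsoid-restricted translation scoring.
import numpy as np

rng = np.random.default_rng(2)
dim = 5

# Axis lengths of an assumed hyper-ellipsoid (kept positive).
axes = np.abs(rng.normal(size=dim)) + 0.5

def project_to_ellipsoid(x, axes):
    # Rescale x so that sum((x_i / a_i)^2) == 1, i.e. x lies on
    # the surface of the ellipsoid with semi-axes a_i.
    scale = np.sqrt(np.sum((x / axes) ** 2))
    return x / scale

def score(h, r, t, axes):
    # Map both entities onto the ellipsoid, then apply a
    # relation-specific translation to the head.
    h_p = project_to_ellipsoid(h, axes)
    t_p = project_to_ellipsoid(t, axes)
    return -np.linalg.norm(h_p + r - t_p, ord=1)

h, t, r = rng.normal(size=dim), rng.normal(size=dim), rng.normal(size=dim)
print(score(h, r, t, axes))
```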
arXiv Detail & Related papers (2022-04-27T22:49:27Z)
- STaR: Knowledge Graph Embedding by Scaling, Translation and Rotation [20.297699026433065]
The bilinear method is mainstream in Knowledge Graph Embedding (KGE), aiming to learn low-dimensional representations for entities and relations.
Previous works have mainly discovered 6 important patterns like non-commutativity.
We propose a corresponding bilinear model, Scaling Translation and Rotation (STaR), consisting of these two parts.
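One way to combine scaling, translation, and rotation in a single score is sketched below, using RotatE-style complex embeddings where rotation is multiplication by unit complex numbers. The particular order and combination of the three operations is an illustrative assumption, not STaR's exact bilinear form:

```python
# Hypothetical sketch: scaling + rotation + translation scoring
# over complex-valued embeddings (illustrative assumption).
import numpy as np

rng = np.random.default_rng(3)
dim = 4  # embedding dimension over the complex numbers

# Toy complex entity embeddings.
h = rng.normal(size=dim) + 1j * rng.normal(size=dim)
t = rng.normal(size=dim) + 1j * rng.normal(size=dim)

s = np.abs(rng.normal(size=dim))                         # per-dim scaling
tau = rng.normal(size=dim) + 1j * rng.normal(size=dim)   # translation
theta = rng.uniform(0, 2 * np.pi, size=dim)              # rotation angles

def score(h, t, s, tau, theta):
    # Scale the head, rotate it (multiply by unit complex numbers),
    # then translate, and measure the distance to the tail.
    transformed = s * h * np.exp(1j * theta) + tau
    return -np.linalg.norm(transformed - t, ord=1)

print(score(h, t, s, tau, theta))
```

Each operation covers patterns the others cannot (e.g. rotation handles composition-like patterns, scaling handles magnitude changes), which is why combining them broadens the family of relations the model can fit.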
arXiv Detail & Related papers (2022-02-15T02:06:22Z)
- Tensor Composition Net for Visual Relationship Prediction [115.14829858763399]
We present a novel Tensor Composition Network (TCN) to predict visual relationships in images.
The key idea of our TCN is to exploit the low rank property of the visual relationship tensor.
We show our TCN's image-level visual relationship prediction provides a simple and efficient mechanism for relation-based image retrieval.
arXiv Detail & Related papers (2020-12-10T06:27:20Z)
- Learning Relation Prototype from Unlabeled Texts for Long-tail Relation Extraction [84.64435075778988]
We propose a general approach to learn relation prototypes from unlabeled texts.
We learn relation prototypes as an implicit factor between entities.
We conduct experiments on two publicly available datasets: New York Times and Google Distant Supervision.
arXiv Detail & Related papers (2020-11-27T06:21:12Z)
- HittER: Hierarchical Transformers for Knowledge Graph Embeddings [85.93509934018499]
We propose HittER to learn representations of entities and relations in a complex knowledge graph.
Experimental results show that HittER achieves new state-of-the-art results on multiple link prediction datasets.
We additionally propose a simple approach to integrate HittER into BERT and demonstrate its effectiveness on two Freebase factoid question answering datasets.
arXiv Detail & Related papers (2020-08-28T18:58:15Z)
- Tensor Graph Convolutional Networks for Multi-relational and Robust Learning [74.05478502080658]
This paper introduces a tensor-graph convolutional network (TGCN) for scalable semi-supervised learning (SSL) from data associated with a collection of graphs that are represented by a tensor.
The proposed architecture achieves markedly improved performance relative to standard GCNs, copes with state-of-the-art adversarial attacks, and leads to remarkable SSL performance over protein-to-protein interaction networks.
arXiv Detail & Related papers (2020-03-15T02:33:21Z)
- Generative Adversarial Zero-Shot Relational Learning for Knowledge Graphs [96.73259297063619]
We consider a novel formulation, zero-shot learning, to avoid this cumbersome curation.
For newly-added relations, we attempt to learn their semantic features from their text descriptions.
We leverage Generative Adversarial Networks (GANs) to establish the connection between the text and knowledge graph domains.
arXiv Detail & Related papers (2020-01-08T01:19:08Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences of its use.