Learning Representations of Entities and Relations
- URL: http://arxiv.org/abs/2201.13073v1
- Date: Mon, 31 Jan 2022 09:24:43 GMT
- Title: Learning Representations of Entities and Relations
- Authors: Ivana Balažević
- Abstract summary: This thesis focuses on improving knowledge graph representation with the aim of tackling the link prediction task.
The first contribution is HypER, a convolutional model which simplifies and improves upon the link prediction performance of the existing convolutional state-of-the-art model ConvE.
The second contribution is TuckER, a relatively straightforward linear model, which, at the time of its introduction, obtained state-of-the-art link prediction performance.
The third contribution is MuRP, the first multi-relational graph representation model embedded in hyperbolic space.
- Score: 0.0
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Encoding facts as representations of entities and binary relationships
between them, as learned by knowledge graph representation models, is useful
for various tasks, including predicting new facts, question answering, fact
checking and information retrieval. The focus of this thesis is on (i)
improving knowledge graph representation with the aim of tackling the link
prediction task; and (ii) devising a theory on how semantics can be captured in
the geometry of relation representations. Most knowledge graphs are very
incomplete and manually adding new information is costly, which drives the
development of methods which can automatically infer missing facts. The first
contribution of this thesis is HypER, a convolutional model which simplifies
and improves upon the link prediction performance of the existing convolutional
state-of-the-art model ConvE and can be mathematically explained in terms of
constrained tensor factorisation. The second contribution is TuckER, a
relatively straightforward linear model, which, at the time of its
introduction, obtained state-of-the-art link prediction performance across
standard datasets. The third contribution is MuRP, the first multi-relational graph
representation model embedded in hyperbolic space. MuRP outperforms all
existing models and its Euclidean counterpart MuRE in link prediction on
hierarchical knowledge graph relations whilst requiring far fewer dimensions.
Despite the development of a large number of knowledge graph representation
models with gradually increasing predictive performance, relatively little is
known of the latent structure they learn. We generalise recent theoretical
understanding of how semantic relations of similarity, paraphrase and analogy
are encoded in the geometric interactions of word embeddings to how more
general relations, as found in knowledge graphs, can be encoded in their
representations.
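
As a concrete illustration of the two families of model described above, a linear (TuckER-style) scoring function and the hyperbolic geometry underlying MuRP, the sketch below computes a Tucker-decomposition score for a (subject, relation, object) triple and a Poincaré-ball distance. This is a minimal sketch: the dimensions, the random parameters and the helper names tucker_score and poincare_distance are illustrative assumptions rather than the thesis implementation, in which these parameters are learned from the knowledge graph.

```python
import numpy as np

rng = np.random.default_rng(0)
d_e, d_r = 200, 30  # entity / relation embedding sizes (illustrative choices)

# Illustrative random parameters; in practice all of these are learned.
W = rng.normal(size=(d_e, d_r, d_e))   # core tensor of the Tucker decomposition
e_s = rng.normal(size=d_e)             # subject entity embedding
w_r = rng.normal(size=d_r)             # relation embedding
e_o = rng.normal(size=d_e)             # object entity embedding

def tucker_score(W, e_s, w_r, e_o):
    """TuckER-style score: contract the shared core tensor W with the
    subject, relation and object embeddings along its three modes."""
    return float(np.einsum('irj,i,r,j->', W, e_s, w_r, e_o))

def poincare_distance(x, y, eps=1e-9):
    """Distance between two points of the Poincare ball, the geometry on
    which hyperbolic models such as MuRP build their scoring functions
    (assumed helper, not the thesis code)."""
    sq_dist = np.sum((x - y) ** 2)
    denom = (1.0 - np.sum(x ** 2)) * (1.0 - np.sum(y ** 2)) + eps
    return float(np.arccosh(1.0 + 2.0 * sq_dist / denom))

# Higher score => the triple (subject, relation, object) is judged more likely true.
print(f"TuckER-style score: {tucker_score(W, e_s, w_r, e_o):.3f}")

# Two points well inside the unit ball; hierarchy-aware hyperbolic models place
# general entities near the origin and specific ones closer to the boundary.
x = np.array([0.10, 0.20, 0.00])
y = np.array([0.40, -0.30, 0.20])
print(f"Poincare distance: {poincare_distance(x, y):.3f}")
```

Sharing one core tensor W across all relations is what lets a TuckER-style model transfer knowledge between relations, while the exponentially growing volume of the Poincaré ball is what lets hyperbolic models such as MuRP fit hierarchical relations in far fewer dimensions, as the abstract notes.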
Related papers
- Graph-Dictionary Signal Model for Sparse Representations of Multivariate Data [49.77103348208835]
We define a novel Graph-Dictionary signal model, where a finite set of graphs characterizes relationships in data distribution through a weighted sum of their Laplacians.
We propose a framework to infer the graph dictionary representation from observed data, along with a bilinear generalization of the primal-dual splitting algorithm to solve the learning problem.
We exploit graph-dictionary representations in a motor imagery decoding task on brain activity data, where we classify imagined motion better than standard methods.
arXiv Detail & Related papers (2024-11-08T17:40:43Z)
- Explainable Representations for Relation Prediction in Knowledge Graphs [0.0]
We propose SEEK, a novel approach for explainable representations to support relation prediction in knowledge graphs.
It is based on identifying relevant shared semantic aspects between entities and learning representations for each subgraph.
We evaluate SEEK on two real-world relation prediction tasks: protein-protein interaction prediction and gene-disease association prediction.
arXiv Detail & Related papers (2023-06-22T06:18:40Z)
- Beyond spectral gap (extended): The role of the topology in decentralized learning [58.48291921602417]
In data-parallel optimization of machine learning models, workers collaborate to improve their estimates of the model.
Current theory does not explain that collaboration enables larger learning rates than training alone.
This paper aims to paint an accurate picture of sparsely-connected distributed optimization.
arXiv Detail & Related papers (2023-01-05T16:53:38Z)
- Beyond spectral gap: The role of the topology in decentralized learning [58.48291921602417]
In data-parallel optimization of machine learning models, workers collaborate to improve their estimates of the model.
This paper aims to paint an accurate picture of sparsely-connected distributed optimization when workers share the same data distribution.
Our theory matches empirical observations in deep learning, and accurately describes the relative merits of different graph topologies.
arXiv Detail & Related papers (2022-06-07T08:19:06Z)
- KGRefiner: Knowledge Graph Refinement for Improving Accuracy of Translational Link Prediction Methods [4.726777092009553]
This paper proposes a method for refining the knowledge graph, making it more informative so that link prediction can be performed more accurately.
Our experiments show that our method can significantly increase the performance of translational link prediction methods.
arXiv Detail & Related papers (2021-06-27T13:32:39Z)
- TCL: Transformer-based Dynamic Graph Modelling via Contrastive Learning [87.38675639186405]
We propose a novel graph neural network approach, called TCL, which deals with the dynamically-evolving graph in a continuous-time fashion.
To the best of our knowledge, this is the first attempt to apply contrastive learning to representation learning on dynamic graphs.
arXiv Detail & Related papers (2021-05-17T15:33:25Z)
- Knowledge Hypergraph Embedding Meets Relational Algebra [13.945694569456665]
We propose a simple embedding-based model called ReAlE that performs link prediction in knowledge hypergraphs.
We show theoretically that ReAlE is fully expressive and provide proofs and empirical evidence that it can represent a large subset of the primitive relational algebra operations.
arXiv Detail & Related papers (2021-02-18T18:57:44Z)
- Hyperbolic Graph Embedding with Enhanced Semi-Implicit Variational Inference [48.63194907060615]
We build off of semi-implicit graph variational auto-encoders to capture higher-order statistics in a low-dimensional graph latent representation.
We incorporate hyperbolic geometry in the latent space through a Poincare embedding to efficiently represent graphs exhibiting hierarchical structure.
arXiv Detail & Related papers (2020-10-31T05:48:34Z)
- LineaRE: Simple but Powerful Knowledge Graph Embedding for Link Prediction [7.0294164380111015]
We propose a novel embedding model, namely LineaRE, which is capable of modeling four connectivity patterns and four mapping properties.
Experimental results on multiple widely used real-world datasets show that the proposed LineaRE model significantly outperforms existing state-of-the-art models for link prediction tasks.
arXiv Detail & Related papers (2020-04-21T14:19:43Z)
- Tensor Graph Convolutional Networks for Multi-relational and Robust Learning [74.05478502080658]
This paper introduces a tensor-graph convolutional network (TGCN) for scalable semi-supervised learning (SSL) from data associated with a collection of graphs, which are represented by a tensor.
The proposed architecture achieves markedly improved performance relative to standard GCNs, copes with state-of-the-art adversarial attacks, and leads to remarkable SSL performance over protein-to-protein interaction networks.
arXiv Detail & Related papers (2020-03-15T02:33:21Z)
This list is automatically generated from the titles and abstracts of the papers on this site.