5* Knowledge Graph Embeddings with Projective Transformations
- URL: http://arxiv.org/abs/2006.04986v2
- Date: Sun, 14 Mar 2021 16:46:48 GMT
- Title: 5* Knowledge Graph Embeddings with Projective Transformations
- Authors: Mojtaba Nayyeri, Sahar Vahdati, Can Aykul, Jens Lehmann
- Abstract summary: We present a novel knowledge graph embedding model (5*E) in projective geometry.
It supports multiple simultaneous transformations - specifically inversion, reflection, translation, rotation, and homothety.
It outperforms existing approaches on the most widely used link prediction benchmarks.
- Score: 13.723120574076127
- License: http://creativecommons.org/licenses/by-nc-sa/4.0/
- Abstract: Performing link prediction using knowledge graph embedding models has become
a popular approach for knowledge graph completion. Such models employ a
transformation function that maps nodes via edges into a vector space in order
to measure the likelihood of the links. While mapping the individual nodes, the
structure of subgraphs is also transformed. Most of the embedding models
designed in Euclidean geometry usually support a single transformation type -
often translation or rotation, which is suitable for learning on graphs with
small differences in neighboring subgraphs. However, multi-relational knowledge
graphs often include multiple sub-graph structures in a neighborhood (e.g.
combinations of path and loop structures), which current embedding models do
not capture well. To tackle this problem, we propose a novel KGE model (5*E) in
projective geometry, which supports multiple simultaneous transformations -
specifically inversion, reflection, translation, rotation, and homothety. The
model has several favorable theoretical properties and subsumes the existing
approaches. It outperforms them on the most widely used link prediction
benchmarks.
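The abstract's claim that 5*E subsumes single-transformation models can be illustrated with a Möbius transformation of the complex line, the projective map underlying the paper's construction. The sketch below is illustrative only: the parameter names a, b, c, d and the per-coordinate complex view are assumptions, not the paper's exact parameterization.

```python
import numpy as np

def mobius(z, a, b, c, d):
    """Projective (Moebius) transformation of the complex line:
    z -> (a*z + b) / (c*z + d)."""
    return (a * z + b) / (c * z + d)

z = 0.3 + 0.4j  # one coordinate of a hypothetical entity embedding

# Fixing subsets of (a, b, c, d) recovers single-transformation models:
assert np.isclose(mobius(z, 1, 2 + 1j, 0, 1), z + (2 + 1j))  # translation
theta = np.pi / 3
assert np.isclose(mobius(z, np.exp(1j * theta), 0, 0, 1),
                  z * np.exp(1j * theta))                    # rotation
assert np.isclose(mobius(z, 2.5, 0, 0, 1), 2.5 * z)          # homothety (scaling)
assert np.isclose(mobius(z, 0, 1, 1, 0), 1 / z)              # inversion
```

Because one relation embedding (a, b, c, d) blends these cases, a single relation can act differently on path-like and loop-like neighborhood structures, which is the motivation stated in the abstract.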
Related papers
- Curve Your Attention: Mixed-Curvature Transformers for Graph Representation Learning [77.1421343649344]
We propose a generalization of Transformers towards operating entirely on the product of constant curvature spaces.
We also provide a kernelized approach to non-Euclidean attention, which enables our model to run in time and memory cost linear to the number of nodes and edges.
arXiv Detail & Related papers (2023-09-08T02:44:37Z)
- Discrete Graph Auto-Encoder [52.50288418639075]
We introduce a new framework named Discrete Graph Auto-Encoder (DGAE)
We first use a permutation-equivariant auto-encoder to convert graphs into sets of discrete latent node representations.
In the second step, we sort the sets of discrete latent representations and learn their distribution with a specifically designed auto-regressive model.
arXiv Detail & Related papers (2023-06-13T12:40:39Z)
- You Only Transfer What You Share: Intersection-Induced Graph Transfer Learning for Link Prediction [79.15394378571132]
We investigate a previously overlooked phenomenon: in many cases, a densely connected, complementary graph can be found for the original graph.
The denser graph may share nodes with the original graph, which offers a natural bridge for transferring selective, meaningful knowledge.
We identify this setting as Graph Intersection-induced Transfer Learning (GITL), which is motivated by practical applications in e-commerce or academic co-authorship predictions.
arXiv Detail & Related papers (2023-02-27T22:56:06Z)
- GrannGAN: Graph annotation generative adversarial networks [72.66289932625742]
We consider the problem of modelling high-dimensional distributions and generating new examples of data with complex relational feature structure coherent with a graph skeleton.
The model we propose tackles the problem of generating the data features constrained by the specific graph structure of each data point by splitting the task into two phases.
In the first phase, it models the distribution of features associated with the nodes of the given graph; in the second, it generates the edge features conditioned on the node features.
arXiv Detail & Related papers (2022-12-01T11:49:07Z)
- Latent Graph Inference using Product Manifolds [0.0]
We generalize the discrete Differentiable Graph Module (dDGM) for latent graph learning.
Our novel approach is tested on a wide range of datasets, and outperforms the original dDGM model.
arXiv Detail & Related papers (2022-11-26T22:13:06Z)
- BiQUE: Biquaternionic Embeddings of Knowledge Graphs [9.107095800991333]
Existing knowledge graph embeddings (KGEs) compactly encode multi-relational knowledge graphs (KGs).
It is crucial for KGE models to unify multiple geometric transformations so as to fully cover the multifarious relations in KGs.
We propose BiQUE, a novel model that employs biquaternions to integrate multiple geometric transformations.
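BiQUE's algebraic building block is the biquaternion, i.e. a quaternion with complex coefficients. The following is a minimal sketch of the Hamilton product on such values; it illustrates the algebra only, not BiQUE's actual scoring function.

```python
import numpy as np

def hamilton(p, q):
    """Hamilton product of two (bi)quaternions given as length-4 arrays
    (w, x, y, z). With complex-valued entries this is biquaternion
    multiplication; with real entries, ordinary quaternion multiplication."""
    w1, x1, y1, z1 = p
    w2, x2, y2, z2 = q
    return np.array([
        w1 * w2 - x1 * x2 - y1 * y2 - z1 * z2,
        w1 * x2 + x1 * w2 + y1 * z2 - z1 * y2,
        w1 * y2 - x1 * z2 + y1 * w2 + z1 * x2,
        w1 * z2 + x1 * y2 - y1 * x2 + z1 * w2,
    ])

# Acting with the identity element on a complex-valued embedding chunk
# (values are illustrative only)
h = np.array([0.5 + 0.1j, 0.2 - 0.3j, 0.1 + 0.0j, 0.4 + 0.2j])
r = np.array([1.0 + 0.0j, 0.0j, 0.0j, 0.0j])
assert np.allclose(hamilton(r, h), h)
```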
arXiv Detail & Related papers (2021-09-29T13:05:32Z)
- Self-Supervised Graph Representation Learning via Topology Transformations [61.870882736758624]
We present the Topology Transformation Equivariant Representation learning, a general paradigm of self-supervised learning for node representations of graph data.
In experiments, we apply the proposed model to the downstream node and graph classification tasks, and results show that the proposed method outperforms the state-of-the-art unsupervised approaches.
arXiv Detail & Related papers (2021-05-25T06:11:03Z)
- Beyond permutation equivariance in graph networks [1.713291434132985]
We introduce a novel architecture for graph networks which is equivariant to the Euclidean group in $n$-dimensions.
Our model is designed to work with graph networks in their most general form, thus including particular variants as special cases.
arXiv Detail & Related papers (2021-03-25T18:36:09Z)
- Motif Learning in Knowledge Graphs Using Trajectories Of Differential Equations [14.279419014064047]
Knowledge Graph Embeddings (KGEs) have shown promising performance on link prediction tasks.
Many KGEs use flat geometry, which renders them incapable of preserving complex structures.
We propose a neuro-differential KGE that embeds the nodes of a KG on the trajectories of Ordinary Differential Equations (ODEs).
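One way to picture "embedding nodes on ODE trajectories" is to integrate a vector field and read embeddings off the flow; a rotational field, for instance, produces the closed loops that flat translational models cannot represent. The sketch below uses a fixed, hand-picked field purely for illustration; the paper learns its ODEs.

```python
import numpy as np

def trajectory(z0, f, t, steps=10000):
    """Euler-integrate dz/dt = f(z) from z0; the point reached at time t
    serves as the embedding. (Illustrative; not the paper's solver.)"""
    z, dt = np.asarray(z0, float), t / steps
    for _ in range(steps):
        z = z + dt * f(z)
    return z

# A rotational field yields closed circular trajectories (a loop structure):
f = lambda z: np.array([-z[1], z[0]])
z0 = np.array([1.0, 0.0])
# After time 2*pi the trajectory returns (up to integration error) to its start.
assert np.linalg.norm(trajectory(z0, f, 2 * np.pi) - z0) < 0.01
```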
arXiv Detail & Related papers (2020-10-13T20:53:17Z)
- LineaRE: Simple but Powerful Knowledge Graph Embedding for Link Prediction [7.0294164380111015]
We propose a novel embedding model, namely LineaRE, which is capable of modeling four connectivity patterns and four mapping properties.
Experimental results on multiple widely used real-world datasets show that the proposed LineaRE model significantly outperforms existing state-of-the-art models for link prediction tasks.
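LineaRE treats each relation as an elementwise linear map linking head and tail embeddings. A hedged sketch of that scoring idea follows; the exact norm and parameterization here are assumptions for illustration, not the paper's definition.

```python
import numpy as np

def lineare_score(h, t, w1, w2, b):
    """Linear-regression-style score: how well w1*h + b explains w2*t,
    elementwise. Lower means a more plausible link. (Sketch of the
    general idea only.)"""
    return np.abs(w1 * h + b - w2 * t).sum()

h = np.array([1.0, 2.0])
w1, w2, b = np.array([2.0, 0.5]), np.array([1.0, 1.0]), np.array([0.5, -0.5])
t = w1 * h + b          # a tail that exactly satisfies the relation
assert lineare_score(h, t, w1, w2, b) < 1e-12
```

The elementwise weights and bias are what let one model capture several connectivity patterns and mapping properties at once, which is the claim the summary above makes.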
arXiv Detail & Related papers (2020-04-21T14:19:43Z)
- Permutation Invariant Graph Generation via Score-Based Generative Modeling [114.12935776726606]
We propose a permutation invariant approach to modeling graphs, using the recent framework of score-based generative modeling.
In particular, we design a permutation equivariant, multi-channel graph neural network to model the gradient of the data distribution at the input graph.
For graph generation, we find that our learning approach achieves better or comparable results to existing models on benchmark datasets.
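The permutation equivariance this approach relies on can be checked directly on a toy message-passing layer (a generic illustration, not the paper's architecture): relabeling the nodes of the input graph permutes the output rows in exactly the same way.

```python
import numpy as np

def gnn_layer(A, X, W):
    """One linear message-passing step: aggregate neighbor features,
    then apply an elementwise nonlinearity."""
    return np.tanh(A @ X @ W)

rng = np.random.default_rng(0)
A = rng.integers(0, 2, (5, 5))
A = (A | A.T).astype(float)          # symmetric adjacency matrix
X = rng.standard_normal((5, 3))      # node features
W = rng.standard_normal((3, 3))      # layer weights
P = np.eye(5)[rng.permutation(5)]    # a random node relabeling

# Equivariance: permuting the input graph permutes the output identically.
assert np.allclose(gnn_layer(P @ A @ P.T, P @ X, W), P @ gnn_layer(A, X, W))
```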
arXiv Detail & Related papers (2020-03-02T03:06:14Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.