Knowledge Hypergraph Embedding Meets Relational Algebra
- URL: http://arxiv.org/abs/2102.09557v1
- Date: Thu, 18 Feb 2021 18:57:44 GMT
- Title: Knowledge Hypergraph Embedding Meets Relational Algebra
- Authors: Bahare Fatemi, Perouz Taslakian, David Vazquez, David Poole
- Abstract summary: We propose a simple embedding-based model called ReAlE that performs link prediction in knowledge hypergraphs.
We show theoretically that ReAlE is fully expressive and provide proofs and empirical evidence that it can represent a large subset of the primitive relational algebra operations.
- Score: 13.945694569456665
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Embedding-based methods for reasoning in knowledge hypergraphs learn a
representation for each entity and relation. Current methods do not capture the
procedural rules underlying the relations in the graph. We propose a simple
embedding-based model called ReAlE that performs link prediction in knowledge
hypergraphs (generalized knowledge graphs) and can represent high-level
abstractions in terms of relational algebra operations. We show theoretically
that ReAlE is fully expressive and provide proofs and empirical evidence that
it can represent a large subset of the primitive relational algebra operations,
namely renaming, projection, set union, selection, and set difference. We also
verify experimentally that ReAlE outperforms state-of-the-art models in
knowledge hypergraph completion, and in representing each of these primitive
relational algebra operations. For the latter experiment, we generate a
synthetic knowledge hypergraph, for which we design an algorithm based on the
Erdős-Rényi model for generating random graphs.
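For reference, the five primitive operations named in the abstract have standard set-theoretic semantics. The sketch below is a generic illustration of how they act on relations represented as sets of attribute-value tuples; the data and function names are toy examples, not the ReAlE model or the authors' code.

```python
# Illustrative sketch of the primitive relational algebra operations named in
# the abstract (renaming, projection, set union, selection, set difference),
# acting on relations represented as sets of frozensets of (attribute, value)
# pairs. Generic reference only, not the ReAlE embedding model.

def rename(relation, mapping):
    """Rename attributes, e.g. {"old_name": "new_name"}."""
    return {frozenset((mapping.get(k, k), v) for k, v in row) for row in relation}

def project(relation, attributes):
    """Keep only the listed attributes of each tuple."""
    return {frozenset((k, v) for k, v in row if k in attributes) for row in relation}

def select(relation, predicate):
    """Keep tuples for which the predicate holds."""
    return {row for row in relation if predicate(dict(row))}

def union(r, s):
    return r | s

def difference(r, s):
    return r - s

# Toy relation: who works for which company.
works_for = {
    frozenset({("person", "ada"), ("company", "acme")}),
    frozenset({("person", "bob"), ("company", "globex")}),
}

print(project(select(works_for, lambda t: t["company"] == "acme"), {"person"}))
```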
Related papers
- Graph-Dictionary Signal Model for Sparse Representations of Multivariate Data [49.77103348208835]
We define a novel Graph-Dictionary signal model, in which a finite set of graphs characterizes relationships in the data distribution through a weighted sum of their Laplacians.
We propose a framework to infer the graph dictionary representation from observed data, along with a bilinear generalization of the primal-dual splitting algorithm to solve the learning problem.
We exploit graph-dictionary representations in a motor imagery decoding task on brain activity data, where we classify imagined motion better than standard methods.
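As a point of reference for the phrase "a weighted sum of their Laplacians", the following is a minimal generic sketch with made-up adjacency matrices and weights; it is not the authors' inference framework.

```python
# Minimal sketch of a graph-dictionary style combination: the effective graph
# operator is a weighted sum of the Laplacians of a few atom graphs.
# Adjacency matrices and weights below are made-up toy values.
import numpy as np

def laplacian(adj):
    """Combinatorial Laplacian L = D - A of an undirected graph."""
    return np.diag(adj.sum(axis=1)) - adj

# Two toy 4-node atom graphs and mixture weights (illustrative only).
A1 = np.array([[0, 1, 0, 0],
               [1, 0, 1, 0],
               [0, 1, 0, 1],
               [0, 0, 1, 0]], dtype=float)
A2 = np.array([[0, 0, 1, 1],
               [0, 0, 0, 1],
               [1, 0, 0, 0],
               [1, 1, 0, 0]], dtype=float)
weights = np.array([0.7, 0.3])

L_mix = weights[0] * laplacian(A1) + weights[1] * laplacian(A2)
print(L_mix)
```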
arXiv Detail & Related papers (2024-11-08T17:40:43Z)
- Relational Learning in Pre-Trained Models: A Theory from Hypergraph Recovery Perspective [60.64922606733441]
We introduce a mathematical model that formalizes relational learning as hypergraph recovery to study the pre-training of Foundation Models (FMs).
In our framework, the world is represented as a hypergraph, with data abstracted as random samples from hyperedges. We theoretically examine the feasibility of a Pre-Trained Model (PTM) to recover this hypergraph and analyze the data efficiency in a minimax near-optimal style.
arXiv Detail & Related papers (2024-06-17T06:20:39Z)
- Learning from Heterogeneity: A Dynamic Learning Framework for Hypergraphs [22.64740740462169]
We propose a hypergraph learning framework named LFH that is capable of dynamic hyperedge construction and attentive embedding update.
To evaluate the effectiveness of our proposed framework, we conduct comprehensive experiments on several popular datasets.
arXiv Detail & Related papers (2023-07-07T06:26:44Z)
- From axioms over graphs to vectors, and back again: evaluating the properties of graph-based ontology embeddings [78.217418197549]
One approach to generating embeddings is to introduce a set of nodes and edges for named entities and for the structure of logical axioms.
Methods that embed ontologies as graphs (graph projections) have different properties depending on the types of axioms they utilize.
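To make the notion of a graph projection concrete, here is a minimal, hypothetical mapping from simple SubClassOf axioms to labelled edges; actual projection methods cover many more axiom types, and the details differ between them.

```python
# Minimal, hypothetical sketch of an ontology "graph projection": named classes
# become nodes and simple SubClassOf axioms become labelled edges. Real
# projection methods handle many more axiom types; this only illustrates the idea.

axioms = [
    ("SubClassOf", "Aspirin", "Drug"),
    ("SubClassOf", "Drug", "ChemicalEntity"),
]

nodes, edges = set(), []
for kind, sub, sup in axioms:
    if kind == "SubClassOf":
        nodes.update([sub, sup])
        edges.append((sub, "subClassOf", sup))

print(sorted(nodes))
print(edges)
```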
arXiv Detail & Related papers (2023-03-29T08:21:49Z)
- Repurposing Knowledge Graph Embeddings for Triple Representation via Weak Supervision [77.34726150561087]
Current methods learn triple embeddings from scratch without utilizing entity and predicate embeddings from pre-trained models.
We develop a method for automatically sampling triples from a knowledge graph and estimating their pairwise similarities from pre-trained embedding models.
These pairwise similarity scores are then fed to a Siamese-like neural architecture to fine-tune triple representations.
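A rough sketch of the weak-supervision signal described above is given below; the concatenation-based triple vector and cosine similarity are assumptions for illustration, not necessarily the paper's exact construction, and the Siamese fine-tuning stage is omitted.

```python
# Rough sketch: build a triple vector from pre-trained entity/predicate
# embeddings (here by concatenation, an assumption for illustration) and
# estimate pairwise similarity with cosine. Embeddings below are random toys.
import numpy as np

rng = np.random.default_rng(0)
dim = 8
entity_emb = {e: rng.normal(size=dim) for e in ["paris", "france", "berlin", "germany"]}
predicate_emb = {"capital_of": rng.normal(size=dim)}

def triple_vector(h, r, t):
    return np.concatenate([entity_emb[h], predicate_emb[r], entity_emb[t]])

def cosine(u, v):
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

v1 = triple_vector("paris", "capital_of", "france")
v2 = triple_vector("berlin", "capital_of", "germany")
print(cosine(v1, v2))  # weak similarity label for a Siamese-style fine-tuning stage
```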
arXiv Detail & Related papers (2022-08-22T14:07:08Z)
- Knowledgebra: An Algebraic Learning Framework for Knowledge Graph [15.235089177507897]
Knowledge graph (KG) representation learning aims to encode entities and relations into dense continuous vector spaces such that knowledge contained in a dataset can be consistently represented.
We developed a mathematical language for KGs based on an observation of their inherent algebraic structure, which we termed Knowledgebra.
We implemented an instantiation model, SemE, using simple matrix semigroups, which exhibits state-of-the-art performance on standard datasets.
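The algebraic idea behind using matrix semigroups can be illustrated as follows; this is not the actual SemE scoring function, only a sketch of why relation composition maps to matrix multiplication.

```python
# Illustration of the algebraic idea only (not the actual SemE model): if each
# relation is a matrix acting on entity vectors, relation composition is matrix
# multiplication, so relations are closed under composition (a semigroup).
import numpy as np

rng = np.random.default_rng(1)
dim = 4
h = rng.normal(size=dim)                 # head entity embedding (toy)
M_parent = rng.normal(size=(dim, dim))   # "parent_of" relation matrix (toy)
M_grand = M_parent @ M_parent            # composed relation is again a matrix

def score(head, M, tail):
    """Toy plausibility score: negative distance between head @ M and tail."""
    return -np.linalg.norm(head @ M - tail)

t = h @ M_grand                          # a tail consistent with "grandparent_of"
print(score(h, M_grand, t))              # ~0.0, i.e. maximally plausible
```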
arXiv Detail & Related papers (2022-04-15T04:53:47Z)
- Learning Representations of Entities and Relations [0.0]
This thesis focuses on improving knowledge graph representation with the aim of tackling the link prediction task.
The first contribution is HypER, a convolutional model which simplifies and improves upon prior convolutional approaches to link prediction.
The second contribution is TuckER, a relatively straightforward linear model, which, at the time of its introduction, obtained state-of-the-art link prediction performance.
The third contribution is MuRP, the first multi-relational graph representation model embedded in hyperbolic space.
arXiv Detail & Related papers (2022-01-31T09:24:43Z)
- A Robust and Generalized Framework for Adversarial Graph Embedding [73.37228022428663]
We propose a robust framework for adversarial graph embedding, named AGE.
AGE generates fake neighbor nodes as enhanced negative samples from an implicit distribution.
Based on this framework, we propose three models to handle three types of graph data.
arXiv Detail & Related papers (2021-05-22T07:05:48Z)
- HyperSAGE: Generalizing Inductive Representation Learning on Hypergraphs [24.737560790401314]
We present HyperSAGE, a novel hypergraph learning framework that uses a two-level neural message passing strategy to accurately and efficiently propagate information through hypergraphs.
We show that HyperSAGE outperforms state-of-the-art hypergraph learning methods on representative benchmark datasets.
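The two-level propagation pattern can be sketched generically as below; HyperSAGE's actual aggregation functions are more general than the plain means used here, so this is only an illustration of the node-to-hyperedge-to-node scheme.

```python
# Generic two-level hypergraph message passing sketch (node -> hyperedge, then
# hyperedge -> node) with plain mean aggregation. HyperSAGE itself uses more
# general aggregators; this only illustrates the two-level propagation pattern.
import numpy as np

# Toy hypergraph: 4 nodes, 2 hyperedges, 3-dim node features.
hyperedges = {"e1": [0, 1, 2], "e2": [2, 3]}
X = np.arange(12, dtype=float).reshape(4, 3)

# Level 1: each hyperedge aggregates the features of its member nodes.
edge_msg = {e: X[members].mean(axis=0) for e, members in hyperedges.items()}

# Level 2: each node aggregates messages from the hyperedges containing it.
X_new = np.zeros_like(X)
for i in range(X.shape[0]):
    incident = [edge_msg[e] for e, members in hyperedges.items() if i in members]
    X_new[i] = np.mean(incident, axis=0)

print(X_new)
```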
arXiv Detail & Related papers (2020-10-09T13:28:06Z)
- Generative Adversarial Zero-Shot Relational Learning for Knowledge Graphs [96.73259297063619]
We consider a novel formulation, zero-shot learning, to avoid this cumbersome curation.
For newly-added relations, we attempt to learn their semantic features from their text descriptions.
We leverage Generative Adversarial Networks (GANs) to establish the connection between the text and knowledge graph domains.
arXiv Detail & Related papers (2020-01-08T01:19:08Z)
This list is automatically generated from the titles and abstracts of the papers on this site.