Knowledgebra: An Algebraic Learning Framework for Knowledge Graph
- URL: http://arxiv.org/abs/2204.07328v1
- Date: Fri, 15 Apr 2022 04:53:47 GMT
- Title: Knowledgebra: An Algebraic Learning Framework for Knowledge Graph
- Authors: Tong Yang, Yifei Wang, Long Sha, Jan Engelbrecht, Pengyu Hong
- Abstract summary: Knowledge graph (KG) representation learning aims to encode entities and relations into dense continuous vector spaces such that the knowledge contained in a dataset can be consistently represented.
We developed a mathematical language for KGs based on an observation of their inherent algebraic structure, which we term Knowledgebra.
We implemented an instantiation model, SemE, using simple matrix semigroups, which exhibits state-of-the-art performance on standard datasets.
- Score: 15.235089177507897
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Knowledge graph (KG) representation learning aims to encode entities and
relations into dense continuous vector spaces such that knowledge contained in
a dataset can be consistently represented. Dense embeddings trained from KG
datasets benefit a variety of downstream tasks such as KG completion and link
prediction. However, existing KG embedding methods fall short of providing a
systematic solution for the global consistency of knowledge representation. We
developed a mathematical language for KGs based on an observation of their
inherent algebraic structure, which we term Knowledgebra. By analyzing
five distinct algebraic properties, we proved that the semigroup is the most
reasonable algebraic structure for the relation embedding of a general
knowledge graph. We implemented an instantiation model, SemE, using simple
matrix semigroups, which exhibits state-of-the-art performance on standard
datasets. Moreover, we proposed a regularization-based method to integrate
chain-like logic rules derived from human knowledge into embedding training,
which further demonstrates the power of the developed language. To the best of
our knowledge, by applying abstract algebra to statistical learning, this work
develops the first formal language for general knowledge graphs and also sheds
light on the problem of neural-symbolic integration from an algebraic
perspective.
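To make the abstract's central idea concrete, below is a minimal, illustrative sketch (not the authors' released SemE code): entities are embedded as vectors and relations as square matrices, so relation composition is matrix multiplication and the relation set is closed under composition, i.e., it forms a semigroup. A chain-like rule such as "parent_of followed by sibling_of implies uncle_of" is then encouraged through a Frobenius-norm regularizer. The scoring function, relation names, and penalty form are assumptions chosen for illustration and may differ from the paper's exact formulation.

```python
import numpy as np

dim = 4                                   # embedding dimension (arbitrary for this demo)
rng = np.random.default_rng(0)

# Entity embeddings: dense vectors in R^dim.
entities = {name: rng.normal(size=dim) for name in ["alice", "bob", "carol"]}

# Relation embeddings: dim x dim matrices. Multiplying two relation matrices
# yields another dim x dim matrix, so the relation set is closed under
# composition -- the semigroup property the abstract argues for.
relations = {name: rng.normal(size=(dim, dim))
             for name in ["parent_of", "sibling_of", "uncle_of"]}

def score(head, rel, tail):
    """Plausibility of the triple (head, rel, tail); closer to 0 is better.
    A linear-map-plus-distance score is assumed here for illustration."""
    return -np.linalg.norm(relations[rel] @ entities[head] - entities[tail])

def compose(rel_a, rel_b):
    """Matrix of the composite relation 'rel_a then rel_b'."""
    return relations[rel_b] @ relations[rel_a]

def chain_rule_penalty(rel_a, rel_b, rel_c):
    """Regularizer for a chain-like rule rel_a o rel_b => rel_c: penalize the
    Frobenius distance between the composed matrix and the matrix of rel_c."""
    return np.linalg.norm(compose(rel_a, rel_b) - relations[rel_c], ord="fro")

# Example usage: score a candidate triple and evaluate the rule regularizer.
print(score("alice", "parent_of", "bob"))
print(chain_rule_penalty("parent_of", "sibling_of", "uncle_of"))
```

Because the composed matrix is itself a valid relation embedding, multi-hop queries and logic-rule constraints stay entirely inside the embedding space, which is the kind of global consistency the abstract emphasizes.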
Related papers
- Knowledge Graph Completion using Structural and Textual Embeddings [0.0]
We propose a relation prediction model that harnesses both textual and structural information within knowledge graphs.
Our approach integrates walks-based embeddings with language model embeddings to effectively represent nodes.
We demonstrate that our model achieves competitive results in the relation prediction task when evaluated on a widely used dataset.
arXiv Detail & Related papers (2024-04-24T21:04:14Z)
- Recognizing Unseen Objects via Multimodal Intensive Knowledge Graph Propagation [68.13453771001522]
We propose a multimodal intensive ZSL framework that matches regions of images with corresponding semantic embeddings.
We conduct extensive experiments and evaluate our model on large-scale real-world data.
arXiv Detail & Related papers (2023-06-14T13:07:48Z)
- Joint Language Semantic and Structure Embedding for Knowledge Graph Completion [66.15933600765835]
We propose to jointly embed the semantics in the natural language description of the knowledge triplets with their structure information.
Our method embeds knowledge graphs for the completion task via fine-tuning pre-trained language models.
Our experiments on a variety of knowledge graph benchmarks have demonstrated the state-of-the-art performance of our method.
arXiv Detail & Related papers (2022-09-19T02:41:02Z)
- Knowledge Graph Augmented Network Towards Multiview Representation Learning for Aspect-based Sentiment Analysis [96.53859361560505]
We propose a knowledge graph augmented network (KGAN) to incorporate external knowledge with explicitly syntactic and contextual information.
KGAN captures the sentiment feature representations from multiple perspectives, i.e., context-, syntax- and knowledge-based.
Experiments on three popular ABSA benchmarks demonstrate the effectiveness and robustness of our KGAN.
arXiv Detail & Related papers (2022-01-13T08:25:53Z)
- DegreEmbed: incorporating entity embedding into logic rule learning for knowledge graph reasoning [7.066269573204757]
Link prediction for knowledge graphs is the task of completing missing facts by reasoning over existing knowledge.
We propose DegreEmbed, a model that combines embedding-based learning and logic rule mining for inferring on KGs.
arXiv Detail & Related papers (2021-12-18T13:38:48Z)
- Learning Algebraic Representation for Systematic Generalization in Abstract Reasoning [109.21780441933164]
We propose a hybrid approach to improve systematic generalization in reasoning.
We showcase a prototype with algebraic representation for the abstract spatial-temporal task of Raven's Progressive Matrices (RPM).
We show that the algebraic representation learned can be decoded by isomorphism to generate an answer.
arXiv Detail & Related papers (2021-11-25T09:56:30Z)
- Knowledge Hypergraph Embedding Meets Relational Algebra [13.945694569456665]
We propose a simple embedding-based model called ReAlE that performs link prediction in knowledge hypergraphs.
We show theoretically that ReAlE is fully expressive and provide proofs and empirical evidence that it can represent a large subset of the primitive relational algebra operations.
arXiv Detail & Related papers (2021-02-18T18:57:44Z)
- Knowledge Graph Embeddings in Geometric Algebras [14.269860621624392]
We introduce a novel geometric algebra-based KG embedding framework, GeomE.
Our framework subsumes several state-of-the-art KG embedding approaches and is advantageous with its ability of modeling various key relation patterns.
Experimental results on multiple benchmark knowledge graphs show that the proposed approach outperforms existing state-of-the-art models for link prediction.
arXiv Detail & Related papers (2020-10-02T13:36:24Z)
- JAKET: Joint Pre-training of Knowledge Graph and Language Understanding [73.43768772121985]
We propose a novel joint pre-training framework, JAKET, to model both the knowledge graph and language.
The knowledge module and language module provide essential information to mutually assist each other.
Our design enables the pre-trained model to easily adapt to unseen knowledge graphs in new domains.
arXiv Detail & Related papers (2020-10-02T05:53:36Z)
- Generative Adversarial Zero-Shot Relational Learning for Knowledge Graphs [96.73259297063619]
We consider a novel formulation, zero-shot learning, to free this cumbersome curation.
For newly-added relations, we attempt to learn their semantic features from their text descriptions.
We leverage Generative Adversarial Networks (GANs) to establish the connection between the text and knowledge graph domains.
arXiv Detail & Related papers (2020-01-08T01:19:08Z)