What is Learned in Knowledge Graph Embeddings?
- URL: http://arxiv.org/abs/2110.09978v1
- Date: Tue, 19 Oct 2021 13:52:11 GMT
- Title: What is Learned in Knowledge Graph Embeddings?
- Authors: Michael R. Douglas, Michael Simkin, Omri Ben-Eliezer, Tianqi Wu, Peter
Chin, Trung V. Dang and Andrew Wood
- Abstract summary: A knowledge graph (KG) is a data structure which represents entities and relations as the vertices and edges of a directed graph with edge types.
We investigate whether learning rules between relations is indeed what drives the performance of embedding-based methods.
Using experiments on synthetic KGs, we show that KG models can learn motifs and how this ability is degraded by non-motif edges.
- Score: 3.224929252256631
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: A knowledge graph (KG) is a data structure which represents entities and
relations as the vertices and edges of a directed graph with edge types. KGs
are an important primitive in modern machine learning and artificial
intelligence. Embedding-based models, such as the seminal TransE [Bordes et
al., 2013] and the recent PairRE [Chao et al., 2020] are among the most popular
and successful approaches for representing KGs and inferring missing edges
(link completion). Their relative success is often credited in the literature
to their ability to learn logical rules between the relations.
In this work, we investigate whether learning rules between relations is
indeed what drives the performance of embedding-based methods. We define motif
learning and two alternative mechanisms, network learning (based only on the
connectivity of the KG, ignoring the relation types), and unstructured
statistical learning (ignoring the connectivity of the graph). Using
experiments on synthetic KGs, we show that KG models can learn motifs and how
this ability is degraded by non-motif (noise) edges. We propose tests to
distinguish the contributions of the three mechanisms to performance, and apply
them to popular KG benchmarks. We also discuss an issue with the standard
performance testing protocol and suggest an improvement.
To appear in the proceedings of Complex Networks 2021.
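As a concrete illustration of the embedding-based link completion the abstract describes, here is a minimal sketch of TransE-style scoring, which models a plausible triple (h, r, t) as a translation h + r ≈ t and ranks candidate tail entities by distance. The embeddings, dimensionality, and entity indices below are toy values for illustration only, not from the paper:

```python
import numpy as np

def transe_score(h, r, t):
    """TransE plausibility: higher is better; a true triple has h + r close to t."""
    return -np.linalg.norm(h + r - t, ord=1)

def rank_true_tail(h, r, entity_embeddings, true_tail_idx):
    """Score every entity as a candidate tail for (h, r, ?) and
    return the rank of the true tail (1 = ranked best)."""
    scores = np.array([transe_score(h, r, e) for e in entity_embeddings])
    order = np.argsort(-scores)  # indices sorted best-first
    return int(np.where(order == true_tail_idx)[0][0]) + 1

# Toy KG with 3 entities in 2-D; relation r translates entity 0 exactly onto entity 1.
E = np.array([[0.0, 0.0], [1.0, 1.0], [5.0, 5.0]])
r = np.array([1.0, 1.0])
rank = rank_true_tail(E[0], r, E, true_tail_idx=1)
```

Link-completion benchmarks aggregate such ranks over held-out test triples (e.g. as mean reciprocal rank or Hits@k), which is the "standard performance testing protocol" the abstract says it revisits.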
Related papers
- EntailE: Introducing Textual Entailment in Commonsense Knowledge Graph Completion [54.12709176438264]
Commonsense knowledge graphs (CSKGs) utilize free-form text to represent named entities, short phrases, and events as their nodes.
Current methods leverage semantic similarities to increase the graph density, but the semantic plausibility of the nodes and their relations is under-explored.
We propose to adopt textual entailment to find implicit entailment relations between CSKG nodes, to effectively densify the subgraph connecting nodes within the same conceptual class.
arXiv Detail & Related papers (2024-02-15T02:27:23Z)
- Knowledge Graph Embedding: An Overview [42.16033541753744]
We make a comprehensive overview of the current state of research in Knowledge Graph completion.
We focus on two main branches of KG embedding (KGE) design: 1) distance-based methods and 2) semantic matching-based methods.
Next, we delve into CompoundE and CompoundE3D, which draw inspiration from 2D and 3D affine operations.
arXiv Detail & Related papers (2023-09-21T21:52:42Z)
- A Survey of Knowledge Graph Reasoning on Graph Types: Static, Dynamic, and Multimodal [57.8455911689554]
Knowledge graph reasoning (KGR) aims to deduce new facts from existing facts based on mined logic rules underlying knowledge graphs (KGs).
It has been proven to significantly benefit the usage of KGs in many AI applications, such as question answering and recommendation systems.
arXiv Detail & Related papers (2022-12-12T08:40:04Z)
- I Know What You Do Not Know: Knowledge Graph Embedding via Co-distillation Learning [16.723470319188102]
Knowledge graph embedding seeks to learn vector representations for entities and relations.
Recent studies have used pre-trained language models to learn embeddings based on the textual information of entities and relations.
We propose CoLE, a Co-distillation Learning method for KG Embedding that exploits the complement of graph structures and text information.
arXiv Detail & Related papers (2022-08-21T07:34:37Z)
- Explainable Sparse Knowledge Graph Completion via High-order Graph Reasoning Network [111.67744771462873]
This paper proposes a novel explainable model for sparse Knowledge Graphs (KGs).
It combines high-order reasoning into a graph convolutional network, namely HoGRN.
It can not only improve the generalization ability to mitigate the information insufficiency issue but also provide interpretability.
arXiv Detail & Related papers (2022-07-14T10:16:56Z)
- DegreEmbed: incorporating entity embedding into logic rule learning for knowledge graph reasoning [7.066269573204757]
Link prediction for knowledge graphs is the task aiming to complete missing facts by reasoning based on the existing knowledge.
We propose DegreEmbed, a model that combines embedding-based learning and logic rule mining for inferring on KGs.
arXiv Detail & Related papers (2021-12-18T13:38:48Z)
- EngineKGI: Closed-Loop Knowledge Graph Inference [37.15381932994768]
EngineKGI is a novel closed-loop KG inference framework.
It combines KGE and rule learning to complement each other in a closed-loop pattern.
Our model outperforms other baselines on link prediction tasks.
arXiv Detail & Related papers (2021-12-02T08:02:59Z)
- Learning Intents behind Interactions with Knowledge Graph for Recommendation [93.08709357435991]
Knowledge graph (KG) plays an increasingly important role in recommender systems.
Existing GNN-based models fail to identify user-item relations at the fine-grained level of intents.
We propose a new model, Knowledge Graph-based Intent Network (KGIN).
arXiv Detail & Related papers (2021-02-14T03:21:36Z)
- RelWalk A Latent Variable Model Approach to Knowledge Graph Embedding [50.010601631982425]
This paper extends the random walk model (Arora et al., 2016a) of word embeddings to Knowledge Graph Embeddings (KGEs).
We derive a scoring function that evaluates the strength of a relation R between two entities h (head) and t (tail).
We propose a learning objective motivated by the theoretical analysis to learn KGEs from a given knowledge graph.
arXiv Detail & Related papers (2021-01-25T13:31:29Z)
- Generative Adversarial Zero-Shot Relational Learning for Knowledge Graphs [96.73259297063619]
We consider a novel formulation, zero-shot learning, to free this cumbersome curation.
For newly-added relations, we attempt to learn their semantic features from their text descriptions.
We leverage Generative Adversarial Networks (GANs) to establish the connection between the text and knowledge graph domains.
arXiv Detail & Related papers (2020-01-08T01:19:08Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.