How Does Knowledge Graph Embedding Extrapolate to Unseen Data: a
Semantic Evidence View
- URL: http://arxiv.org/abs/2109.11800v1
- Date: Fri, 24 Sep 2021 08:17:02 GMT
- Title: How Does Knowledge Graph Embedding Extrapolate to Unseen Data: a
Semantic Evidence View
- Authors: Ren Li, Yanan Cao, Qiannan Zhu, Guanqun Bi, Fang Fang, Yi Liu, Qian Li
- Abstract summary: We study how Knowledge Graph Embedding (KGE) extrapolates to unseen data.
We also propose a novel GNN-based KGE model, called Semantic Evidence aware Graph Neural Network (SE-GNN).
- Score: 13.575052133743505
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Knowledge Graph Embedding (KGE) aims to learn representations for
entities and relations. Most KGE models have achieved great success, especially
in extrapolation scenarios: given an unseen triple (h, r, t), a trained model
can still correctly predict t from (h, r, ?), or h from (?, r, t), and such
extrapolation ability is impressive. However, most existing KGE works focus on
the design of delicate triple modeling functions, which mainly tell us how to
measure the plausibility of observed triples, but we have limited understanding
of why these methods can extrapolate to unseen data and what the important
factors are that help KGE extrapolate. Therefore, in this work, we study KGE
extrapolation from a data-relevant view, focusing on two problems: 1. How does
KGE extrapolate to unseen data? 2. How to design a KGE model with better
extrapolation ability? For problem 1, we first discuss the impact factors for
extrapolation and, at the relation, entity, and triple levels respectively,
propose three Semantic Evidences (SEs), which can be observed from the training
set and provide important semantic information for extrapolation to unseen
data. We then verify the effectiveness of SEs through extensive experiments on
several typical KGE methods and demonstrate that SEs play an important role in
understanding the extrapolation ability of KGE. For problem 2, to make better
use of the SE information for more extrapolative knowledge representation, we
propose a novel GNN-based KGE model, called Semantic Evidence aware Graph
Neural Network (SE-GNN). Finally, through extensive experiments on the
FB15k-237 and WN18RR datasets, we show that SE-GNN achieves state-of-the-art
performance on the Knowledge Graph Completion task and exhibits better
extrapolation ability.
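To make the extrapolation setting above concrete, here is a minimal TransE-style sketch of a triple modeling function and of tail prediction for a query (h, r, ?). It illustrates generic KGE scoring, not the paper's SE-GNN model; all sizes and indices are hypothetical.

```python
import numpy as np

# Minimal TransE-style sketch (illustrative only, not SE-GNN):
# a triple (h, r, t) is scored by how well h + r approximates t in
# embedding space; a smaller distance means a more plausible triple.

rng = np.random.default_rng(0)
dim, n_entities, n_relations = 16, 5, 2
E = rng.normal(size=(n_entities, dim))   # entity embeddings (learned in practice)
R = rng.normal(size=(n_relations, dim))  # relation embeddings (learned in practice)

def score(h, r, t):
    """Plausibility of (h, r, t): negative L2 distance of h + r to t."""
    return -np.linalg.norm(E[h] + R[r] - E[t])

def predict_tail(h, r):
    """Rank all entities as candidate tails for the query (h, r, ?)."""
    scores = np.array([score(h, r, t) for t in range(n_entities)])
    return np.argsort(-scores)           # best candidate first

# Even a triple never seen during training gets a score via the learned
# embeddings -- this is the extrapolation behavior the paper studies.
print(predict_tail(h=0, r=1))
```

The paper's question is why such a ranking can succeed on triples never seen in training; its answer points to the Semantic Evidences observable in the training set.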
Related papers
- KGEx: Explaining Knowledge Graph Embeddings via Subgraph Sampling and
Knowledge Distillation [6.332573781489264]
We present KGEx, a novel method that explains individual link predictions by drawing inspiration from surrogate models research.
Given a target triple to predict, KGEx trains surrogate KGE models that we use to identify important training triples.
We conduct extensive experiments on two publicly available datasets, to demonstrate that KGEx is capable of providing explanations faithful to the black-box model.
arXiv Detail & Related papers (2023-10-02T10:20:24Z) - A Comprehensive Study on Knowledge Graph Embedding over Relational
Patterns Based on Rule Learning [49.09125100268454]
Knowledge Graph Embedding (KGE) has proven to be an effective approach to solving the Knowledge Graph Completion (KGC) task.
Relational patterns are an important factor in the performance of KGE models.
We introduce a training-free method to enhance KGE models' performance over various relational patterns.
arXiv Detail & Related papers (2023-08-15T17:30:57Z) - ExpressivE: A Spatio-Functional Embedding For Knowledge Graph Completion [78.8942067357231]
ExpressivE embeds pairs of entities as points and relations as hyper-parallelograms in the virtual triple space.
We show that ExpressivE is competitive with state-of-the-art KGEs and even significantly outperforms them on WN18RR.
arXiv Detail & Related papers (2022-06-08T23:34:39Z) - KQGC: Knowledge Graph Embedding with Smoothing Effects of Graph
Convolutions for Recommendation [3.264007084815591]
We propose a new model for recommender systems named Knowledge Query-based Graph Convolution (KQGC).
KQGC focuses on the smoothing, and leverages a simple linear graph convolution for smoothing KGE.
We apply the proposed KQGC to a recommendation task that aims to identify prospective users for specific products.
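As a rough illustration of the "simple linear graph convolution for smoothing KGE" mentioned above (a toy sketch under assumed shapes, not the KQGC model itself):

```python
import numpy as np

# Toy sketch: smooth pretrained KG embeddings with one linear graph
# convolution, replacing each entity embedding by the degree-normalized
# average of itself and its neighbors (no nonlinearity, no parameters).

A = np.array([[0, 1, 1, 0],               # hypothetical entity adjacency
              [1, 0, 0, 1],
              [1, 0, 0, 0],
              [0, 1, 0, 0]], dtype=float)
E = np.random.default_rng(0).normal(size=(4, 8))  # pretrained KGE vectors

A_hat = A + np.eye(len(A))                # add self-loops
D_inv = np.diag(1.0 / A_hat.sum(axis=1))  # inverse degree matrix
E_smooth = D_inv @ A_hat @ E              # one linear smoothing step
```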
arXiv Detail & Related papers (2022-05-23T09:34:06Z) - CAKE: A Scalable Commonsense-Aware Framework For Multi-View Knowledge
Graph Completion [43.172893405453266]
Previous knowledge graph embedding techniques suffer from invalid negative sampling and the uncertainty of fact-view link prediction.
We propose a novel and scalable Commonsense-Aware Knowledge Embedding (CAKE) framework to automatically extract commonsense from factual triples with entity concepts.
arXiv Detail & Related papers (2022-02-25T03:30:22Z) - Detecting Owner-member Relationship with Graph Convolution Network in
Fisheye Camera System [9.665475078766017]
We propose an innovative relationship prediction method, DeepWORD, by designing a graph convolutional network (GCN).
Experiments show that the proposed method achieves state-of-the-art accuracy and real-time performance.
arXiv Detail & Related papers (2022-01-28T13:12:27Z) - Towards Open-World Feature Extrapolation: An Inductive Graph Learning
Approach [80.8446673089281]
We propose a new learning paradigm with graph representation and learning.
Our framework contains two modules: 1) a backbone network (e.g., feedforward neural nets) as a lower model takes features as input and outputs predicted labels; 2) a graph neural network as an upper model learns to extrapolate embeddings for new features via message passing over a feature-data graph built from observed data.
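A toy sketch of the second module's idea, under assumed shapes and hypothetical names (not the paper's implementation): feature embeddings are produced by aggregating the representations of the data instances a feature occurs in, so an entirely new feature can be embedded the same way.

```python
import numpy as np

# Toy illustration of the "upper model" idea: embed features by message
# passing over a feature-data bipartite graph, so a feature never seen
# during training can still receive an embedding from the instances it
# co-occurs with. Shapes and names here are hypothetical.

rng = np.random.default_rng(0)
n_inst, n_feat, dim = 6, 4, 8
X = (rng.random((n_inst, n_feat)) > 0.5).astype(float)  # observed data matrix
H_inst = rng.normal(size=(n_inst, dim))  # instance states (e.g. from the backbone)

def feature_embedding(column):
    """Mean-aggregate the states of instances connected to this feature."""
    mask = column != 0
    return H_inst[mask].mean(axis=0) if mask.any() else np.zeros(dim)

E_feat = np.stack([feature_embedding(X[:, j]) for j in range(n_feat)])

# An unseen feature column is embedded the same way at test time.
x_new = (rng.random(n_inst) > 0.5).astype(float)
e_new = feature_embedding(x_new)
```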
arXiv Detail & Related papers (2021-10-09T09:02:45Z) - How Knowledge Graph and Attention Help? A Quantitative Analysis into
Bag-level Relation Extraction [66.09605613944201]
We quantitatively evaluate the effect of attention and Knowledge Graph on bag-level relation extraction (RE).
We find that (1) higher attention accuracy may lead to worse performance as it may harm the model's ability to extract entity mention features; (2) the performance of attention is largely influenced by various noise distribution patterns; and (3) KG-enhanced attention indeed improves RE performance, while not through enhanced attention but by incorporating entity prior.
arXiv Detail & Related papers (2021-07-26T09:38:28Z) - RelWalk A Latent Variable Model Approach to Knowledge Graph Embedding [50.010601631982425]
This paper extends the random walk model (Arora et al., 2016a) of word embeddings to Knowledge Graph Embeddings (KGEs).
We derive a scoring function that evaluates the strength of a relation R between two entities h (head) and t (tail).
We propose a learning objective motivated by the theoretical analysis to learn KGEs from a given knowledge graph.
arXiv Detail & Related papers (2021-01-25T13:31:29Z) - Robust Optimization as Data Augmentation for Large-scale Graphs [117.2376815614148]
We propose FLAG (Free Large-scale Adversarial Augmentation on Graphs), which iteratively augments node features with gradient-based adversarial perturbations during training.
FLAG is a general-purpose approach for graph data, which universally works in node classification, link prediction, and graph classification tasks.
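A highly simplified sketch of the gradient-based perturbation loop (logistic regression with hand-written gradients stands in for a GNN; step sizes and names are illustrative assumptions, not FLAG's actual settings):

```python
import numpy as np

# Simplified sketch of gradient-based adversarial feature augmentation in
# the spirit of FLAG: an inner loop ascends on a feature perturbation to
# increase the loss, then the model is trained on the perturbed features.

rng = np.random.default_rng(0)
X = rng.normal(size=(32, 16))               # node features (toy data)
y = (rng.random(32) > 0.5).astype(float)    # binary labels
w = np.zeros(16)                            # model weights

def gradients(X_pert):
    """Gradients of the logistic loss w.r.t. weights and input features."""
    p = 1.0 / (1.0 + np.exp(-(X_pert @ w)))
    g_w = X_pert.T @ (p - y) / len(y)
    g_x = np.outer(p - y, w) / len(y)
    return g_w, g_x

lr, ascent_step, inner_steps = 0.1, 0.01, 3
for epoch in range(100):
    delta = np.zeros_like(X)
    for _ in range(inner_steps):            # maximize loss w.r.t. perturbation
        _, g_x = gradients(X + delta)
        delta += ascent_step * np.sign(g_x)
    g_w, _ = gradients(X + delta)           # minimize loss w.r.t. weights
    w -= lr * g_w
```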
arXiv Detail & Related papers (2020-10-19T21:51:47Z) - An Evaluation of Knowledge Graph Embeddings for Autonomous Driving Data:
Experience and Practice [0.0]
The autonomous driving (AD) industry is exploring the use of knowledge graphs (KGs) to manage the vast amount of heterogeneous data generated from vehicular sensors.
Recent work on knowledge graph embeddings (KGEs) has shown to improve the predictive performance of machine learning models.
This research explores the generation and evaluation of KGEs for autonomous driving data.
arXiv Detail & Related papers (2020-02-29T20:33:48Z)