KGEx: Explaining Knowledge Graph Embeddings via Subgraph Sampling and
Knowledge Distillation
- URL: http://arxiv.org/abs/2310.01065v1
- Date: Mon, 2 Oct 2023 10:20:24 GMT
- Title: KGEx: Explaining Knowledge Graph Embeddings via Subgraph Sampling and
Knowledge Distillation
- Authors: Vasileios Baltatzis, Luca Costabello
- Abstract summary: We present KGEx, a novel method that explains individual link predictions by drawing inspiration from surrogate models research.
Given a target triple to predict, KGEx trains surrogate KGE models that we use to identify important training triples.
We conduct extensive experiments on two publicly available datasets to demonstrate that KGEx provides explanations faithful to the black-box model.
- Score: 6.332573781489264
- License: http://creativecommons.org/licenses/by-nc-sa/4.0/
- Abstract: Although knowledge graph embeddings (KGE) are the go-to choice for
link prediction on knowledge graphs, their interpretability remains
relatively unexplored. We present KGEx, a novel post-hoc method that explains
individual link predictions by drawing inspiration from surrogate models
research. Given a target triple to predict, KGEx trains surrogate KGE models
that we use to identify important training triples. To gauge the impact of a
training triple, we sample random portions of the target triple neighborhood
and we train multiple surrogate KGE models on each of them. To ensure
faithfulness, each surrogate is trained by distilling knowledge from the
original KGE model. We then assess how well surrogates predict the target
triple being explained, the intuition being that those leading to faithful
predictions have been trained on impactful neighborhood samples. Under this
assumption, we then harvest triples that appear frequently across impactful
neighborhoods. We conduct extensive experiments on two publicly available
datasets, to demonstrate that KGEx is capable of providing explanations
faithful to the black-box model.
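The pipeline described in the abstract (sample neighborhood subsets, train a distilled surrogate per sample, rank samples by how faithfully their surrogate scores the target, then harvest frequent triples) can be sketched as follows. Everything here is illustrative: the toy graph, the `teacher_score` stand-in for the black-box KGE model, and the surrogate shortcut (a real KGEx surrogate is a small KGE model trained with a distillation loss, not a mean of teacher scores) are all assumptions, not the authors' implementation.

```python
import random
from collections import Counter

# Toy knowledge graph: (head, relation, tail) triples (illustrative only).
KG = [
    ("anna", "works_at", "acme"),
    ("anna", "lives_in", "london"),
    ("bob", "works_at", "acme"),
    ("bob", "knows", "anna"),
    ("acme", "based_in", "london"),
    ("carol", "lives_in", "paris"),
]

def teacher_score(triple):
    """Stand-in for the black-box KGE model's plausibility score in [0, 1)."""
    return (hash(triple) % 1000) / 1000.0

def sample_neighborhood(kg, target, k):
    """Randomly sample up to k triples touching the target's head or tail."""
    h, _, t = target
    neigh = [tr for tr in kg if h in (tr[0], tr[2]) or t in (tr[0], tr[2])]
    return random.sample(neigh, min(k, len(neigh)))

def surrogate_target_score(sample, target):
    """Toy surrogate: KGEx trains a surrogate KGE model per sample by
    distilling the teacher; here we approximate the surrogate's prediction
    on the target by the mean teacher score over its training sample."""
    if not sample:
        return 0.0
    return sum(teacher_score(tr) for tr in sample) / len(sample)

def kgex_explain(kg, target, n_samples=30, k=3, top_m=5, top_e=3):
    """Rank neighborhood samples by surrogate faithfulness on the target,
    then harvest triples that appear frequently in the best samples."""
    scored = []
    for _ in range(n_samples):
        sample = sample_neighborhood(kg, target, k)
        scored.append((surrogate_target_score(sample, target), sample))
    scored.sort(key=lambda pair: pair[0], reverse=True)
    counts = Counter(tr for _, sample in scored[:top_m] for tr in sample)
    return [tr for tr, _ in counts.most_common(top_e)]
```

For example, `kgex_explain(KG, ("anna", "works_at", "acme"))` returns up to three training triples from anna's and acme's neighborhood, ranked by how often they occur in the most faithful samples.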
Related papers
- Few-shot Knowledge Graph Relational Reasoning via Subgraph Adaptation [51.47994645529258]
Few-shot Knowledge Graph (KG) Reasoning aims to predict unseen triplets (i.e., query triplets) for rare relations in KGs.
We propose SAFER (Subgraph Adaptation for Few-shot Reasoning), a novel approach that effectively adapts the information in contextualized graphs to various subgraphs.
arXiv Detail & Related papers (2024-06-19T21:40:35Z)
- Less is More: One-shot Subgraph Reasoning on Large-scale Knowledge Graphs [49.547988001231424]
We propose the one-shot-subgraph link prediction to achieve efficient and adaptive prediction.
The design principle is that, instead of acting directly on the whole KG, the prediction procedure is decoupled into two steps.
We achieve improved efficiency and leading performance on five large-scale benchmarks.
arXiv Detail & Related papers (2024-03-15T12:00:12Z)
- CausE: Towards Causal Knowledge Graph Embedding [13.016173217017597]
Knowledge graph embedding (KGE) focuses on representing the entities and relations of a knowledge graph (KG) into the continuous vector spaces.
We build the new paradigm of KGE in the context of causality and embedding disentanglement.
We propose a Causality-enhanced knowledge graph Embedding (CausE) framework.
arXiv Detail & Related papers (2023-07-21T14:25:39Z)
- Multi-Aspect Explainable Inductive Relation Prediction by Sentence Transformer [60.75757851637566]
We introduce the concepts of relation path coverage and relation path confidence to filter out unreliable paths prior to model training, thereby improving model performance.
We propose Knowledge Reasoning Sentence Transformer (KRST) to predict inductive relations in knowledge graphs.
arXiv Detail & Related papers (2023-01-04T15:33:49Z)
- Explaining Link Predictions in Knowledge Graph Embedding Models with Influential Examples [8.892798396214065]
We study the problem of explaining link predictions in the Knowledge Graph Embedding (KGE) models.
We propose an example-based approach that exploits the latent space representation of nodes and edges in a knowledge graph to explain predictions.
arXiv Detail & Related papers (2022-12-05T23:19:02Z)
- Knowledge Graph Completion with Pre-trained Multimodal Transformer and Twins Negative Sampling [13.016173217017597]
We propose a VisualBERT-enhanced Knowledge Graph Completion model (VBKGC for short).
VBKGC could capture deeply fused multimodal information for entities and integrate them into the KGC model.
We conduct extensive experiments to show the outstanding performance of VBKGC on the link prediction task.
arXiv Detail & Related papers (2022-09-15T06:50:31Z)
- Repurposing Knowledge Graph Embeddings for Triple Representation via Weak Supervision [77.34726150561087]
Current methods learn triple embeddings from scratch without utilizing entity and predicate embeddings from pre-trained models.
We develop a method for automatically sampling triples from a knowledge graph and estimating their pairwise similarities from pre-trained embedding models.
These pairwise similarity scores are then fed to a Siamese-like neural architecture to fine-tune triple representations.
arXiv Detail & Related papers (2022-08-22T14:07:08Z)
- ExpressivE: A Spatio-Functional Embedding For Knowledge Graph Completion [78.8942067357231]
ExpressivE embeds pairs of entities as points and relations as hyper-parallelograms in the virtual triple space.
We show that ExpressivE is competitive with state-of-the-art KGEs and even significantly outperforms them on WN18RR.
arXiv Detail & Related papers (2022-06-08T23:34:39Z)
- CAKE: A Scalable Commonsense-Aware Framework For Multi-View Knowledge Graph Completion [43.172893405453266]
Previous knowledge graph embedding techniques suffer from invalid negative sampling and the uncertainty of fact-view link prediction.
We propose a novel and scalable Commonsense-Aware Knowledge Embedding (CAKE) framework to automatically extract commonsense from factual triples with entity concepts.
arXiv Detail & Related papers (2022-02-25T03:30:22Z)
- How Does Knowledge Graph Embedding Extrapolate to Unseen Data: a Semantic Evidence View [13.575052133743505]
We study how Knowledge Graph Embedding (KGE) extrapolates to unseen data.
We also propose a novel GNN-based KGE model, called Semantic Evidence aware Graph Neural Network (SE-GNN)
arXiv Detail & Related papers (2021-09-24T08:17:02Z)
- RelWalk: A Latent Variable Model Approach to Knowledge Graph Embedding [50.010601631982425]
This paper extends the random walk model (Arora et al., 2016a) of word embeddings to Knowledge Graph Embeddings (KGEs)
We derive a scoring function that evaluates the strength of a relation R between two entities h (head) and t (tail)
We propose a learning objective motivated by the theoretical analysis to learn KGEs from a given knowledge graph.
arXiv Detail & Related papers (2021-01-25T13:31:29Z)
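RelWalk derives its own scoring function from the latent-variable analysis. As a minimal illustration of what a KGE scoring function for the strength of a relation R between a head h and a tail t looks like, here is the standard DistMult score, a generic bilinear scorer, explicitly not RelWalk's derived function; the toy embedding vectors are made-up values:

```python
def distmult_score(h_vec, r_vec, t_vec):
    """DistMult: s(h, R, t) = sum_i h_i * r_i * t_i.
    A higher score means relation R is more plausible between h and t."""
    return sum(h * r * t for h, r, t in zip(h_vec, r_vec, t_vec))

# Toy embeddings; in practice these are learned from the knowledge graph.
head = [0.5, 1.0, -0.2]
rel  = [1.0, 0.5, 2.0]
tail = [0.8, -0.3, 0.1]

score = distmult_score(head, rel, tail)
```

Note the design consequence of the elementwise product: DistMult scores (h, R, t) and (t, R, h) identically, so it cannot model asymmetric relations, which is one reason papers like RelWalk derive richer scoring functions.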