Knowledge Graph Completion with Counterfactual Augmentation
- URL: http://arxiv.org/abs/2302.13083v1
- Date: Sat, 25 Feb 2023 14:08:15 GMT
- Title: Knowledge Graph Completion with Counterfactual Augmentation
- Authors: Heng Chang, Jie Cai, Jia Li
- Abstract summary: We introduce a counterfactual question: "would the relation still exist if the neighborhood of entities became different from observation?"
With a carefully designed instantiation of a causal model on the knowledge graph, we generate the counterfactual relations to answer the question.
We incorporate the created counterfactual relations into a GNN-based KGC framework to augment its learning of entity pair representations.
- Score: 23.20561746976504
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: In recent years, Graph Neural Networks (GNNs) have demonstrated great
success in Knowledge Graph Completion (KGC) by modeling how entities and
relations interact. However, most of them are designed to learn from the
observed graph structure, which exhibits an imbalanced relation distribution
during training. Motivated by the causal relationships among the entities on a
knowledge graph, we examine this defect through a counterfactual question:
"would the relation still exist if the neighborhood of the entities became
different from what was observed?". With a carefully designed instantiation of
a causal model on the knowledge graph, we generate counterfactual relations to
answer this question by regarding the representation of an entity pair given a
relation as the context, the structural information of the relation-aware
neighborhood as the treatment, and the validity of the composed triplet as the
outcome. Furthermore, we incorporate the created counterfactual relations into
a GNN-based KGC framework to augment its learning of entity pair
representations from both the observed and the counterfactual relations.
Experiments on benchmarks show that our proposed method outperforms existing
methods on the KGC task, achieving new state-of-the-art results. Moreover, we
demonstrate that the proposed counterfactual relation-based augmentation also
enhances the interpretability of the GNN-based framework through path
interpretations of its predictions.
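To make the context/treatment/outcome framing above concrete, below is a minimal PyTorch sketch of counterfactual augmentation for KGC. It is an illustration under simplifying assumptions, not the authors' implementation: the entity-pair encoder is a plain embedding lookup standing in for the paper's GNN-based encoder, and the names (`PairScorer`, `counterfactual_treatment`, the `alpha` weight, the shuffled-neighborhood heuristic) are hypothetical.

```python
# Minimal, illustrative sketch of counterfactual augmentation for KGC.
# NOT the paper's implementation: a toy embedding lookup stands in for the
# GNN-based entity-pair encoder, and the counterfactual "treatment" is a
# shuffled neighborhood encoding rather than one derived from a causal model.
import torch
import torch.nn as nn
import torch.nn.functional as F


class PairScorer(nn.Module):
    """Scores a (head, relation, tail) triplet from a 'context' (entity-pair
    representation given the relation) and a 'treatment' (relation-aware
    neighborhood encoding); the score is the 'outcome' (triplet validity)."""

    def __init__(self, n_entities: int, n_relations: int, dim: int = 64):
        super().__init__()
        self.ent = nn.Embedding(n_entities, dim)   # stands in for GNN node states
        self.rel = nn.Embedding(n_relations, dim)
        self.nbr = nn.Embedding(n_entities, dim)   # toy neighborhood encoding
        self.mlp = nn.Sequential(nn.Linear(4 * dim, dim), nn.ReLU(), nn.Linear(dim, 1))

    def forward(self, h, r, t, treatment):
        context = torch.cat([self.ent(h), self.rel(r), self.ent(t)], dim=-1)
        return self.mlp(torch.cat([context, treatment], dim=-1)).squeeze(-1)

    def factual_treatment(self, h, t):
        # observed neighborhood information of the entity pair
        return self.nbr(h) + self.nbr(t)

    def counterfactual_treatment(self, h, t):
        # "what if the neighborhood were different?": here simply a shuffled
        # neighborhood encoding from other pairs in the batch
        perm = torch.randperm(h.size(0))
        return self.nbr(h[perm]) + self.nbr(t[perm])


def training_step(model, h, r, t, neg_t, alpha: float = 0.5):
    """Joint loss over observed and counterfactual relations
    (alpha is a hypothetical trade-off weight)."""
    obs = model.factual_treatment(h, t)
    cf = model.counterfactual_treatment(h, t)
    # observed triplets should score higher than corrupted ones
    pos = model(h, r, t, obs)
    neg = model(h, r, neg_t, model.factual_treatment(h, neg_t))
    factual_loss = F.margin_ranking_loss(pos, neg, torch.ones_like(pos), margin=1.0)
    # counterfactual relations act as extra (pseudo-labelled) training signal
    cf_score = model(h, r, t, cf)
    cf_loss = F.binary_cross_entropy_with_logits(cf_score, torch.sigmoid(pos).detach())
    return factual_loss + alpha * cf_loss


if __name__ == "__main__":
    torch.manual_seed(0)
    model = PairScorer(n_entities=100, n_relations=10)
    opt = torch.optim.Adam(model.parameters(), lr=1e-3)
    h = torch.randint(0, 100, (32,))
    r = torch.randint(0, 10, (32,))
    t = torch.randint(0, 100, (32,))
    neg_t = torch.randint(0, 100, (32,))
    loss = training_step(model, h, r, t, neg_t)
    loss.backward()
    opt.step()
    print(float(loss))
```

The design choice mirrored from the abstract is that the factual and counterfactual relations share the same context (the entity pair given the relation) and differ only in the treatment (the neighborhood encoding), so the scorer is encouraged to learn entity pair representations that do not over-rely on the imbalanced observed neighborhood structure.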
Related papers
- Introducing Diminutive Causal Structure into Graph Representation Learning [19.132025125620274]
We introduce a novel method that enables Graph Neural Networks (GNNs) to glean insights from specialized diminutive causal structures.
Our method specifically extracts causal knowledge from the model representation of these diminutive causal structures.
arXiv Detail & Related papers (2024-06-13T00:18:20Z) - Relating-Up: Advancing Graph Neural Networks through Inter-Graph Relationships [17.978546172777342]
Graph Neural Networks (GNNs) have excelled in learning from graph-structured data.
Despite their successes, GNNs are limited by neglecting the context of relationships across graphs.
We introduce Relating-Up, a plug-and-play module that enhances GNNs by exploiting inter-graph relationships.
arXiv Detail & Related papers (2024-05-07T02:16:54Z) - DGNN: Decoupled Graph Neural Networks with Structural Consistency between Attribute and Graph Embedding Representations [62.04558318166396]
Graph neural networks (GNNs) demonstrate a robust capability for representation learning on graphs with complex structures.
A novel GNNs framework, dubbed Decoupled Graph Neural Networks (DGNN), is introduced to obtain a more comprehensive embedding representation of nodes.
Experimental results on several graph benchmark datasets verify DGNN's superiority on the node classification task.
arXiv Detail & Related papers (2024-01-28T06:43:13Z) - zrLLM: Zero-Shot Relational Learning on Temporal Knowledge Graphs with Large Language Models [33.10218179341504]
We use large language models to generate relation representations for embedding-based TKGF methods.
We show that our approach helps TKGF models to achieve much better performance in forecasting the facts with previously unseen relations.
arXiv Detail & Related papers (2023-11-15T21:25:15Z) - Learning Complete Topology-Aware Correlations Between Relations for Inductive Link Prediction [121.65152276851619]
We show that semantic correlations between relations are inherently edge-level and entity-independent.
We propose a novel subgraph-based method, namely TACO, to model Topology-Aware COrrelations between relations.
To further exploit the potential of RCN, we propose a Complete Common Neighbor induced subgraph.
arXiv Detail & Related papers (2023-09-20T08:11:58Z) - Message Intercommunication for Inductive Relation Reasoning [49.731293143079455]
We develop a novel inductive relation reasoning model called MINES.
We introduce a Message Intercommunication mechanism on the Neighbor-Enhanced Subgraph.
Our experiments show that MINES outperforms existing state-of-the-art models.
arXiv Detail & Related papers (2023-05-23T13:51:46Z) - Topology-Aware Correlations Between Relations for Inductive Link Prediction in Knowledge Graphs [41.38172189254483]
TACT is inspired by the observation that the semantic correlation between two relations is highly correlated to their topological structure in knowledge graphs.
We categorize all relation pairs into several topological patterns, and then propose a Relational Correlation Network (RCN) to learn the importance of the different patterns for inductive link prediction.
Experiments demonstrate that TACT can effectively model semantic correlations between relations, and significantly outperforms existing state-of-the-art methods on benchmark datasets.
arXiv Detail & Related papers (2021-03-05T13:00:10Z) - RelWalk A Latent Variable Model Approach to Knowledge Graph Embedding [50.010601631982425]
This paper extends the random walk model (Arora et al., 2016a) of word embeddings to Knowledge Graph Embeddings (KGEs).
We derive a scoring function that evaluates the strength of a relation R between two entities h (head) and t (tail).
We propose a learning objective motivated by the theoretical analysis to learn KGEs from a given knowledge graph.
arXiv Detail & Related papers (2021-01-25T13:31:29Z) - Learning to Decouple Relations: Few-Shot Relation Classification with Entity-Guided Attention and Confusion-Aware Training [49.9995628166064]
We propose CTEG, a model equipped with two mechanisms to learn to decouple easily-confused relations.
On the one hand, an EGA mechanism is introduced to guide the attention to filter out information causing confusion.
On the other hand, a Confusion-Aware Training (CAT) method is proposed to explicitly learn to distinguish relations.
arXiv Detail & Related papers (2020-10-21T11:07:53Z) - Tensor Graph Convolutional Networks for Multi-relational and Robust Learning [74.05478502080658]
This paper introduces a tensor-graph convolutional network (TGCN) for scalable semi-supervised learning (SSL) from data associated with a collection of graphs that are represented by a tensor.
The proposed architecture achieves markedly improved performance relative to standard GCNs, copes with state-of-the-art adversarial attacks, and leads to remarkable SSL performance over protein-to-protein interaction networks.
arXiv Detail & Related papers (2020-03-15T02:33:21Z)
This list is automatically generated from the titles and abstracts of the papers on this site.