Inductive Learning on Commonsense Knowledge Graph Completion
- URL: http://arxiv.org/abs/2009.09263v2
- Date: Wed, 17 Feb 2021 19:48:13 GMT
- Title: Inductive Learning on Commonsense Knowledge Graph Completion
- Authors: Bin Wang, Guangtao Wang, Jing Huang, Jiaxuan You, Jure Leskovec, C.-C. Jay Kuo
- Abstract summary: Commonsense knowledge graph (CKG) is a special type of knowledge graph (KG) where entities are composed of free-form text.
We propose to study the inductive learning setting for CKG completion, where unseen entities may be present at test time.
InductivE significantly outperforms state-of-the-art baselines in both standard and inductive settings on ATOMIC and ConceptNet benchmarks.
- Score: 89.72388313527296
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: A commonsense knowledge graph (CKG) is a special type of knowledge
graph (KG) in which entities are composed of free-form text. However, most
existing CKG completion methods focus on the setting where all entities are
present at training time. Although this setting is standard for conventional
KG completion, it has limitations for CKG completion: at test time, entities
in a CKG can be unseen, because they may have unseen text/names or may be
disconnected from the training graph, since CKGs are generally very sparse.
Here, we propose to study the inductive learning setting for CKG completion,
where unseen entities may be present at test time. We develop a novel learning
framework named InductivE. Different from previous approaches, InductivE
ensures inductive learning capability by directly computing entity embeddings
from raw entity attributes/text. InductivE consists of a free-text encoder, a
graph encoder, and a KG completion decoder. Specifically, the free-text
encoder first extracts a textual representation of each entity based on a
pre-trained language model and word embeddings. The graph encoder is a gated
relational graph convolutional neural network that learns from a densified
graph for more informative entity representation learning. We develop a method
that densifies CKGs by adding edges among semantically related entities, which
provides more supportive information for unseen entities and leads to better
generalization of entity embeddings. Finally, InductivE employs Conv-TransE as
the CKG completion decoder. Experimental results show that InductivE
significantly outperforms state-of-the-art baselines in both standard and
inductive settings on the ATOMIC and ConceptNet benchmarks. InductivE performs
especially well in the inductive setting, where it achieves more than a 48%
improvement over existing methods.
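The abstract describes the InductivE pipeline only at a high level. As a rough illustration (not the authors' implementation), the sketch below shows the two ingredients that make the approach inductive: entity embeddings computed directly from entity text, and graph densification via semantic-similarity edges. It uses the sentence-transformers library as a stand-in for the BERT-plus-word-embedding encoder described above; the function names, model choice, and k/threshold values are hypothetical.

```python
# Minimal sketch (hypothetical names/values) of the two inductive ingredients
# described in the abstract: (1) a free-text encoder that maps raw entity text
# to embeddings, so unseen entities can be embedded at test time, and
# (2) graph densification that links each entity to its most semantically
# similar entities, giving sparse or unseen nodes extra edges for the graph
# encoder. The gated R-GCN encoder and Conv-TransE decoder are omitted.
import numpy as np
from sentence_transformers import SentenceTransformer  # stand-in text encoder

def embed_entities(entity_texts, model_name="all-MiniLM-L6-v2"):
    """Free-text encoder: one vector per entity, computed from its text only."""
    encoder = SentenceTransformer(model_name)
    vecs = encoder.encode(entity_texts, normalize_embeddings=True)
    return np.asarray(vecs)                      # (num_entities, dim)

def densify(embeddings, k=5, threshold=0.7):
    """Add similarity edges: connect each entity to its top-k nearest
    neighbors by cosine similarity, if the similarity clears a threshold."""
    sims = embeddings @ embeddings.T             # cosine (vectors are normalized)
    np.fill_diagonal(sims, -np.inf)              # no self-loops
    edges = []
    for i in range(sims.shape[0]):
        for j in np.argsort(-sims[i])[:k]:
            if sims[i, j] >= threshold:
                edges.append((i, int(j)))        # synthetic "similar-to" edge
    return edges

# Example: an unseen entity still gets embedded and connected.
texts = ["PersonX buys a coffee", "PersonX purchases a latte", "go to bed early"]
emb = embed_entities(texts)
print(densify(emb, k=1, threshold=0.5))
```

In the paper, the densified graph (original triples plus similarity edges) is then fed to the gated relational GCN, and Conv-TransE scores candidate triples; the sketch stops before those components.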
Related papers
- Towards Better Benchmark Datasets for Inductive Knowledge Graph Completion [34.58496513149175]
We find that the current procedure for constructing inductive KGC datasets inadvertently creates a shortcut that can be exploited.
Specifically, we observe that the Personalized PageRank (PPR) score can achieve strong or near-SOTA performance on most inductive datasets (a minimal PPR-baseline sketch appears after this list).
We propose an alternative strategy for constructing inductive KGC datasets that helps mitigate the PPR shortcut.
arXiv Detail & Related papers (2024-06-14T21:01:46Z)
- Logical Reasoning with Relation Network for Inductive Knowledge Graph Completion [9.815135283458808]
We propose a novel iNfOmax RelAtion Network, namely NORAN, for inductive KG completion.
Our framework substantially outperforms the state-of-the-art KGC methods.
arXiv Detail & Related papers (2024-06-03T09:30:43Z)
- EntailE: Introducing Textual Entailment in Commonsense Knowledge Graph Completion [54.12709176438264]
Commonsense knowledge graphs (CSKGs) utilize free-form text to represent named entities, short phrases, and events as their nodes.
Current methods leverage semantic similarities to increase the graph density, but the semantic plausibility of the nodes and their relations is under-explored.
We propose to adopt textual entailment to find implicit entailment relations between CSKG nodes, to effectively densify the subgraph connecting nodes within the same conceptual class.
arXiv Detail & Related papers (2024-02-15T02:27:23Z)
- Contextualization Distillation from Large Language Model for Knowledge Graph Completion [51.126166442122546]
We introduce the Contextualization Distillation strategy, a plug-in-and-play approach compatible with both discriminative and generative KGC frameworks.
Our method begins by instructing large language models to transform compact, structural triplets into context-rich segments.
Comprehensive evaluations across diverse datasets and KGC techniques highlight the efficacy and adaptability of our approach.
arXiv Detail & Related papers (2024-01-28T08:56:49Z)
- Unifying Structure and Language Semantic for Efficient Contrastive Knowledge Graph Completion with Structured Entity Anchors [0.3913403111891026]
The goal of knowledge graph completion (KGC) is to predict missing links in a KG using trained facts that are already known.
We propose a novel method to effectively unify structure information and language semantics without losing the power of inductive reasoning.
arXiv Detail & Related papers (2023-11-07T11:17:55Z)
- Few-Shot Inductive Learning on Temporal Knowledge Graphs using Concept-Aware Information [31.10140298420744]
We propose a few-shot out-of-graph (OOG) link prediction task for temporal knowledge graphs (TKGs).
We predict the missing entities from the links concerning unseen entities by employing a meta-learning framework.
Our model achieves superior performance on all three datasets.
arXiv Detail & Related papers (2022-11-15T14:23:07Z)
- Explainable Sparse Knowledge Graph Completion via High-order Graph Reasoning Network [111.67744771462873]
This paper proposes HoGRN, a novel explainable model for sparse knowledge graphs (KGs) that combines high-order reasoning with a graph convolutional network.
It can not only improve the generalization ability to mitigate the information insufficiency issue but also provide interpretability.
arXiv Detail & Related papers (2022-07-14T10:16:56Z)
- Visual Pivoting for (Unsupervised) Entity Alignment [93.82387952905756]
This work studies the use of visual semantic representations to align entities in heterogeneous knowledge graphs (KGs).
We show that the proposed new approach, EVA, creates a holistic entity representation that provides strong signals for cross-graph entity alignment.
arXiv Detail & Related papers (2020-09-28T20:09:40Z)
- Inductively Representing Out-of-Knowledge-Graph Entities by Optimal Estimation Under Translational Assumptions [42.626395991024545]
We propose a simple and effective method that inductively represents OOKG entities by their optimal estimation under translational assumptions.
Experimental results show that our method outperforms the state-of-the-art methods with higher efficiency on two KGC tasks with OOKG entities.
arXiv Detail & Related papers (2020-09-27T07:12:18Z)
- KACC: A Multi-task Benchmark for Knowledge Abstraction, Concretization and Completion [99.47414073164656]
A comprehensive knowledge graph (KG) contains an instance-level entity graph and an ontology-level concept graph.
The two-view KG provides a testbed for models to "simulate" human abilities in knowledge abstraction, concretization, and completion.
We propose a unified KG benchmark by improving existing benchmarks in terms of dataset scale, task coverage, and difficulty.
arXiv Detail & Related papers (2020-04-28T16:21:57Z)
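For context on the PPR shortcut mentioned in the first related paper above, here is a minimal, assumption-laden sketch of a Personalized PageRank baseline for link prediction: candidate tail entities are ranked purely by their PPR score seeded at the query head, ignoring the relation. The function names and hyperparameters are hypothetical, and this is not code from that paper.

```python
# Hypothetical PPR baseline sketch: rank candidate tails for a (head, ?) query
# by their Personalized PageRank score computed from the head entity over the
# relation-agnostic adjacency matrix. Illustrates the structural shortcut
# discussed above, not the cited paper's code.
import numpy as np

def personalized_pagerank(adj, seed, alpha=0.15, iters=50):
    """Power iteration for PPR with restart probability `alpha` at `seed`."""
    n = adj.shape[0]
    deg = adj.sum(axis=1, keepdims=True).astype(float)
    trans = np.divide(adj, deg, out=np.zeros((n, n)), where=deg > 0)  # row-stochastic
    p = np.full(n, 1.0 / n)
    restart = np.zeros(n)
    restart[seed] = 1.0
    for _ in range(iters):
        p = alpha * restart + (1 - alpha) * (trans.T @ p)
    return p

def rank_tails(adj, head, candidates):
    """Sort candidate tail entities by PPR score from the query head."""
    scores = personalized_pagerank(adj, head)
    return sorted(candidates, key=lambda t: scores[t], reverse=True)

# Toy graph: a 0-1-2 chain plus an isolated node 3.
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 0],
              [0, 0, 0, 0]], dtype=float)
print(rank_tails(A, head=0, candidates=[1, 2, 3]))  # expect [1, 2, 3]
```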