EngineKGI: Closed-Loop Knowledge Graph Inference
- URL: http://arxiv.org/abs/2112.01040v1
- Date: Thu, 2 Dec 2021 08:02:59 GMT
- Title: EngineKGI: Closed-Loop Knowledge Graph Inference
- Authors: Guanglin Niu, Bo Li, Yongfei Zhang, Shiliang Pu
- Abstract summary: EngineKGI is a novel closed-loop KG inference framework.
It combines KGE and rule learning to complement each other in a closed-loop pattern.
Our model outperforms other baselines on link prediction tasks.
- Score: 37.15381932994768
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Knowledge Graph (KG) inference is a vital technique for addressing the
natural incompleteness of KGs. Existing KG inference approaches can be classified
into rule learning-based and KG embedding-based models. However, these
approaches cannot simultaneously balance accuracy, generalization,
interpretability, and efficiency. Moreover, they rely solely on triples and
neglect additional information, so both KG embedding (KGE) and rule learning
approaches face challenges from sparse entities and limited semantics. Based on
these observations, we propose EngineKGI, a novel and effective closed-loop KG
inference framework that operates like an engine. EngineKGI combines KGE and
rule learning so that they complement each other in a closed-loop pattern while
taking advantage of the semantics in paths and concepts. The KGE module exploits
paths to enhance the semantic association between entities and introduces rules
for interpretability. The rule learning module uses a novel rule pruning
mechanism that takes paths as initial candidate rules and employs KG embeddings
together with concepts to extract higher-quality rules. Experimental results on
four real-world datasets show that our model outperforms other baselines on
link prediction tasks, demonstrating the effectiveness and superiority of our
model for KG inference in a joint logic- and data-driven fashion with a
closed-loop mechanism.
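The closed-loop interplay described in the abstract (paths seed candidate rules, candidates are pruned by a quality score, and surviving rules feed inferred facts back into the graph for the next round) can be sketched on a toy KG. This is a minimal illustration only, not the paper's implementation: the triples are invented, and the path-based rule miner with confidence-based pruning stands in for EngineKGI's learned embeddings and concept information.

```python
# Toy KG of (head, relation, tail) triples; all data here is illustrative.
triples = {
    ("alice", "born_in", "paris"),
    ("paris", "city_of", "france"),
    ("alice", "nationality", "france"),
    ("bob", "born_in", "lyon"),
    ("lyon", "city_of", "france"),
}

def mine_candidate_rules(kg):
    """Use 2-step paths that co-occur with a direct relation as initial
    candidate rules of the form r1(x,y) & r2(y,z) => r(x,z)."""
    candidates = set()
    for (h, r1, m) in kg:
        for (m2, r2, t) in kg:
            if m2 != m:
                continue
            for (h2, r, t2) in kg:
                if h2 == h and t2 == t and r not in (r1, r2):
                    candidates.add((r1, r2, r))
    return candidates

def rule_confidence(rule, kg):
    """Standard confidence: rule support / number of path instances."""
    r1, r2, r = rule
    body = [(h, t) for (h, a, m) in kg for (m2, b, t) in kg
            if a == r1 and b == r2 and m2 == m]
    if not body:
        return 0.0
    support = sum((h, r, t) in kg for (h, t) in body)
    return support / len(body)

def closed_loop(kg, rounds=2, threshold=0.5):
    kg = set(kg)
    rules = set()
    for _ in range(rounds):
        # Rule-learning step: mine path-based candidates and prune by
        # confidence (a stand-in for embedding- and concept-based pruning).
        for rule in mine_candidate_rules(kg):
            if rule_confidence(rule, kg) >= threshold:
                rules.add(rule)
        # Inference step: inject rule-derived facts back into the KG,
        # closing the loop for the next round.
        for (r1, r2, r) in rules:
            for (h, a, m) in set(kg):
                for (m2, b, t) in set(kg):
                    if a == r1 and b == r2 and m2 == m:
                        kg.add((h, r, t))
    return kg, rules

kg, rules = closed_loop(triples)
print(("bob", "nationality", "france") in kg)  # prints: True
```

In this sketch, the path born_in + city_of co-occurs with nationality for alice, so the rule born_in(x,y) & city_of(y,z) => nationality(x,z) survives pruning and injects bob's nationality as a new fact.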
Related papers
- Decoding on Graphs: Faithful and Sound Reasoning on Knowledge Graphs through Generation of Well-Formed Chains [66.55612528039894]
Knowledge Graphs (KGs) can serve as reliable knowledge sources for question answering (QA).
We present DoG (Decoding on Graphs), a novel framework that facilitates a deep synergy between LLMs and KGs.
Experiments across various KGQA tasks with different background KGs demonstrate that DoG achieves superior and robust performance.
arXiv Detail & Related papers (2024-10-24T04:01:40Z)
- A Pluggable Common Sense-Enhanced Framework for Knowledge Graph Completion [9.686794547679076]
We propose a pluggable common sense-enhanced KGC framework that incorporates both facts and common sense for KGC.
This framework is adaptable to different KGs based on their entity concept richness and can automatically generate explicit or implicit common sense.
Our approach can be integrated as a pluggable module into many knowledge graph embedding (KGE) models.
arXiv Detail & Related papers (2024-10-06T14:06:12Z)
- Learning Rules from KGs Guided by Language Models [48.858741745144044]
Rule learning methods can be applied to predict potentially missing facts.
Ranking rules is especially challenging over highly incomplete or biased KGs.
With the recent rise of Language Models (LMs), several works have claimed that LMs can be used as an alternative means for KG completion.
arXiv Detail & Related papers (2024-09-12T09:27:36Z)
- Knowledge Graph Embedding: An Overview [42.16033541753744]
We provide a comprehensive overview of the current state of research in knowledge graph completion.
We focus on two main branches of KG embedding (KGE) design: 1) distance-based methods and 2) semantic matching-based methods.
Next, we delve into CompoundE and CompoundE3D, which draw inspiration from 2D and 3D affine operations.
arXiv Detail & Related papers (2023-09-21T21:52:42Z)
- ChatRule: Mining Logical Rules with Large Language Models for Knowledge Graph Reasoning [107.61997887260056]
We propose a novel framework, ChatRule, unleashing the power of large language models for mining logical rules over knowledge graphs.
Specifically, the framework is initiated with an LLM-based rule generator that leverages both the semantic and structural information of KGs.
To refine the generated rules, a rule ranking module estimates rule quality by incorporating facts from existing KGs.
arXiv Detail & Related papers (2023-09-04T11:38:02Z)
- GreenKGC: A Lightweight Knowledge Graph Completion Method [32.528770408502396]
GreenKGC aims to discover missing relationships between entities in knowledge graphs.
It consists of three modules: representation learning, feature pruning, and decision learning.
In low dimensions, GreenKGC can outperform SOTA methods on most datasets.
arXiv Detail & Related papers (2022-08-19T03:33:45Z)
- Explainable Sparse Knowledge Graph Completion via High-order Graph Reasoning Network [111.67744771462873]
This paper proposes a novel explainable model, HoGRN, for sparse Knowledge Graphs (KGs).
It incorporates high-order reasoning into a graph convolutional network.
It not only improves generalization to mitigate the information-insufficiency issue but also provides interpretability.
arXiv Detail & Related papers (2022-07-14T10:16:56Z)
- Towards Robust Knowledge Graph Embedding via Multi-task Reinforcement Learning [44.38215560989223]
Most existing knowledge graph embedding methods assume that all the triple facts in KGs are correct.
This leads to low-quality and unreliable representations of KGs.
We propose a general multi-task reinforcement learning framework that can greatly alleviate the noisy-data problem.
arXiv Detail & Related papers (2021-11-11T08:51:37Z)
- What is Learned in Knowledge Graph Embeddings? [3.224929252256631]
A knowledge graph (KG) is a data structure that represents entities and relations as the vertices and edges of a directed graph with edge types.
We investigate whether learning rules between relations is indeed what drives the performance of embedding-based methods.
Using experiments on synthetic KGs, we show that KG models can learn motifs and that this ability is degraded by non-motif edges.
arXiv Detail & Related papers (2021-10-19T13:52:11Z)
- On the Role of Conceptualization in Commonsense Knowledge Graph Construction [59.39512925793171]
Commonsense knowledge graphs (CKGs) like Atomic and ASER are substantially different from conventional KGs.
We introduce conceptualization into CKG construction methods, viewing entities mentioned in text as instances of specific concepts, or vice versa.
Our methods can effectively identify plausible triples and expand the KG with triples of both new nodes and edges of high diversity and novelty.
arXiv Detail & Related papers (2020-03-06T14:35:20Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information and is not responsible for any consequences.