Query Structure Modeling for Inductive Logical Reasoning Over Knowledge Graphs
- URL: http://arxiv.org/abs/2305.13585v1
- Date: Tue, 23 May 2023 01:25:29 GMT
- Title: Query Structure Modeling for Inductive Logical Reasoning Over Knowledge Graphs
- Authors: Siyuan Wang, Zhongyu Wei, Meng Han, Zhihao Fan, Haijun Shan, Qi Zhang, Xuanjing Huang
- Abstract summary: We propose a structure-modeled textual encoding framework for inductive logical reasoning over KGs.
It encodes linearized query structures and entities using pre-trained language models to find answers.
We conduct experiments on two inductive logical reasoning datasets and three transductive datasets.
- Score: 67.043747188954
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Logical reasoning over incomplete knowledge graphs to answer complex logical
queries is a challenging task. With the emergence of new entities and relations
in constantly evolving KGs, inductive logical reasoning over KGs has become a
crucial problem. However, previous PLM-based methods struggle to model the
logical structures of complex queries, which limits their ability to generalize
within the same structure. In this paper, we propose a structure-modeled
textual encoding framework for inductive logical reasoning over KGs. It encodes
linearized query structures and entities using pre-trained language models to
find answers. For structure modeling of complex queries, we design stepwise
instructions that implicitly prompt PLMs on the execution order of geometric
operations in each query. We further separately model different geometric
operations (i.e., projection, intersection, and union) on the representation
space using a pre-trained encoder with additional attention and maxout layers
to enhance structured modeling. We conduct experiments on two inductive logical
reasoning datasets and three transductive datasets. The results demonstrate the
effectiveness of our method on logical reasoning over KGs in both inductive and
transductive settings.
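To make the stepwise instructions described in the abstract concrete, here is a minimal sketch (our own illustration, not the authors' code) of linearizing a nested first-order query into natural-language steps that expose the execution order of projection, intersection, and union to a PLM. The query grammar and the instruction wording are assumptions.

```python
# Toy linearizer: walk a nested (op, ...) query bottom-up and emit one
# textual instruction per geometric operation, so a PLM reading the
# result sees the execution order explicitly. (Illustrative only.)

def linearize(query):
    steps = []

    def visit(node):
        op, *args = node
        if op == "entity":                        # leaf anchor entity
            return args[0]
        operands = [visit(a) for a in args if isinstance(a, tuple)]
        var = f"V{len(steps) + 1}"
        if op == "project":                       # (project, sub, relation)
            steps.append(f"step {len(steps) + 1}: find {var}, "
                         f"the {args[-1]} of {operands[0]}.")
        elif op == "intersect":
            steps.append(f"step {len(steps) + 1}: let {var} be the entities "
                         f"in both {' and '.join(operands)}.")
        elif op == "union":
            steps.append(f"step {len(steps) + 1}: let {var} be the entities "
                         f"in either {' or '.join(operands)}.")
        else:
            raise ValueError(f"unknown operation: {op}")
        return var

    answer = visit(query)
    return " ".join(steps) + f" The answer is {answer}."

# A 2i-style query: the capital of France that is also an Olympic host city.
q = ("intersect",
     ("project", ("entity", "France"), "capital"),
     ("project", ("entity", "Olympics"), "host city"))
print(linearize(q))
# step 1: find V1, the capital of France. step 2: find V2, the host city
# of Olympics. step 3: let V3 be the entities in both V1 and V2. The
# answer is V3.
```

The post-order traversal matters here: operands are always introduced before the step that consumes them, mirroring the execution order of the geometric operations.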
Related papers
- Improving Complex Reasoning over Knowledge Graph with Logic-Aware Curriculum Tuning [89.89857766491475]
We propose a complex reasoning schema over KGs built upon large language models (LLMs).
We augment arbitrary first-order logical queries via binary tree decomposition to stimulate the reasoning capability of LLMs, as sketched below.
Experiments across widely used datasets demonstrate that LACT achieves substantial improvements (an average +5.5% MRR gain) over advanced methods.
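A minimal sketch of what such a decomposition could look like (our assumption; LACT's actual procedure may differ): n-ary conjunctions and disjunctions are rewritten into left-deep binary trees, so each reasoning step the LLM is prompted with involves at most two operands.

```python
# Rewrite (op, a, b, c, ...) into nested binary operations.
def binarize(node):
    if not isinstance(node, tuple):
        return node                       # leaf: entity or variable name
    op, *args = node
    args = [binarize(a) for a in args]    # recurse into sub-queries first
    if op in ("and", "or") and len(args) > 2:
        left = args[0]
        for right in args[1:]:            # fold into a left-deep tree
            left = (op, left, right)
        return left
    return (op, *args)

q = ("and", "A", "B", "C", "D")
print(binarize(q))
# ('and', ('and', ('and', 'A', 'B'), 'C'), 'D')
```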
arXiv Detail & Related papers (2024-05-02T18:12:08Z)
- Parrot Mind: Towards Explaining the Complex Task Reasoning of Pretrained Large Language Models with Template-Content Structure [66.33623392497599]
We show that a structure called template-content structure (T-C structure) can reduce the possible space from exponential level to linear level.
We demonstrate that models can achieve task composition, further reducing the space needed to learn from linear to logarithmic.
arXiv Detail & Related papers (2023-10-09T06:57:45Z)
- LOGICSEG: Parsing Visual Semantics with Neural Logic Learning and Reasoning [73.98142349171552]
LOGICSEG is a holistic visual semantic parser that integrates neural inductive learning and logic reasoning with both rich data and symbolic knowledge.
During fuzzy logic-based continuous relaxation, logical formulae are grounded onto data and neural computational graphs, hence enabling logic-induced network training.
These designs together make LOGICSEG a general and compact neural-logic machine that is readily integrated into existing segmentation models.
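To illustrate the fuzzy-logic continuous relaxation, here is a minimal sketch (assumed, not the LOGICSEG code) using the product t-norm: logical connectives become differentiable operations on predicted probabilities, so a symbolic rule turns into a trainable penalty.

```python
import torch

def f_and(a, b):       # product t-norm:    a AND b -> a * b
    return a * b

def f_or(a, b):        # probabilistic sum: a OR b  -> a + b - a*b
    return a + b - a * b

def f_implies(a, b):   # a -> b relaxed as NOT a OR b
    return f_or(1.0 - a, b)

# Example rule "dog -> animal" applied to per-pixel class probabilities.
p_dog    = torch.tensor([0.9, 0.2], requires_grad=True)
p_animal = torch.tensor([0.4, 0.8], requires_grad=True)

truth = f_implies(p_dog, p_animal)   # degree to which the rule holds
loss = (1.0 - truth).mean()          # penalize violations of the rule
loss.backward()                      # gradients reach the segmentation net
```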
arXiv Detail & Related papers (2023-09-24T05:43:19Z)
- Modeling Hierarchical Reasoning Chains by Linking Discourse Units and Key Phrases for Reading Comprehension [80.99865844249106]
We propose a holistic graph network (HGN) which deals with context at both discourse level and word level, as the basis for logical reasoning.
Specifically, node-level and type-level relations, which can be interpreted as bridges in the reasoning process, are modeled by a hierarchical interaction mechanism.
arXiv Detail & Related papers (2023-06-21T07:34:27Z)
- Complex Logical Reasoning over Knowledge Graphs using Large Language Models [13.594992599230277]
Reasoning over knowledge graphs (KGs) is a challenging task that requires a deep understanding of the relationships between entities.
Current approaches rely on learning geometries to embed entities in vector space for logical query operations.
We propose a novel decoupled approach, Language-guided Abstract Reasoning over Knowledge graphs (LARK), that formulates complex KG reasoning as a combination of contextual KG search and logical query reasoning.
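A minimal sketch of this decoupling as we read it (the retrieval heuristic and prompt wording are assumptions, and `llm` stands for a hypothetical text-in/text-out model callable): a contextual search first gathers the query-relevant KG neighborhood, and the logical query plus that context is then handed to the LLM.

```python
def contextual_search(kg, anchors, hops=2):
    """Collect all triples within `hops` edges of the query's anchors."""
    frontier, triples = set(anchors), set()
    for _ in range(hops):
        new = {(h, r, t) for (h, r, t) in kg if h in frontier or t in frontier}
        triples |= new
        frontier |= {e for (h, _, t) in new for e in (h, t)}
    return sorted(triples)

def answer(llm, kg, query, anchors):
    """Decoupled pipeline: contextual KG search, then LLM query reasoning."""
    context = "\n".join(f"{h} --{r}--> {t}"
                        for h, r, t in contextual_search(kg, anchors))
    prompt = f"Facts:\n{context}\n\nAnswer the logical query: {query}"
    return llm(prompt)   # `llm` is a placeholder, not a real API
```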
arXiv Detail & Related papers (2023-05-02T02:21:49Z)
- Modeling Relational Patterns for Logical Query Answering over Knowledge Graphs [29.47155614953955]
We develop a novel query embedding method, RoConE, that defines query regions as geometric cones and models algebraic query operators as rotations in complex space.
Our experimental results on several benchmark datasets confirm the advantage of relational patterns for enhancing logical query answering task.
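As a sketch of the rotation-based operator (our reconstruction, not the RoConE release), a relation can act on a complex entity embedding by element-wise multiplication with a unit-modulus vector e^{i*theta}, which rotates each coordinate without changing its magnitude:

```python
import torch

dim = 8
entity = torch.randn(dim, dtype=torch.cfloat)   # complex entity embedding
theta = torch.rand(dim) * 2 * torch.pi          # per-dimension rotation angles
relation = torch.exp(1j * theta)                # unit-modulus rotation vector

projected = entity * relation                   # apply the relation: rotate
# Rotations preserve magnitude, which helps model relational patterns
# such as composition (angles add) and inversion (negate the angles).
assert torch.allclose(projected.abs(), entity.abs(), atol=1e-5)
```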
arXiv Detail & Related papers (2023-03-21T13:59:15Z)
- Unifying Structure Reasoning and Language Model Pre-training for Complex Reasoning [26.811507121199323]
This paper proposes a unified learning framework that combines explicit structure reasoning and language pre-training to endow PLMs with the structure reasoning skill.
It first identifies several elementary structures within contexts to construct structured queries and performs step-by-step reasoning along the queries to identify the answer entity.
Experimental results on four datasets demonstrate that the proposed model achieves significant improvements in complex reasoning tasks involving diverse structures.
arXiv Detail & Related papers (2023-01-21T08:18:11Z)
- Logical Message Passing Networks with One-hop Inference on Atomic Formulas [57.47174363091452]
We propose a framework for complex query answering that decouples knowledge graph embeddings from neural set operators.
On top of the query graph, we propose the Logical Message Passing Neural Network (LMPNN) that connects the local one-hop inferences on atomic formulas to the global logical reasoning.
Our approach yields the new state-of-the-art neural CQA model.
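A minimal sketch of the one-hop idea (assumed, not the LMPNN code, with a TransE-style translation standing in for the pretrained embedding operator): each atomic formula r(u, v) performs a local inference for its unknown endpoint, and a variable node aggregates the messages from its incident atoms.

```python
import torch

dim = 16
rel_emb = {"r1": torch.randn(dim), "r2": torch.randn(dim)}

def one_hop(u_emb, rel, direction="fwd"):
    """Infer the other endpoint of atom r(u, v): translate forward for
    the tail, backward for the head (TransE-style stand-in)."""
    return u_emb + rel_emb[rel] if direction == "fwd" else u_emb - rel_emb[rel]

def aggregate(messages):
    """Combine one-hop messages into a node state (mean here; LMPNN
    learns this combination with a neural network)."""
    return torch.stack(messages).mean(dim=0)

# Variable ?x constrained by the atoms r1(e1, ?x) and r2(e2, ?x):
e1, e2 = torch.randn(dim), torch.randn(dim)
x = aggregate([one_hop(e1, "r1"), one_hop(e2, "r2")])
```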
arXiv Detail & Related papers (2023-01-21T02:34:06Z)