PathReasoner: Modeling Reasoning Path with Equivalent Extension for Logical Question Answering
- URL: http://arxiv.org/abs/2405.19109v1
- Date: Wed, 29 May 2024 14:14:05 GMT
- Title: PathReasoner: Modeling Reasoning Path with Equivalent Extension for Logical Question Answering
- Authors: Fangzhi Xu, Qika Lin, Tianzhe Zhao, Jiawei Han, Jun Liu
- Abstract summary: We model the logical reasoning task by transforming each logical sample into reasoning paths.
To expand the diversity of the logical samples, we propose an atom extension strategy supported by equivalent logical formulas.
Experiments show that PathReasoner achieves competitive performance on two logical reasoning benchmarks and strong generalization abilities.
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: The logical reasoning task has attracted great interest since it was proposed. Faced with such a task, current competitive models, even large language models (e.g., ChatGPT and PaLM 2), still perform poorly. Previous promising LMs struggle with logical consistency modeling and logical structure perception. To this end, we model the logical reasoning task by transforming each logical sample into reasoning paths and propose an architecture, PathReasoner, which addresses the task from the views of both data and model. To expand the diversity of the logical samples, we propose an atom extension strategy supported by equivalent logical formulas to form new reasoning paths. From the model perspective, we design a stack of transformer-style blocks. In particular, we propose a path-attention module to jointly model in-atom and cross-atom relations with a high-order diffusion strategy. Experiments show that PathReasoner achieves competitive performance on two logical reasoning benchmarks and strong generalization abilities.
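As a concrete illustration of the atom extension strategy, here is a minimal sketch assuming implication-style atoms and contraposition (A → B ≡ ¬B → ¬A) as the equivalent formula; the `Atom` encoding and function names are hypothetical, not the paper's actual implementation.

```python
# Hypothetical sketch: rewrite an implication atom into a logically
# equivalent form (contraposition) to create a new, label-preserving
# reasoning path. The encoding below is illustrative only.
from dataclasses import dataclass

@dataclass(frozen=True)
class Atom:
    """An implication atom: premise -> conclusion, each possibly negated."""
    premise: str
    conclusion: str
    premise_negated: bool = False
    conclusion_negated: bool = False

def contrapose(atom: Atom) -> Atom:
    """A -> B is equivalent to not B -> not A, so the rewritten atom
    extends the sample set without changing the gold label."""
    return Atom(
        premise=atom.conclusion,
        conclusion=atom.premise,
        premise_negated=not atom.conclusion_negated,
        conclusion_negated=not atom.premise_negated,
    )

def extend_path(path: list[Atom]) -> list[list[Atom]]:
    """Generate equivalent reasoning paths by rewriting one atom at a time."""
    return [path[:i] + [contrapose(a)] + path[i + 1:] for i, a in enumerate(path)]

path = [Atom("it rains", "the ground is wet")]
print(extend_path(path))
# [[Atom(premise='the ground is wet', conclusion='it rains',
#        premise_negated=True, conclusion_negated=True)]]
```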
Related papers
- Toward Conceptual Modeling for Propositional Logic: Propositions as Events
This paper reflects on applying propositional logic language to a high-level diagrammatic representation called the thinging machines (TM) model.
The ultimate research objective is a quest for a thorough semantic alignment of TM modeling and propositional logic into a single structure.
arXiv Detail & Related papers (2024-09-24T03:45:24Z)
- On the Diagram of Thought
We introduce Diagram of Thought (DoT), a framework that models iterative reasoning in large language models (LLMs).
DoT organizes propositions, critiques, refinements, and verifications into a cohesive DAG structure, allowing the model to explore complex reasoning pathways.
We formalize the DoT framework using Topos Theory, providing a mathematical foundation that ensures logical consistency and soundness in the reasoning process.
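A minimal sketch of what such a DAG might look like, assuming four node roles (proposition, critique, refinement, verification) and parent edges; the schema is illustrative, not DoT's actual formalization.

```python
# Illustrative DoT-style reasoning DAG; node roles and edge semantics
# are assumptions for the sake of the sketch.
from dataclasses import dataclass, field

@dataclass
class ThoughtNode:
    node_id: str
    role: str          # "proposition" | "critique" | "refinement" | "verification"
    text: str
    parents: list[str] = field(default_factory=list)  # edges point from premises

class DiagramOfThought:
    def __init__(self):
        self.nodes: dict[str, ThoughtNode] = {}

    def add(self, node: ThoughtNode) -> None:
        # Parents must already exist, so insertion order is a topological order.
        assert all(p in self.nodes for p in node.parents), "DAG: parents must exist"
        self.nodes[node.node_id] = node

dot = DiagramOfThought()
dot.add(ThoughtNode("p1", "proposition", "All birds can fly."))
dot.add(ThoughtNode("c1", "critique", "Penguins are birds that cannot fly.", ["p1"]))
dot.add(ThoughtNode("r1", "refinement", "Most birds can fly.", ["p1", "c1"]))
dot.add(ThoughtNode("v1", "verification", "Refined claim survives the counterexample.", ["r1"]))
```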
arXiv Detail & Related papers (2024-09-16T07:01:41Z)
- LOGICSEG: Parsing Visual Semantics with Neural Logic Learning and Reasoning
LOGICSEG is a holistic visual semantic parser that integrates neural inductive learning and logic reasoning with both rich data and symbolic knowledge.
During fuzzy logic-based continuous relaxation, logical formulae are grounded onto data and neural computational graphs, hence enabling logic-induced network training.
These designs together make LOGICSEG a general and compact neural-logic machine that is readily integrated into existing segmentation models.
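As a toy illustration of fuzzy-logic continuous relaxation, the sketch below grounds a single rule with the product t-norm; the rule and the confidence scores are made up, and LOGICSEG's actual formulation may differ.

```python
# Toy fuzzy relaxation: truth values in [0, 1], product t-norm.
def fuzzy_not(a: float) -> float:
    return 1.0 - a

def fuzzy_and(a: float, b: float) -> float:
    return a * b                                 # product t-norm

def fuzzy_implies(a: float, b: float) -> float:
    return fuzzy_not(fuzzy_and(a, fuzzy_not(b))) # a -> b == not(a and not b)

# Hypothetical rule: pixel is "wheel" -> pixel belongs to a "vehicle".
p_wheel, p_vehicle = 0.9, 0.4                    # made-up network confidences
truth = fuzzy_implies(p_wheel, p_vehicle)
loss = 1.0 - truth                               # penalty when the rule is violated
print(f"rule truth = {truth:.2f}, logic loss = {loss:.2f}")
```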
arXiv Detail & Related papers (2023-09-24T05:43:19Z)
- Modeling Hierarchical Reasoning Chains by Linking Discourse Units and Key Phrases for Reading Comprehension
We propose a holistic graph network (HGN) that deals with context at both the discourse level and the word level as the basis for logical reasoning.
Specifically, node-level and type-level relations, which can be interpreted as bridges in the reasoning process, are modeled by a hierarchical interaction mechanism.
arXiv Detail & Related papers (2023-06-21T07:34:27Z)
- Logic Diffusion for Knowledge Graph Reasoning
We propose a plug-in module called Logic Diffusion (LoD) to discover unseen queries from surroundings.
LoD achieves a dynamic equilibrium between different kinds of patterns.
Experiments on four public datasets demonstrate that mainstream knowledge graph reasoning models equipped with LoD outperform state-of-the-art baselines.
arXiv Detail & Related papers (2023-06-06T09:01:17Z)
- Query Structure Modeling for Inductive Logical Reasoning Over Knowledge Graphs
We propose a structure-modeled textual encoding framework for inductive logical reasoning over KGs.
It encodes linearized query structures and entities using pre-trained language models to find answers.
We conduct experiments on two inductive logical reasoning datasets and three transductive datasets.
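A hedged sketch of what linearizing a query structure for a pre-trained language model could look like, assuming a nested-tuple query encoding; the bracketing scheme is illustrative, not the paper's exact serialization.

```python
# Linearize a nested logical query into a bracketed token sequence.
def linearize(q) -> str:
    if isinstance(q, str):                        # leaf: entity or relation name
        return q
    op, *args = q                                 # ("op", arg1, arg2, ...)
    return "[" + op + " " + " ".join(linearize(a) for a in args) + "]"

# "Which people are friends of Alice AND colleagues of Bob?"
query = ("intersection",
         ("project", "friend_of", "Alice"),
         ("project", "colleague_of", "Bob"))
print(linearize(query))
# [intersection [project friend_of Alice] [project colleague_of Bob]]
```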
arXiv Detail & Related papers (2023-05-23T01:25:29Z)
- MetaLogic: Logical Reasoning Explanations with Fine-Grained Structure
We propose a benchmark to investigate models' logical reasoning capabilities in complex real-life scenarios.
Based on the multi-hop chain of reasoning, the explanation form includes three main components.
We evaluate the current best models' performance on this new explanation form.
arXiv Detail & Related papers (2022-10-22T16:01:13Z)
- Discourse-Aware Graph Networks for Textual Logical Reasoning
Passage-level logical relations represent entailment or contradiction between propositional units (e.g., a concluding sentence).
We propose logic structural-constraint modeling to solve logical reasoning QA and introduce discourse-aware graph networks (DAGNs).
The networks first construct logic graphs leveraging in-line discourse connectives and generic logic theories, then learn logic representations by end-to-end evolving the logic relations with an edge-reasoning mechanism and updating the graph features.
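A simplified sketch of building a logic graph from in-line discourse connectives, in the spirit of DAGNs; the connective list and the splitting heuristic are illustrative placeholders, not the paper's method.

```python
# Split a passage into elementary discourse units at connectives (nodes)
# and connect adjacent units with the connective as the edge label.
import re

CONNECTIVES = ["because", "therefore", "however", "if", "unless"]
PATTERN = re.compile(r"\b(" + "|".join(CONNECTIVES) + r")\b", re.IGNORECASE)

def build_logic_graph(passage: str):
    parts = PATTERN.split(passage)               # text, conn, text, conn, ...
    nodes = [p.strip(" ,.") for p in parts[::2]]
    connectives = [c.lower() for c in parts[1::2]]
    edges = [(i, i + 1, c) for i, c in enumerate(connectives)]
    return nodes, edges

nodes, edges = build_logic_graph(
    "The trial was delayed because a witness was ill, therefore the verdict came late.")
print(nodes)   # ['The trial was delayed', 'a witness was ill', 'the verdict came late']
print(edges)   # [(0, 1, 'because'), (1, 2, 'therefore')]
```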
arXiv Detail & Related papers (2022-07-04T14:38:49Z)
- Logiformer: A Two-Branch Graph Transformer Network for Interpretable Logical Reasoning
We propose an end-to-end model, Logiformer, which utilizes a two-branch graph transformer network for logical reasoning over text.
We introduce different extraction strategies to split the text into two sets of logical units, and construct the logical graph and the syntax graph respectively.
The reasoning process provides interpretability by employing logical units that are consistent with human cognition.
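A rough sketch of the two-branch idea: the same token features are attended under a logical-graph mask and a syntax-graph mask, and the branch outputs are fused. The dimensions, toy graphs, and averaging fusion are assumptions, not Logiformer's exact design.

```python
# Graph-masked self-attention over two adjacency structures.
import numpy as np

def masked_attention(x: np.ndarray, adj: np.ndarray) -> np.ndarray:
    scores = x @ x.T / np.sqrt(x.shape[1])
    scores = np.where(adj > 0, scores, -1e9)     # attend only along graph edges
    weights = np.exp(scores) / np.exp(scores).sum(-1, keepdims=True)
    return weights @ x

n, d = 4, 8                                      # 4 logical units, feature dim 8
x = np.random.randn(n, d)
logic_adj = np.eye(n) + np.eye(n, k=1)           # toy logical-graph edges
syntax_adj = np.ones((n, n))                     # toy syntax-graph edges
fused = 0.5 * masked_attention(x, logic_adj) + 0.5 * masked_attention(x, syntax_adj)
print(fused.shape)                               # (4, 8)
```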
arXiv Detail & Related papers (2022-05-02T08:34:59Z)
- Multi-Step Inference for Reasoning Over Paragraphs
Complex reasoning over text requires understanding and chaining together free-form predicates and logical connectives.
We present a compositional model reminiscent of neural module networks that can perform chained logical reasoning.
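A toy sketch of module chaining in the spirit of neural module networks, with plain set-valued functions standing in for learned neural modules; the relation names and facts are invented for illustration.

```python
# Compose predicate modules left to right, feeding each output forward.
FACTS = {("parent", "ann", "bob"), ("parent", "bob", "cid")}

def relate(relation: str, entities: set) -> set:
    """Module: follow one relation hop from a set of entities."""
    return {o for (r, s, o) in FACTS if r == relation and s in entities}

def chain(modules, start: set) -> set:
    out = start
    for m in modules:
        out = m(out)
    return out

# "Who are Ann's grandchildren?" == relate(parent) composed with itself.
grandchildren = chain([lambda s: relate("parent", s)] * 2, {"ann"})
print(grandchildren)  # {'cid'}
```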
arXiv Detail & Related papers (2020-04-06T21:12:53Z)