AdaLoGN: Adaptive Logic Graph Network for Reasoning-Based Machine
Reading Comprehension
- URL: http://arxiv.org/abs/2203.08992v1
- Date: Wed, 16 Mar 2022 23:51:01 GMT
- Title: AdaLoGN: Adaptive Logic Graph Network for Reasoning-Based Machine
Reading Comprehension
- Authors: Xiao Li, Gong Cheng, Ziheng Chen, Yawei Sun, Yuzhong Qu
- Abstract summary: Recent machine reading comprehension datasets such as ReClor and LogiQA require performing logical reasoning over text.
We present a neural-symbolic approach which, to predict an answer, passes messages over a graph representing logical relations between text units.
- Score: 21.741085513119785
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Recent machine reading comprehension datasets such as ReClor and LogiQA
require performing logical reasoning over text. Conventional neural models are
insufficient for logical reasoning, while symbolic reasoners cannot directly
apply to text. To meet the challenge, we present a neural-symbolic approach
which, to predict an answer, passes messages over a graph representing logical
relations between text units. It incorporates an adaptive logic graph network
(AdaLoGN) which adaptively infers logical relations to extend the graph and,
essentially, realizes mutual and iterative reinforcement between neural and
symbolic reasoning. We also implement a novel subgraph-to-node message passing
mechanism to enhance context-option interaction for answering multiple-choice
questions. Our approach shows promising results on ReClor and LogiQA.
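The core mechanism described above, message passing over a graph of logical relations between text units, followed by pooling the context subgraph into an answer-option node, can be sketched as follows. This is a minimal illustrative toy, not the paper's implementation: the node names, dimensions, mean-aggregation rule, and dot-product scoring are all assumptions.

```python
import numpy as np

# Toy logic graph: nodes are text units (context sentences plus one
# answer option); directed edges are hypothesized logical relations.
rng = np.random.default_rng(0)
dim = 8
nodes = ["premise_1", "premise_2", "conclusion", "option_A"]
edges = [(0, 2), (1, 2), (2, 3)]  # u -> v logical-relation edges

h = {i: rng.normal(size=dim) for i in range(len(nodes))}
W = rng.normal(size=(dim, dim)) * 0.1  # shared message transform

def message_pass(h, edges, W, steps=2):
    """Each step, every node adds the mean of transformed messages
    from its in-neighbors to its own representation."""
    for _ in range(steps):
        new_h = {}
        for v in h:
            msgs = [W @ h[u] for (u, t) in edges if t == v]
            new_h[v] = h[v] + np.mean(msgs, axis=0) if msgs else h[v]
        h = new_h
    return h

h = message_pass(h, edges, W)

# Subgraph-to-node step (sketch): pool the whole context subgraph into
# a single vector and score it against the option node.
context = np.mean([h[i] for i in range(3)], axis=0)
option_score = float(context @ h[3])
```

In the actual model the relation set is extended adaptively during reasoning; here the edge list is fixed for brevity.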
Related papers
- LOGICSEG: Parsing Visual Semantics with Neural Logic Learning and
Reasoning [73.98142349171552]
LOGICSEG is a holistic visual semantic parsing framework that integrates neural inductive learning and logic reasoning with both rich data and symbolic knowledge.
During fuzzy logic-based continuous relaxation, logical formulae are grounded onto data and neural computational graphs, hence enabling logic-induced network training.
These designs together make LOGICSEG a general and compact neural-logic machine that is readily integrated into existing segmentation models.
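The fuzzy-logic continuous relaxation mentioned above can be illustrated with a small sketch: a hard rule such as "cat(x) implies animal(x)" is grounded onto soft network scores using fuzzy connectives, yielding a differentiable penalty. The rule, the scores, and the choice of product t-norm with Reichenbach implication are illustrative assumptions, not LOGICSEG's exact formulation.

```python
# Fuzzy relaxation of Boolean connectives over soft truth values in [0, 1].
def fuzzy_not(a):
    return 1.0 - a

def fuzzy_and(a, b):
    return a * b  # product t-norm

def fuzzy_or(a, b):
    return a + b - a * b  # probabilistic sum

def fuzzy_implies(a, b):
    return fuzzy_or(fuzzy_not(a), b)  # Reichenbach: 1 - a + a*b

# Soft predictions for one pixel: P(cat), P(animal).
p_cat, p_animal = 0.9, 0.95
truth = fuzzy_implies(p_cat, p_animal)  # near 1 when the rule holds
rule_loss = 1.0 - truth                 # differentiable training penalty
```

Because every connective is smooth in its arguments, the rule loss can be backpropagated through the network alongside the usual segmentation loss.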
arXiv Detail & Related papers (2023-09-24T05:43:19Z)
- Modeling Hierarchical Reasoning Chains by Linking Discourse Units and
Key Phrases for Reading Comprehension [80.99865844249106]
We propose a holistic graph network (HGN) which deals with context at both discourse level and word level, as the basis for logical reasoning.
Specifically, node-level and type-level relations, which can be interpreted as bridges in the reasoning process, are modeled by a hierarchical interaction mechanism.
arXiv Detail & Related papers (2023-06-21T07:34:27Z)
- Discourse-Aware Graph Networks for Textual Logical Reasoning [142.0097357999134]
Passage-level logical relations represent entailment or contradiction between propositional units (e.g., a concluding sentence).
We propose logic structural-constraint modeling to solve the logical reasoning QA and introduce discourse-aware graph networks (DAGNs)
The networks first construct logic graphs leveraging in-line discourse connectives and generic logic theories, then learn logic representations by end-to-end evolving the logic relations with an edge-reasoning mechanism and updating the graph features.
arXiv Detail & Related papers (2022-07-04T14:38:49Z)
- Neural-Symbolic Models for Logical Queries on Knowledge Graphs [17.290758383645567]
We propose Graph Neural Network Query Executor (GNN-QE), a neural-symbolic model that enjoys the advantages of both worlds.
GNN-QE decomposes a complex FOL query into relation projections and logical operations over fuzzy sets.
Experiments on 3 datasets show that GNN-QE significantly improves over previous state-of-the-art models in answering FOL queries.
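The decomposition GNN-QE performs, relation projections plus logical operations over fuzzy sets of entities, can be sketched on a toy knowledge graph. The entities, the single relation, and the matrix-product projection below are made-up illustrations; GNN-QE itself learns the projection with a graph neural network.

```python
import numpy as np

# Toy KG with four entities and one relation, as a fuzzy-set executor.
entities = ["a", "b", "c", "d"]
# rel[i][j] = 1.0 if the relation links entity i -> entity j.
rel_friend = np.array([[0, 1, 0, 0],
                       [0, 0, 1, 0],
                       [0, 0, 0, 0],
                       [0, 0, 0, 1]], dtype=float)

def project(fuzzy_set, rel):
    """Relation projection: push membership mass along relation edges."""
    return np.clip(fuzzy_set @ rel, 0.0, 1.0)

# Fuzzy-set logical operations (product logic).
def conj(x, y):
    return x * y          # intersection

def disj(x, y):
    return x + y - x * y  # union

def neg(x):
    return 1.0 - x        # complement

start = np.array([1.0, 0.0, 0.0, 1.0])  # fuzzy set {a, d}
answers = project(start, rel_friend)    # entities reachable via the relation
```

A complex first-order query is then just a composition of these primitives, e.g. `conj(project(s1, r1), neg(project(s2, r2)))` for "reached by r1 but not by r2".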
arXiv Detail & Related papers (2022-05-16T18:39:04Z)
- Logiformer: A Two-Branch Graph Transformer Network for Interpretable
Logical Reasoning [10.716971124214332]
We propose an end-to-end model Logiformer which utilizes a two-branch graph transformer network for logical reasoning of text.
We introduce different extraction strategies to split the text into two sets of logical units, and construct the logical graph and the syntax graph respectively.
The reasoning process provides the interpretability by employing the logical units, which are consistent with human cognition.
arXiv Detail & Related papers (2022-05-02T08:34:59Z)
- DAGN: Discourse-Aware Graph Network for Logical Reasoning [83.8041050565304]
We propose a discourse-aware graph network (DAGN) that reasons relying on the discourse structure of the texts.
The model encodes discourse information as a graph with elementary discourse units (EDUs) and discourse relations, and learns the discourse-aware features via a graph network for downstream QA tasks.
arXiv Detail & Related papers (2021-03-26T09:41:56Z)
- Question Answering over Knowledge Bases by Leveraging Semantic Parsing
and Neuro-Symbolic Reasoning [73.00049753292316]
We propose a semantic parsing and reasoning-based Neuro-Symbolic Question Answering(NSQA) system.
NSQA achieves state-of-the-art performance on QALD-9 and LC-QuAD 1.0.
arXiv Detail & Related papers (2020-12-03T05:17:55Z)
- Multi-Step Inference for Reasoning Over Paragraphs [95.91527524872832]
Complex reasoning over text requires understanding and chaining together free-form predicates and logical connectives.
We present a compositional model reminiscent of neural module networks that can perform chained logical reasoning.
arXiv Detail & Related papers (2020-04-06T21:12:53Z)