Weakly Supervised Neuro-Symbolic Module Networks for Numerical Reasoning
- URL: http://arxiv.org/abs/2101.11802v1
- Date: Thu, 28 Jan 2021 03:36:09 GMT
- Title: Weakly Supervised Neuro-Symbolic Module Networks for Numerical Reasoning
- Authors: Amrita Saha, Shafiq Joty, Steven C.H. Hoi
- Abstract summary: We propose the Weakly-Supervised Neuro-Symbolic Module Network (WNSMN), trained with answers as the sole supervision for numerical reasoning-based MRC.
It learns to execute a noisy heuristic program, obtained from a dependency parse of the query, as discrete actions over both neural and symbolic reasoning modules, and is trained end-to-end in a reinforcement learning framework with a discrete reward from answer matching.
This showcases the effectiveness and generalizability of modular networks that can handle explicit discrete reasoning over noisy programs in an end-to-end manner.
- Score: 44.5641465035393
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Neural Module Networks (NMNs) have been quite successful in incorporating
explicit reasoning as learnable modules in various question answering tasks,
including the most generic form of numerical reasoning over text in Machine
Reading Comprehension (MRC). However, to achieve this, contemporary NMNs need
strong supervision in executing the query as a specialized program over
reasoning modules and fail to generalize to more open-ended settings without
such supervision. Hence we propose Weakly-Supervised Neuro-Symbolic Module
Network (WNSMN) trained with answers as the sole supervision for numerical
reasoning-based MRC. It learns to execute a noisy heuristic program, obtained
from a dependency parse of the query, as discrete actions over both neural
and symbolic reasoning modules, and is trained end-to-end in a reinforcement
learning framework with a discrete reward from answer matching. On the
numerical-answer subset of DROP, WNSMN outperforms NMN by 32% and the
reasoning-free language model GenBERT by 8% in exact-match accuracy when
trained under comparable weakly supervised settings. This showcases the
effectiveness and generalizability of modular networks that can handle
explicit discrete reasoning over noisy programs in an end-to-end manner.
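The training signal described above can be pictured with a minimal sketch: a policy network samples a discrete action (which reasoning module to invoke over which arguments) and receives a binary reward from exact answer matching, optimized with REINFORCE. Everything below (the toy action space, the `ToyPolicy` class, the `execute` helper) is illustrative, not the paper's actual architecture:

```python
import torch
import torch.nn as nn

MODULES = ["add", "sub", "count"]  # assumed discrete action space

class ToyPolicy(nn.Module):
    """Scores discrete module choices from a query encoding."""
    def __init__(self, query_dim=64, n_actions=len(MODULES)):
        super().__init__()
        self.scorer = nn.Linear(query_dim, n_actions)

    def forward(self, query_vec):
        return torch.distributions.Categorical(logits=self.scorer(query_vec))

def execute(module, numbers):
    # Symbolic execution of the sampled module over extracted numbers.
    if module == "add":
        return numbers[0] + numbers[1]
    if module == "sub":
        return numbers[0] - numbers[1]
    return float(len(numbers))  # "count"

def reinforce_step(policy, optimizer, query_vec, numbers, gold_answer):
    dist = policy(query_vec)
    action = dist.sample()
    pred = execute(MODULES[action.item()], numbers)
    reward = 1.0 if pred == gold_answer else 0.0  # discrete exact-match reward
    loss = -dist.log_prob(action) * reward        # REINFORCE objective
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return pred, reward

policy = ToyPolicy()
optimizer = torch.optim.Adam(policy.parameters(), lr=1e-3)
pred, reward = reinforce_step(policy, optimizer,
                              torch.randn(64), [23.0, 5.0], gold_answer=18.0)
```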
Related papers
- Multimodal Representations for Teacher-Guided Compositional Visual Reasoning [0.0]
NMNs provide enhanced explainability compared to integrated models.
We propose to exploit features obtained by a large-scale cross-modal encoder.
We introduce an NMN learning strategy involving scheduled teacher guidance.
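The "scheduled teacher guidance" idea lends itself to a small sketch: with some probability that decays over training, the module layout is taken from the teacher rather than from the model's own prediction. The linear schedule and names below are assumptions, not the paper's recipe:

```python
import random

def use_teacher(step, total_steps):
    # Probability of following the teacher decays linearly (assumed schedule);
    # a real system might use exponential or inverse-sigmoid decay instead.
    p_teacher = max(0.0, 1.0 - step / total_steps)
    return random.random() < p_teacher

for step in (0, 500, 999):
    print(step, use_teacher(step, total_steps=1000))
```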
arXiv Detail & Related papers (2023-10-24T07:51:08Z)
- Mastering Symbolic Operations: Augmenting Language Models with Compiled Neural Networks [48.14324895100478]
"Neural architecture" integrates compiled neural networks (CoNNs) into a standard transformer.
CoNNs are neural modules designed to explicitly encode rules through artificially generated attention weights.
Experiments demonstrate superiority of our approach over existing techniques in terms of length generalization, efficiency, and interpretability for symbolic operations.
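As a toy illustration of encoding a rule directly in attention weights (not the paper's construction), the hard attention matrix below implements the deterministic operation "copy the left neighbour", i.e. a one-position shift, with no learned parameters:

```python
import numpy as np

def shift_attention(seq_len):
    # Hard attention: position i attends only to position i - 1.
    A = np.zeros((seq_len, seq_len))
    for i in range(1, seq_len):
        A[i, i - 1] = 1.0
    A[0, 0] = 1.0  # first position has no left neighbour; attend to itself
    return A

tokens = np.eye(5)             # toy one-hot token embeddings
shifted = shift_attention(5) @ tokens
print(shifted.argmax(axis=1))  # [0 0 1 2 3]: the sequence shifted by one
```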
arXiv Detail & Related papers (2023-04-04T09:50:07Z)
- Logical Message Passing Networks with One-hop Inference on Atomic Formulas [57.47174363091452]
We propose a framework for complex query answering that decouples Knowledge Graph embeddings from neural set operators.
On top of the query graph, we propose the Logical Message Passing Neural Network (LMPNN) that connects the local one-hop inferences on atomic formulas to the global logical reasoning.
Our approach yields the new state-of-the-art neural CQA model.
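A hedged sketch of the one-hop message passing idea: each atomic formula r(u, v) in the query graph sends a message to v computed from u's embedding and the relation embedding, and v aggregates its incoming messages. The additive transform below is a stand-in for an embedding model's one-hop inference, not the paper's exact operator:

```python
import torch

DIM = 32
entity_emb = {"u1": torch.randn(DIM), "u2": torch.randn(DIM)}
relation_emb = {"r1": torch.randn(DIM), "r2": torch.randn(DIM)}

# Atomic formulas of the query graph: (source node, relation, target node).
atoms = [("u1", "r1", "v"), ("u2", "r2", "v")]

def one_hop_message(src_vec, rel_vec):
    # Stand-in one-hop inference (translation-style); any KG embedding works.
    return src_vec + rel_vec

messages = [one_hop_message(entity_emb[s], relation_emb[r])
            for s, r, t in atoms if t == "v"]
v_embedding = torch.stack(messages).mean(dim=0)  # aggregate over formulas
print(v_embedding.shape)  # torch.Size([32])
```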
arXiv Detail & Related papers (2023-01-21T02:34:06Z)
- Teaching Neural Module Networks to Do Arithmetic [54.06832128723388]
We upgrade NMNs by bridging the gap between their interpreter and complex questions.
We introduce addition and subtraction modules that perform numerical reasoning over numbers in the passage.
On a subset of DROP, experimental results show that our proposed methods improve NMNs' numerical reasoning skills, yielding a 17.7% gain in F1 score.
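One common way to realize such modules (a sketch under assumptions, not necessarily this paper's design) is to let each module consume two soft attentions over the numbers in the passage and output a joint distribution over all pairwise results:

```python
import torch

def add_module(p1, p2, numbers):
    # P(result = n_i + n_j) = p1[i] * p2[j]
    results = numbers.unsqueeze(1) + numbers.unsqueeze(0)  # (N, N) sums
    probs = p1.unsqueeze(1) * p2.unsqueeze(0)              # joint attention
    return results, probs

def sub_module(p1, p2, numbers):
    results = numbers.unsqueeze(1) - numbers.unsqueeze(0)  # (N, N) differences
    probs = p1.unsqueeze(1) * p2.unsqueeze(0)
    return results, probs

numbers = torch.tensor([10.0, 3.0, 42.0])  # numbers extracted from passage
p1 = torch.tensor([0.90, 0.05, 0.05])      # attention over the first operand
p2 = torch.tensor([0.10, 0.85, 0.05])      # attention over the second operand
results, probs = sub_module(p1, p2, numbers)
print(results.flatten()[probs.argmax()].item())  # 7.0, i.e. 10 - 3
```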
arXiv Detail & Related papers (2022-10-06T06:38:04Z)
- How Modular Should Neural Module Networks Be for Systematic Generalization? [4.533408938245526]
NMNs target Visual Question Answering (VQA) by composing modules that each tackle a sub-task.
In this paper, we demonstrate that the stage and the degree at which modularity is defined have a large influence on systematic generalization.
arXiv Detail & Related papers (2021-06-15T14:13:47Z)
- Question Answering over Knowledge Bases by Leveraging Semantic Parsing and Neuro-Symbolic Reasoning [73.00049753292316]
We propose a semantic parsing and reasoning-based Neuro-Symbolic Question Answering (NSQA) system.
NSQA achieves state-of-the-art performance on QALD-9 and LC-QuAD 1.0.
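In the same parse-then-reason spirit (a toy sketch, not NSQA's actual pipeline), a question is first mapped to a logical form and then answered by a symbolic reasoner over KB triples; both stages below are deliberately minimal stand-ins:

```python
# Toy KB of (subject, predicate, object) triples; contents are illustrative.
KB = {("Marie_Curie", "field", "Physics"), ("Marie_Curie", "field", "Chemistry")}

def parse(question):
    # Stand-in semantic parser; a real system produces a full logical form.
    if question.startswith("What fields did "):
        entity = (question.removeprefix("What fields did ")
                          .removesuffix(" work in?").replace(" ", "_"))
        return ("?x", lambda x: (entity, "field", x) in KB)
    raise ValueError("unsupported question pattern")

def reason(logical_form):
    # Stand-in reasoner: enumerate candidate bindings and test the predicate.
    _, holds = logical_form
    candidates = {obj for (_, _, obj) in KB}
    return sorted(c for c in candidates if holds(c))

print(reason(parse("What fields did Marie Curie work in?")))
# ['Chemistry', 'Physics']
```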
arXiv Detail & Related papers (2020-12-03T05:17:55Z)
- Obtaining Faithful Interpretations from Compositional Neural Networks [72.41100663462191]
We evaluate the intermediate outputs of NMNs on the NLVR2 and DROP datasets.
We find that the intermediate outputs differ from the expected output, illustrating that the network structure does not provide a faithful explanation of model behaviour.
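The kind of check this finding implies can be sketched as follows: compare the passage tokens a module actually attends to against the tokens a faithful execution should select, and score the overlap. The token-set formulation is an assumption for illustration:

```python
def intermediate_f1(predicted_tokens, gold_tokens):
    # Overlap between a module's attended tokens and the expected tokens.
    predicted, gold = set(predicted_tokens), set(gold_tokens)
    overlap = len(predicted & gold)
    if overlap == 0:
        return 0.0
    precision = overlap / len(predicted)
    recall = overlap / len(gold)
    return 2 * precision * recall / (precision + recall)

# A faithful 'filter' module should select tokens 3-5; this one did not.
print(intermediate_f1(predicted_tokens=[7, 8], gold_tokens=[3, 4, 5]))  # 0.0
```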
arXiv Detail & Related papers (2020-05-02T06:50:35Z)