Enhancing Neural Mathematical Reasoning by Abductive Combination with
Symbolic Library
- URL: http://arxiv.org/abs/2203.14487v1
- Date: Mon, 28 Mar 2022 04:19:39 GMT
- Title: Enhancing Neural Mathematical Reasoning by Abductive Combination with
Symbolic Library
- Authors: Yangyang Hu, Yang Yu
- Abstract summary: This paper demonstrates that some abilities can be achieved through abductive combination with discrete systems that have been programmed with human knowledge.
On a mathematical reasoning dataset, we adopt the recently proposed abductive learning framework and propose the ABL-Sym algorithm, which combines Transformer models with a symbolic mathematics library.
- Score: 5.339286921277565
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Mathematical reasoning has recently been shown to be a hard challenge
for neural systems. Abilities including expression translation, logical
reasoning, and the acquisition of mathematical knowledge appear to be essential
to overcoming the challenge. This paper demonstrates that some of these
abilities can be achieved through abductive combination with discrete systems
that have been programmed with human knowledge. On a mathematical reasoning
dataset, we adopt the recently proposed abductive learning framework and
propose the ABL-Sym algorithm, which combines Transformer neural models with a
symbolic mathematics library. ABL-Sym shows a 9.73% accuracy improvement on
interpolation tasks and a 47.22% accuracy improvement on extrapolation tasks
over state-of-the-art approaches. Online demonstration: http://math.polixir.ai
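The abductive combination described in the abstract can be pictured as a loop: the neural model proposes candidate symbolic programs for a question, the symbolic library executes them, and the candidate consistent with the known answer is kept as a pseudo-label for further training. The sketch below is purely illustrative, not the authors' implementation: the Transformer is stood in for by a fixed candidate list, and the symbolic mathematics library by Python's `eval` over a restricted namespace; all function names and examples are hypothetical.

```python
import math

def propose_candidates(question):
    # Stand-in for a Transformer's beam-search output: candidate
    # symbolic programs expressed in the library's language.
    return ["x**2 + 1", "2*x + 1", "x**3"]

def execute(program, x):
    # Stand-in for the symbolic component: run a candidate program
    # on an input, discarding candidates that fail to evaluate.
    try:
        return eval(program, {"__builtins__": {}, "math": math}, {"x": x})
    except Exception:
        return None

def abduce_label(question, examples):
    # Abduction step: keep the first candidate consistent with all
    # known input/output pairs; it becomes a training pseudo-label.
    for cand in propose_candidates(question):
        if all(execute(cand, x) == y for x, y in examples):
            return cand
    return None

label = abduce_label("What is f(x) given f(2)=5 and f(3)=7?", [(2, 5), (3, 7)])
print(label)  # → "2*x + 1", the only candidate matching both pairs
```

The key design point this mirrors is that the symbolic executor acts as a hard filter: neural proposals that the discrete system can refute never become training signal.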
Related papers
- Brain-Inspired Two-Stage Approach: Enhancing Mathematical Reasoning by
Imitating Human Thought Processes [6.512667145063511]
We propose a novel approach, named Brain, that imitates human thought processes to enhance mathematical reasoning abilities.
First, we achieve SOTA performance compared with Code LLaMA 7B based models through this method.
Second, we find that plans can be explicitly extracted from natural language, code, or formal language.
arXiv Detail & Related papers (2024-02-23T17:40:31Z)
- The Role of Foundation Models in Neuro-Symbolic Learning and Reasoning [54.56905063752427]
Neuro-Symbolic AI (NeSy) holds promise to ensure the safe deployment of AI systems.
Existing pipelines that train the neural and symbolic components sequentially require extensive labelling.
New architecture, NeSyGPT, fine-tunes a vision-language foundation model to extract symbolic features from raw data.
arXiv Detail & Related papers (2024-02-02T20:33:14Z)
- Bridging Logic and Learning: A Neural-Symbolic Approach for Enhanced
Reasoning in Neural Models (ASPER) [0.13053649021965597]
This paper introduces an approach designed to improve the performance of neural models in learning reasoning tasks.
It achieves this by integrating Answer Set Programming solvers and domain-specific expertise.
The model shows a significant improvement in solving Sudoku puzzles using only 12 puzzles for training and testing.
arXiv Detail & Related papers (2023-12-18T19:06:00Z)
- Generating by Understanding: Neural Visual Generation with Logical
Symbol Groundings [26.134405924834525]
We propose a neurosymbolic learning approach, Abductive visual Generation (AbdGen), for integrating logic programming systems with neural visual generative models.
Results show that compared to the baseline approaches, AbdGen requires significantly less labeled data for symbol assignment.
AbdGen can effectively learn underlying logical generative rules from data, which is beyond the capability of existing approaches.
arXiv Detail & Related papers (2023-10-26T15:00:21Z)
- Learning to solve arithmetic problems with a virtual abacus [0.35911228556176483]
We introduce a deep reinforcement learning framework that makes it possible to simulate how cognitive agents could learn to solve arithmetic problems.
The proposed model successfully learns to perform multi-digit additions and subtractions, achieving an error rate below 1%.
We analyze the most common error patterns to better understand the limitations and biases resulting from our design choices.
arXiv Detail & Related papers (2023-01-17T13:25:52Z)
- A Survey of Deep Learning for Mathematical Reasoning [71.88150173381153]
We review the key tasks, datasets, and methods at the intersection of mathematical reasoning and deep learning over the past decade.
Recent advances in large-scale neural language models have opened up new benchmarks and opportunities to use deep learning for mathematical reasoning.
arXiv Detail & Related papers (2022-12-20T18:46:16Z)
- Neuro-Symbolic Learning of Answer Set Programs from Raw Data [54.56905063752427]
Neuro-Symbolic AI aims to combine the interpretability of symbolic techniques with the ability of deep learning to learn from raw data.
We introduce Neuro-Symbolic Inductive Learner (NSIL), an approach that trains a general neural network to extract latent concepts from raw data.
NSIL learns expressive knowledge, solves computationally complex problems, and achieves state-of-the-art performance in terms of accuracy and data efficiency.
arXiv Detail & Related papers (2022-05-25T12:41:59Z)
- Recognizing and Verifying Mathematical Equations using Multiplicative
Differential Neural Units [86.9207811656179]
We show that memory-augmented neural networks (NNs) can achieve higher-order, memory-augmented extrapolation, stable performance, and faster convergence.
Our models achieve a 1.53% average improvement over current state-of-the-art methods in equation verification, and a 2.22% Top-1 and 2.96% Top-5 average accuracy improvement for equation completion.
arXiv Detail & Related papers (2021-04-07T03:50:11Z)
- SMART: A Situation Model for Algebra Story Problems via Attributed
Grammar [74.1315776256292]
We introduce the concept of a "situation model", which originates in psychology studies as a representation of the mental states of humans in problem-solving.
We show that the proposed model outperforms all previous neural solvers by a large margin while preserving much better interpretability.
arXiv Detail & Related papers (2020-12-27T21:03:40Z)
- Machine Number Sense: A Dataset of Visual Arithmetic Problems for
Abstract and Relational Reasoning [95.18337034090648]
We propose a dataset, Machine Number Sense (MNS), consisting of visual arithmetic problems automatically generated with a grammar model, the And-Or Graph (AOG).
These visual arithmetic problems take the form of geometric figures.
We benchmark the MNS dataset using four predominant neural network models as baselines in this visual reasoning task.
arXiv Detail & Related papers (2020-04-25T17:14:58Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences.