Learning Compositional Rules via Neural Program Synthesis
- URL: http://arxiv.org/abs/2003.05562v2
- Date: Thu, 22 Oct 2020 19:48:43 GMT
- Title: Learning Compositional Rules via Neural Program Synthesis
- Authors: Maxwell I. Nye, Armando Solar-Lezama, Joshua B. Tenenbaum, Brenden M. Lake
- Abstract summary: We present a neuro-symbolic model which learns entire rule systems from a small set of examples.
Instead of directly predicting outputs from inputs, we train our model to induce the explicit system of rules governing a set of previously seen examples.
- Score: 67.62112086708859
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Many aspects of human reasoning, including language, require learning rules
from very little data. Humans can do this, often learning systematic rules from
very few examples, and combining these rules to form compositional rule-based
systems. Current neural architectures, on the other hand, often fail to
generalize in a compositional manner, especially when evaluated in ways that
vary systematically from training. In this work, we present a neuro-symbolic
model which learns entire rule systems from a small set of examples. Instead of
directly predicting outputs from inputs, we train our model to induce the
explicit system of rules governing a set of previously seen examples, drawing
upon techniques from the neural program synthesis literature. Our
rule-synthesis approach outperforms neural meta-learning techniques in three
domains: an artificial instruction-learning domain used to evaluate human
learning, the SCAN challenge datasets, and learning rule-based translations of
number words into integers for a wide range of human languages.
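
To make the rule-synthesis idea concrete, here is a minimal Python sketch of the kind of interpretation grammar such a model outputs for a SCAN-like domain: an ordered list of rewrite rules with variables, applied recursively. The specific rules and helper functions below are illustrative assumptions, not the authors' implementation.

```python
# Illustrative interpretation grammar: ordered rewrite rules with variables
# (u1, u2). The interpreter applies the first matching rule and recursively
# interprets what the variables bound to. Toy rules, not the learned grammar.
RULES = [
    ("jump", "JUMP"),
    ("walk", "WALK"),
    ("u1 twice", "[u1] [u1]"),   # repeat the interpretation of u1
    ("u1 and u2", "[u1] [u2]"),  # interpret u1, then u2
]

def match(pattern, words):
    """Bind pattern tokens to words; variables match a single word here."""
    ptoks = pattern.split()
    if len(ptoks) != len(words):
        return None
    binding = {}
    for p, w in zip(ptoks, words):
        if p.startswith("u"):    # variable token (simplified convention)
            binding[p] = w
        elif p != w:             # literal token must match exactly
            return None
    return binding

def interpret(command):
    words = command.split()
    for pattern, rhs in RULES:
        binding = match(pattern, words)
        if binding is not None:
            out = rhs
            for var, sub in binding.items():
                out = out.replace(f"[{var}]", interpret(sub))
            return out
    raise ValueError(f"no rule matches: {command}")

print(interpret("jump twice"))     # JUMP JUMP
print(interpret("walk and jump"))  # WALK JUMP
```

Because the rules compose, a handful of learned primitives and function words already determine outputs for commands never seen during training, which is what lets this kind of approach generalize from few examples.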
Related papers
- Compositional Program Generation for Few-Shot Systematic Generalization [59.57656559816271]
This study presents a neuro-symbolic architecture called the Compositional Program Generator (CPG).
CPG has three key features: modularity, composition, and abstraction, in the form of grammar rules.
It achieves perfect generalization on both the SCAN and COGS benchmarks using just 14 examples for SCAN and 22 examples for COGS.
arXiv Detail & Related papers (2023-09-28T14:33:20Z)
- Learning Symbolic Rules over Abstract Meaning Representations for Textual Reinforcement Learning [63.148199057487226]
We propose a modular, NEuroSymbolic Textual Agent (NESTA) that combines generic semantic generalization with a rule induction system to learn interpretable rules as policies.
Our experiments show that the proposed NESTA method outperforms deep reinforcement learning-based techniques by achieving better generalization to unseen test games and learning from fewer training interactions.
arXiv Detail & Related papers (2023-07-05T23:21:05Z)
- Neural-Symbolic Recursive Machine for Systematic Generalization [113.22455566135757]
We introduce the Neural-Symbolic Recursive Machine (NSR), whose core is a Grounded Symbol System (GSS).
NSR integrates neural perception, syntactic parsing, and semantic reasoning.
We evaluate NSR's efficacy across four challenging benchmarks designed to probe systematic generalization capabilities.
arXiv Detail & Related papers (2022-10-04T13:27:38Z)
- A Neural Model for Regular Grammar Induction [8.873449722727026]
We treat grammars as a model of computation and propose a novel neural approach to the induction of regular grammars from positive and negative examples.
Our model is fully explainable, its intermediate results are directly interpretable as partial parses, and it can be used to learn arbitrary regular grammars when provided with sufficient data (the example-consistency criterion behind this setting is sketched after this entry).
arXiv Detail & Related papers (2022-09-23T14:53:23Z)
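
As a toy illustration of the acceptance criterion behind induction from positive and negative examples (the paper's model is neural and learns the grammar itself; this sketch only shows the consistency test, over made-up candidates expressed as regular expressions):

```python
# A candidate grammar is consistent iff it accepts every positive example
# and rejects every negative one. Data and candidates are toy assumptions.
import re

positives = ["ab", "aab", "aaab"]  # intended language: one or more a's, then b
negatives = ["b", "ba", "aa"]

candidates = [r"a*b", r"a+b", r"(ab)+", r"a*b+"]

def consistent(pattern):
    accepts_all = all(re.fullmatch(pattern, s) for s in positives)
    rejects_all = not any(re.fullmatch(pattern, s) for s in negatives)
    return accepts_all and rejects_all

for pattern in candidates:
    if consistent(pattern):
        print("consistent grammar:", pattern)  # prints: a+b
```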
- Dependency-based Mixture Language Models [53.152011258252315]
We introduce the Dependency-based Mixture Language Models.
Specifically, we first train neural language models with a novel dependency modeling objective.
We then formulate the next-token probability by mixing the previous dependency modeling probability distributions with self-attention (sketched after this entry).
arXiv Detail & Related papers (2022-03-19T06:28:30Z)
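
A hedged numpy sketch of the mixture formulation summarized above; the array names, shapes, and softmax gating are assumptions for illustration, not the paper's code:

```python
# Next-token distribution as an attention-weighted mixture of per-context
# dependency distributions p_dep(next | x_j). Random numbers stand in for
# trained model outputs.
import numpy as np

rng = np.random.default_rng(0)
T, V = 4, 10  # context length, vocabulary size

# p_dep[j]: dependency model's distribution over the next token given x_j
p_dep = rng.random((T, V))
p_dep /= p_dep.sum(axis=1, keepdims=True)

# self-attention scores of the current position over the T context tokens
scores = rng.random(T)
alpha = np.exp(scores) / np.exp(scores).sum()  # softmax mixture weights

p_next = alpha @ p_dep  # (V,) valid distribution: convex mixture of rows
assert np.isclose(p_next.sum(), 1.0)
print(p_next.argmax())  # most likely next token id
```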
- Non-neural Models Matter: A Re-evaluation of Neural Referring Expression Generation Systems [6.651864489482537]
In recent years, neural models have often outperformed rule-based and classic machine learning approaches in NLG.
We argue that non-neural approaches should not be overlooked, since, for some tasks, well-designed non-neural approaches achieve better performance than neural ones.
arXiv Detail & Related papers (2022-03-15T21:47:25Z)
- Sample-efficient Linguistic Generalizations through Program Synthesis: Experiments with Phonology Problems [12.661592819420727]
We develop a synthesis model to learn phonology rules as programs in a domain-specific language (a toy rewrite rule is sketched after this entry).
We test the ability of our models to generalize from few training examples using our new dataset of problems from the Linguistics Olympiad.
arXiv Detail & Related papers (2021-06-11T18:36:07Z)
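
As a toy example of the kind of rule such a domain-specific language might express (the paper's actual DSL is not reproduced here), the sketch below applies an SPE-style rewrite rule, target -> replacement / left _ right, with a regular expression:

```python
# Apply a context-sensitive rewrite rule using lookbehind/lookahead.
# The rule below (intervocalic s -> z) is a made-up illustration.
import re

def apply_rule(target, replacement, left, right, word):
    """Rewrite `target` as `replacement` only between `left` and `right`."""
    pattern = f"(?<={left}){target}(?={right})"
    return re.sub(pattern, replacement, word)

V = "[aeiou]"  # vowels
print(apply_rule("s", "z", V, V, "casa"))   # casa -> caza
print(apply_rule("s", "z", V, V, "spala"))  # s not intervocalic -> spala
```

A synthesized program for a Linguistics Olympiad problem would be an ordered list of such rules, searched for so that applying them reproduces the given word pairs.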
- NSL: Hybrid Interpretable Learning From Noisy Raw Data [66.15862011405882]
This paper introduces a hybrid neural-symbolic learning framework, called NSL, that learns interpretable rules from labelled unstructured data.
NSL combines pre-trained neural networks for feature extraction with FastLAS, a state-of-the-art ILP system for rule learning under the answer set semantics.
We demonstrate that NSL is able to learn robust rules from MNIST data and achieve comparable or superior accuracy when compared to neural network and random forest baselines.
arXiv Detail & Related papers (2020-12-09T13:02:44Z)
- Systematic Generalization on gSCAN with Language Conditioned Embedding [19.39687991647301]
Systematic generalization refers to a learning algorithm's ability to extrapolate learned behavior to unseen situations.
We propose a novel method that learns objects' contextualized embeddings with dynamic message passing conditioned on the input natural language (a rough sketch follows this entry).
arXiv Detail & Related papers (2020-09-11T17:35:05Z)
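
A rough numpy sketch of one way to read "dynamic message passing conditioned on the input natural language"; the dimensions, random stand-in parameters, and gating scheme are illustrative assumptions, not the paper's architecture:

```python
# One round of message passing where pairwise weights depend on the
# instruction encoding, so the same scene yields different embeddings
# under different commands.
import numpy as np

rng = np.random.default_rng(1)
N, D = 5, 8  # number of objects, embedding size

X = rng.standard_normal((N, D))   # object embeddings
lang = rng.standard_normal(D)     # pooled instruction encoding
W = rng.standard_normal((D, D))   # projection (learned in practice)

# language-conditioned relevance of object j to object i
scores = (X @ W) @ (X * lang).T                 # (N, N)
scores -= scores.max(axis=1, keepdims=True)     # numerical stability
A = np.exp(scores) / np.exp(scores).sum(axis=1, keepdims=True)

X_new = X + A @ X   # residual update with attention-weighted messages
print(X_new.shape)  # (5, 8)
```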
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.