Hierarchical Neural Program Synthesis
- URL: http://arxiv.org/abs/2303.06018v1
- Date: Thu, 9 Mar 2023 18:20:07 GMT
- Title: Hierarchical Neural Program Synthesis
- Authors: Linghan Zhong, Ryan Lindeborg, Jesse Zhang, Joseph J. Lim, Shao-Hua Sun
- Abstract summary: Program synthesis aims to automatically construct human-readable programs that satisfy given task specifications.
We present a scalable program synthesis framework that instead synthesizes a program by hierarchically composing programs.
We extensively evaluate our proposed framework in a string transformation domain with input/output pairs.
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Program synthesis aims to automatically construct human-readable programs
that satisfy given task specifications, such as input/output pairs or
demonstrations. Recent works have demonstrated encouraging results in a variety
of domains, such as string transformation, tensor manipulation, and describing
behaviors of embodied agents. Most existing program synthesis methods are
designed to synthesize programs from scratch, generating a program token by
token, line by line. This fundamentally prevents these methods from scaling up
to synthesize programs that are longer or more complex. In this work, we
present a scalable program synthesis framework that instead synthesizes a
program by hierarchically composing programs. Specifically, we first learn a
task embedding space and a program decoder that can decode a task embedding
into a program. Then, we train a high-level module to comprehend the task
specification (e.g., input/output pairs or demonstrations) from long programs
and produce a sequence of task embeddings, which are then decoded by the
program decoder and composed to yield the synthesized program. We extensively
evaluate our proposed framework in a string transformation domain with
input/output pairs. The experimental results demonstrate that the proposed
framework can synthesize programs that are significantly longer and more
complex than the programs considered in prior program synthesis works. Website
at https://thoughtp0lice.github.io/hnps_web/
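The two-level pipeline described in the abstract can be illustrated with a toy sketch. Everything below (the two-subtask string DSL, the hand-written `high_level_module`, the symbolic "embeddings") is an illustrative stand-in for the learned neural components, not the authors' implementation:

```python
# Toy sketch of the hierarchical pipeline from the abstract. Every name
# here (high_level_module, program_decoder, the two-subtask string DSL,
# the symbolic "embeddings") is an illustrative stand-in for the learned
# components, not the paper's actual modules.

def high_level_module(examples):
    """Maps a task specification (input/output pairs) to a sequence of
    task 'embeddings' -- here just symbolic subtask tags, standing in
    for vectors produced by a learned high-level module."""
    inp, out = examples[0]
    subtasks = []
    if out.rstrip("!").isupper():   # output looks uppercased
        subtasks.append("upper")
    if out.endswith("!"):           # output has a suffix appended
        subtasks.append("bang")
    return subtasks

def program_decoder(embedding):
    """Decodes one task embedding into a subprogram (a Python callable
    here; a DSL program in the paper)."""
    table = {
        "upper": lambda s: s.upper(),
        "bang": lambda s: s + "!",
    }
    return table[embedding]

def synthesize(examples):
    """Composes the decoded subprograms into the final program."""
    subprograms = [program_decoder(e) for e in high_level_module(examples)]
    def program(s):
        for f in subprograms:
            s = f(s)
        return s
    return program

examples = [("hello", "HELLO!")]
prog = synthesize(examples)
assert all(prog(i) == o for i, o in examples)
```

In the paper both modules are trained networks operating on continuous embeddings; the point of the sketch is only the two-level structure: decompose the specification into subtasks, decode each into a subprogram, then compose.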
Related papers
- A Conversational Paradigm for Program Synthesis [110.94409515865867]
We propose a conversational program synthesis approach via large language models.
We train a family of large language models, called CodeGen, on natural language and programming language data.
Our findings show the emergence of conversational capabilities and the effectiveness of the proposed conversational program synthesis paradigm.
arXiv Detail & Related papers (2022-03-25T06:55:15Z)
- Latent Execution for Neural Program Synthesis Beyond Domain-Specific Languages [97.58968222942173]
We take the first step to synthesize C programs from input-output examples.
In particular, we propose LaSynth, which learns a latent representation to approximate the execution of partially generated programs.
We show that training on these synthesized programs further improves the prediction performance for both Karel and C program synthesis.
arXiv Detail & Related papers (2021-06-29T02:21:32Z)
- Representing Partial Programs with Blended Abstract Semantics [62.20775388513027]
We introduce a technique for representing partially written programs in a program synthesis engine.
We learn an approximate execution model implemented as a modular neural network.
We show that these hybrid neuro-symbolic representations enable execution-guided synthesizers to use more powerful language constructs.
arXiv Detail & Related papers (2020-12-23T20:40:18Z)
- Latent Programmer: Discrete Latent Codes for Program Synthesis [56.37993487589351]
In many sequence learning tasks, such as program synthesis and document summarization, a key problem is searching over a large space of possible output sequences.
We propose to learn representations of the outputs that are specifically meant for search: rich enough to specify the desired output but compact enough to make search more efficient.
We introduce the Latent Programmer, a program synthesis method that first predicts a discrete latent code from input/output examples, and then generates the program in the target language.
arXiv Detail & Related papers (2020-12-01T10:11:35Z)
- BUSTLE: Bottom-Up Program Synthesis Through Learning-Guided Exploration [72.88493072196094]
We present a new synthesis approach that leverages learning to guide a bottom-up search over programs.
In particular, we train a model to prioritize compositions of intermediate values during search conditioned on a set of input-output examples.
We show that the combination of learning and bottom-up search is remarkably effective, even with simple supervised learning approaches.
arXiv Detail & Related papers (2020-07-28T17:46:18Z)
- Synthesize, Execute and Debug: Learning to Repair for Neural Program Synthesis [81.54148730967394]
We propose SED, a neural program generation framework that incorporates synthesis, execution, and debug stages.
SED first produces initial programs using the neural program synthesizer component, then utilizes a neural program debugger to iteratively repair the generated programs.
On Karel, a challenging input-output program synthesis benchmark, SED reduces the error rate of the neural program synthesizer itself by a considerable margin, and outperforms the standard beam search for decoding.
arXiv Detail & Related papers (2020-07-16T04:15:47Z)
- Program Synthesis with Pragmatic Communication [28.24612900419843]
This work introduces a new inductive bias derived by modeling the program synthesis task as rational communication.
A user study finds that end-user participants communicate more effectively with the pragmatic program synthesizer over a non-pragmatic one.
arXiv Detail & Related papers (2020-07-09T20:55:44Z)
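The learning-guided bottom-up search described in the BUSTLE entry above can be sketched on a toy string domain. The prefix-match scorer below stands in for BUSTLE's learned model, the two-operation DSL is invented for illustration, and a real synthesizer would also track the expression that produced each value so it can return a program:

```python
# Toy sketch of learning-guided bottom-up search in the spirit of BUSTLE.
# The two-operation string DSL and the prefix-match scorer are invented
# stand-ins: BUSTLE uses a learned model to prioritize intermediate
# values, and records how each value was built so it can emit a program;
# here we only track the values themselves for brevity.

def score(value, target):
    """Heuristic priority: longer shared prefix with the target output."""
    n = 0
    for x, y in zip(value, target):
        if x != y:
            break
        n += 1
    return -n  # more negative = more promising (sorted ascending)

def bottom_up(inputs, target, constants=("!",), rounds=4, beam=16):
    """Enumerate values bottom-up, pruning to the best-scoring ones."""
    values = set(inputs) | set(constants)
    for _ in range(rounds):
        if target in values:
            return target
        new = set()
        for v in values:
            new.add(v.upper())     # op 1: uppercase
            for w in values:
                new.add(v + w)     # op 2: concatenation
        values |= new
        # Prune to the most promising intermediates -- the step that
        # BUSTLE delegates to its learned model.
        values = set(sorted(values, key=lambda v: score(v, target))[:beam])
    return target if target in values else None

assert bottom_up(("hello",), "HELLO!") == "HELLO!"
```

The pruning step is where learning pays off: without it the value pool grows combinatorially each round, and a good priority function keeps the search focused on intermediates that plausibly lead to the target.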