Latent Programmer: Discrete Latent Codes for Program Synthesis
- URL: http://arxiv.org/abs/2012.00377v1
- Date: Tue, 1 Dec 2020 10:11:35 GMT
- Title: Latent Programmer: Discrete Latent Codes for Program Synthesis
- Authors: Joey Hong and David Dohan and Rishabh Singh and Charles Sutton and
Manzil Zaheer
- Abstract summary: In many sequence learning tasks, such as program synthesis and document summarization, a key problem is searching over a large space of possible output sequences.
We propose to learn representations of the outputs that are specifically meant for search: rich enough to specify the desired output but compact enough to make search more efficient.
We introduce the Latent Programmer, a program synthesis method that first predicts a discrete latent code from input/output examples, and then generates the program in the target language.
- Score: 56.37993487589351
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: In many sequence learning tasks, such as program synthesis and document
summarization, a key problem is searching over a large space of possible output
sequences. We propose to learn representations of the outputs that are
specifically meant for search: rich enough to specify the desired output but
compact enough to make search more efficient. Discrete latent codes are
appealing for this purpose, as they naturally allow sophisticated combinatorial
search strategies. The latent codes are learned using a self-supervised
learning principle, in which first a discrete autoencoder is trained on the
output sequences, and then the resulting latent codes are used as intermediate
targets for the end-to-end sequence prediction task. Based on these insights,
we introduce the \emph{Latent Programmer}, a program synthesis method that
first predicts a discrete latent code from input/output examples, and then
generates the program in the target language. We evaluate the Latent Programmer
on two domains: synthesis of string transformation programs, and generation of
programs from natural language descriptions. We demonstrate that the discrete
latent representation significantly improves synthesis accuracy.
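
The two-stage design described in the abstract can be summarized with a small sketch. The snippet below is an illustrative toy, not the authors' implementation: it assumes a VQ-style nearest-codebook quantizer as the discrete bottleneck, GRU encoders and decoders in place of the paper's sequence models, greedy decoding instead of the paper's search over latent codes, and made-up vocabulary sizes and class names (`DiscreteProgramAutoencoder`, `TwoStageSynthesizer`).

```python
import torch
import torch.nn as nn


class DiscreteProgramAutoencoder(nn.Module):
    """Toy discrete autoencoder: compresses a program into a few discrete code ids
    (VQ-style nearest-codebook lookup) and reconstructs the program from them.
    The emitted code ids would serve as the intermediate prediction targets."""

    def __init__(self, prog_vocab, num_codes=16, dim=64, latent_len=4):
        super().__init__()
        self.embed = nn.Embedding(prog_vocab, dim)
        self.encoder = nn.GRU(dim, dim, batch_first=True)
        self.codebook = nn.Embedding(num_codes, dim)   # the discrete latent vocabulary
        self.decoder = nn.GRU(dim, dim, batch_first=True)
        self.out = nn.Linear(dim, prog_vocab)
        self.latent_len = latent_len

    def encode(self, program_tokens):
        h, _ = self.encoder(self.embed(program_tokens))
        slots = h[:, -self.latent_len:, :]             # a few summary "slots"
        # Nearest codebook entry per slot, with a straight-through gradient.
        dist = (slots.unsqueeze(2) - self.codebook.weight[None, None]).pow(2).sum(-1)
        codes = dist.argmin(-1)                        # (batch, latent_len) code ids
        quantized = slots + (self.codebook(codes) - slots).detach()
        return codes, quantized

    def forward(self, program_tokens):
        codes, quantized = self.encode(program_tokens)
        init = quantized.mean(dim=1).unsqueeze(0)      # condition decoder on the codes
        dec, _ = self.decoder(self.embed(program_tokens), init)
        return codes, self.out(dec)                    # reconstruction logits


class TwoStageSynthesizer(nn.Module):
    """Stage 1 predicts a short discrete code sequence from the spec;
    stage 2 generates program tokens conditioned on the spec and the codes."""

    def __init__(self, spec_vocab, prog_vocab, num_codes=16, dim=64, latent_len=4):
        super().__init__()
        self.spec_embed = nn.Embedding(spec_vocab, dim)
        self.spec_enc = nn.GRU(dim, dim, batch_first=True)
        self.code_head = nn.Linear(dim, num_codes)     # stage 1: one code per slot
        self.code_embed = nn.Embedding(num_codes, dim)
        self.prog_embed = nn.Embedding(prog_vocab, dim)
        self.prog_dec = nn.GRU(dim, dim, batch_first=True)
        self.prog_head = nn.Linear(dim, prog_vocab)    # stage 2: program tokens
        self.latent_len = latent_len

    def predict_codes(self, spec_tokens):
        h, _ = self.spec_enc(self.spec_embed(spec_tokens))
        return self.code_head(h[:, -self.latent_len:, :])  # (batch, latent_len, num_codes)

    def decode_program(self, spec_tokens, codes, max_len=20, bos=0):
        h, _ = self.spec_enc(self.spec_embed(spec_tokens))
        state = (h[:, -1] + self.code_embed(codes).mean(dim=1)).unsqueeze(0)
        tok = torch.full((spec_tokens.size(0), 1), bos, dtype=torch.long)
        generated = []
        for _ in range(max_len):                       # greedy decoding for brevity
            out, state = self.prog_dec(self.prog_embed(tok), state)
            tok = self.prog_head(out[:, -1]).argmax(dim=-1, keepdim=True)
            generated.append(tok)
        return torch.cat(generated, dim=1)


# Two-level inference: pick latent codes first, then generate a program given them.
model = TwoStageSynthesizer(spec_vocab=50, prog_vocab=40)
spec = torch.randint(0, 50, (1, 12))                   # a toy encoding of I/O examples
codes = model.predict_codes(spec).argmax(dim=-1)       # the paper searches over codes here
program = model.decode_program(spec, codes)
print(program.shape)                                   # (1, 20) generated token ids
```

In the method proposed by the paper, the stage-1 predictions are explored combinatorially (candidate code sequences are expanded into candidate programs) rather than taken greedily; the `argmax` calls above only mark where that search would plug in.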
Related papers
- Improved Tree Search for Automatic Program Synthesis [91.3755431537592]
A key element is being able to perform an efficient search in the space of valid programs.
Here, we suggest a variant of MCTS that leads to state-of-the-art results on two vastly different DSLs.
arXiv Detail & Related papers (2023-03-13T15:09:52Z)
- Hierarchical Neural Program Synthesis [19.94176152035497]
Program synthesis aims to automatically construct human-readable programs that satisfy given task specifications.
We present a scalable program synthesis framework that instead synthesizes a program by hierarchically composing programs.
We extensively evaluate our proposed framework in a string transformation domain with input/output pairs.
arXiv Detail & Related papers (2023-03-09T18:20:07Z)
- Latent Execution for Neural Program Synthesis Beyond Domain-Specific Languages [97.58968222942173]
We take the first step to synthesize C programs from input-output examples.
In particular, we propose LaSynth, which learns the latent representation to approximate the execution of partially generated programs.
We show that training on these synthesized programs further improves the prediction performance for both Karel and C program synthesis.
arXiv Detail & Related papers (2021-06-29T02:21:32Z)
- Leveraging Language to Learn Program Abstractions and Search Heuristics [66.28391181268645]
We introduce LAPS (Language for Abstraction and Program Search), a technique for using natural language annotations to guide joint learning of libraries and neurally-guided search models for synthesis.
When integrated into a state-of-the-art library learning system (DreamCoder), LAPS produces higher-quality libraries and improves search efficiency and generalization.
arXiv Detail & Related papers (2021-06-18T15:08:47Z)
- Representing Partial Programs with Blended Abstract Semantics [62.20775388513027]
We introduce a technique for representing partially written programs in a program synthesis engine.
We learn an approximate execution model implemented as a modular neural network.
We show that these hybrid neuro-symbolic representations enable execution-guided synthesizers to use more powerful language constructs.
arXiv Detail & Related papers (2020-12-23T20:40:18Z)
- BUSTLE: Bottom-Up Program Synthesis Through Learning-Guided Exploration [72.88493072196094]
We present a new synthesis approach that leverages learning to guide a bottom-up search over programs.
In particular, we train a model to prioritize compositions of intermediate values during search conditioned on a set of input-output examples.
We show that the combination of learning and bottom-up search is remarkably effective, even with simple supervised learning approaches.
arXiv Detail & Related papers (2020-07-28T17:46:18Z)
- Program Synthesis with Pragmatic Communication [28.24612900419843]
This work introduces a new inductive bias derived by modeling the program synthesis task as rational communication.
A user study finds that end-user participants communicate more effectively with the pragmatic program synthesizer over a non-pragmatic one.
arXiv Detail & Related papers (2020-07-09T20:55:44Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
The site does not guarantee the quality of the listed information and is not responsible for any consequences arising from its use.