Iterative Genetic Improvement: Scaling Stochastic Program Synthesis
- URL: http://arxiv.org/abs/2202.13040v1
- Date: Sat, 26 Feb 2022 02:00:35 GMT
- Title: Iterative Genetic Improvement: Scaling Stochastic Program Synthesis
- Authors: Yuan Yuan and Wolfgang Banzhaf
- Abstract summary: Program synthesis aims to automatically find programs from an underlying programming language that satisfy a given specification.
Existing stochastic program synthesis techniques scale poorly to the large programs this often requires.
Here we propose a new framework for program synthesis, called iterative genetic improvement, to overcome this problem.
- Score: 11.195558777385259
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Program synthesis aims to automatically find programs from an
underlying programming language that satisfy a given specification. While this
has the potential to revolutionize computing, how to search the vast space of
programs efficiently is an unsolved challenge in program synthesis. In cases
where large programs are required for a solution, it is generally believed
that stochastic search has advantages over other classes of search techniques.
Unfortunately, existing stochastic program synthesizers do not meet this
expectation very well, suffering from scalability issues. Here we propose a
new framework for stochastic program synthesis, called iterative genetic
improvement, to overcome this problem; the technique is inspired by the
practice of the software development process. The key idea of iterative
genetic improvement is to apply genetic improvement to improve a current
reference program, and then to iteratively replace the reference program with
the best program found. Compared to traditional stochastic synthesis
approaches, iterative genetic improvement can build up the complexity of
programs incrementally and more robustly. We evaluate the approach on two
program synthesis domains: list manipulation and string transformation. Our
empirical results indicate that this method has considerable advantages over
several representative stochastic program synthesizers, both in terms of
scalability and of solution quality.
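The key loop described in the abstract (apply genetic improvement to a current reference program, then promote the best program found to be the new reference) can be sketched in toy form. The program representation, the point-mutation operator, and the match-counting fitness function below are hypothetical placeholders for illustration, not the paper's actual components:

```python
import random

def fitness(program, target):
    # Hypothetical fitness: number of positions that already match the target.
    return sum(1 for a, b in zip(program, target) if a == b)

def mutate(program):
    # Hypothetical point mutation: overwrite one "instruction" at random.
    child = list(program)
    i = random.randrange(len(child))
    child[i] = random.randint(0, 9)
    return child

def iterative_genetic_improvement(target, iterations=50, pop_size=20, seed=0):
    random.seed(seed)
    # Start from a trivial reference program.
    reference = [0] * len(target)
    for _ in range(iterations):
        # One round of genetic improvement: mutate the current reference
        # and pick the best candidate produced.
        candidates = [mutate(reference) for _ in range(pop_size)]
        best = max(candidates, key=lambda p: fitness(p, target))
        # Replace the reference only if the best candidate is at least as fit,
        # so reference fitness never decreases across iterations.
        if fitness(best, target) >= fitness(reference, target):
            reference = best
    return reference

target = [3, 1, 4, 1, 5]
result = iterative_genetic_improvement(target)
print(fitness(result, target))  # number of matched positions, between 0 and 5
```

Because the reference is only ever replaced by a candidate at least as fit, program quality builds up incrementally from a trivial starting point, which is the robustness property the abstract highlights.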
Related papers
- Genetic Algorithm for Program Synthesis [6.85316573653194]
We improve the search strategy of a deductive program synthesis tool, SuSLik, using evolutionary computation.
Our cross-validation shows that the improvement brought by evolutionary computation generalises to unforeseen problems.
arXiv Detail & Related papers (2022-11-22T01:16:13Z)
- Synthesizing Programs with Continuous Optimization [4.457604452495174]
We present a novel formulation of program synthesis as a continuous optimization problem.
We then propose a mapping scheme to convert the continuous formulation into actual programs.
arXiv Detail & Related papers (2022-11-02T02:12:10Z)
- Compositional Generalization and Decomposition in Neural Program Synthesis [59.356261137313275]
In this paper, we focus on measuring the ability of learned program synthesizers to compositionally generalize.
We first characterize several different axes along which program synthesis methods would be desired to generalize.
We introduce a benchmark suite of tasks to assess these abilities based on two popular existing datasets.
arXiv Detail & Related papers (2022-04-07T22:16:05Z)
- Searching for More Efficient Dynamic Programs [61.79535031840558]
We describe a set of program transformations, a simple metric for assessing the efficiency of a transformed program, and a search procedure to improve this metric.
We show that in practice, automated search can find substantial improvements to the initial program.
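The recipe in this summary (a set of program transformations, a simple efficiency metric, and a search procedure that improves the metric) can be illustrated with a minimal sketch. The rewrite rules and the node-count metric below are illustrative stand-ins, not the paper's actual transformations or cost model:

```python
def cost(expr):
    # Efficiency metric (illustrative): total node count of the expression tree.
    if isinstance(expr, tuple):
        return 1 + sum(cost(a) for a in expr[1:])
    return 1

def rewrite_once(expr):
    # Local rewrite rules (illustrative): basic algebraic simplifications.
    if isinstance(expr, tuple):
        op, *args = expr
        args = [rewrite_once(a) for a in args]
        expr = (op, *args)
        if op == "add" and 0 in args:            # x + 0 -> x
            return args[0] if args[1] == 0 else args[1]
        if op == "mul" and 1 in args:            # x * 1 -> x
            return args[0] if args[1] == 1 else args[1]
        if op == "mul" and 0 in args:            # x * 0 -> 0
            return 0
    return expr

def search(expr):
    # Greedy search: keep rewriting while the metric strictly improves.
    while True:
        improved = rewrite_once(expr)
        if cost(improved) >= cost(expr):
            return expr
        expr = improved

e = ("add", ("mul", "x", 1), ("mul", "y", 0))    # x*1 + y*0
print(search(e))  # prints: x
```

The loop terminates because the metric strictly decreases on every accepted rewrite and is bounded below; richer search procedures would explore multiple rewrite sequences rather than committing greedily.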
arXiv Detail & Related papers (2021-09-14T20:52:55Z)
- Recent Developments in Program Synthesis with Evolutionary Algorithms [1.8047694351309207]
We identify the relevant evolutionary program synthesis approaches and provide an in-depth analysis of their performance.
The most influential approaches we identify are stack-based, grammar-guided, as well as linear genetic programming.
For future work, we encourage researchers to assess solution quality not only by a program's output but also by the path taken toward a solution.
arXiv Detail & Related papers (2021-08-27T11:38:27Z)
- Latent Execution for Neural Program Synthesis Beyond Domain-Specific Languages [97.58968222942173]
We take the first step to synthesize C programs from input-output examples.
In particular, we propose LaSynth, which learns a latent representation to approximate the execution of partially generated programs.
We show that training on these synthesized programs further improves the prediction performance for both Karel and C program synthesis.
arXiv Detail & Related papers (2021-06-29T02:21:32Z)
- Representing Partial Programs with Blended Abstract Semantics [62.20775388513027]
We introduce a technique for representing partially written programs in a program synthesis engine.
We learn an approximate execution model implemented as a modular neural network.
We show that these hybrid neuro-symbolic representations enable execution-guided synthesizers to use more powerful language constructs.
arXiv Detail & Related papers (2020-12-23T20:40:18Z)
- Learning Differentiable Programs with Admissible Neural Heuristics [43.54820901841979]
We study the problem of learning differentiable functions expressed as programs in a domain-specific language.
We frame this optimization problem as a search in a weighted graph whose paths encode top-down derivations of program syntax.
Our key innovation is to view various classes of neural networks as continuous relaxations over the space of programs.
arXiv Detail & Related papers (2020-07-23T16:07:39Z)
- Synthesize, Execute and Debug: Learning to Repair for Neural Program Synthesis [81.54148730967394]
We propose SED, a neural program generation framework that incorporates synthesis, execution, and debug stages.
SED first produces initial programs using the neural program synthesizer component, then utilizes a neural program debugger to iteratively repair the generated programs.
On Karel, a challenging input-output program synthesis benchmark, SED reduces the error rate of the neural program synthesizer itself by a considerable margin, and outperforms the standard beam search for decoding.
arXiv Detail & Related papers (2020-07-16T04:15:47Z)
- Creating Synthetic Datasets via Evolution for Neural Program Synthesis [77.34726150561087]
We show that some program synthesis approaches generalize poorly to data distributions different from that of the randomly generated examples.
We propose a new, adversarial approach to control the bias of synthetic data distributions and show that it outperforms current approaches.
arXiv Detail & Related papers (2020-03-23T18:34:15Z)