Learning-Based Automatic Synthesis of Software Code and Configuration
- URL: http://arxiv.org/abs/2305.15642v2
- Date: Tue, 30 May 2023 06:05:23 GMT
- Title: Learning-Based Automatic Synthesis of Software Code and Configuration
- Authors: Shantanu Mandal
- Abstract summary: Large-scale automatic software generation and configuration is a complex and challenging task.
In the first task, we propose to synthesize software automatically from input-output specifications.
For the second task, we propose to synthesize configurations of large-scale software from different input files.
- Score: 0.951828574518325
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Increasing demands in the software industry and the scarcity of software
engineers motivate researchers and practitioners to automate the process of
software generation and configuration. Large-scale automatic software generation
and configuration is a complex and challenging task. In this proposal, we set
out to investigate this problem by breaking down automatic software generation
and configuration into two different tasks. In the first task, we propose to
synthesize software automatically from input-output specifications. This task
is further broken down into two sub-tasks. The first sub-task is about
synthesizing programs with a genetic algorithm driven by a neural network based
fitness function trained on program traces and specifications. For the second
sub-task, we formulate program synthesis as a continuous optimization problem
and synthesize programs with the covariance matrix adaptation evolution
strategy (CMA-ES, a state-of-the-art continuous optimization method). Finally,
for the second task, we propose to synthesize configurations of large-scale
software from different input files (e.g. software manuals, configuration
files, online blogs, etc.) using a sequence-to-sequence deep learning
mechanism.
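The first sub-task pairs a genetic algorithm with a learned fitness function. As a minimal sketch of that loop (using a toy three-instruction DSL of my own, not the proposal's language, and with direct execution against the specification standing in for the trained neural fitness function):

```python
import random

# Toy DSL (illustrative only): a program is a sequence of unary integer ops.
OPS = {"inc": lambda x: x + 1, "dec": lambda x: x - 1, "double": lambda x: 2 * x}

def run(program, x):
    for op in program:
        x = OPS[op](x)
    return x

def fitness(program, examples):
    # Stand-in for the proposal's neural fitness function (trained on program
    # traces): here we simply execute the candidate and count the satisfied
    # input-output pairs.
    return sum(run(program, i) == o for i, o in examples) / len(examples)

def ga_synthesize(examples, length=4, pop_size=60, gens=200, seed=0):
    rng = random.Random(seed)
    ops = list(OPS)
    population = [[rng.choice(ops) for _ in range(length)] for _ in range(pop_size)]
    best = max(population, key=lambda p: fitness(p, examples))
    for _ in range(gens):
        ranked = sorted(population, key=lambda p: fitness(p, examples), reverse=True)
        if fitness(ranked[0], examples) > fitness(best, examples):
            best = ranked[0]
        if fitness(best, examples) == 1.0:
            return best
        parents = ranked[: pop_size // 2]
        children = [p[:] for p in ranked[:2]]          # elitism
        while len(children) < pop_size:
            a, b = rng.sample(parents, 2)
            cut = rng.randrange(1, length)             # one-point crossover
            child = a[:cut] + b[cut:]
            if rng.random() < 0.3:                     # point mutation
                child[rng.randrange(length)] = rng.choice(ops)
            children.append(child)
        population = children
    return best

examples = [(1, 4), (2, 6), (5, 12)]   # specification for f(x) = 2*x + 2
program = ga_synthesize(examples)
```

In the proposal the fitness signal comes from a trained network rather than direct execution, which matters when candidate programs are expensive or unsafe to run.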
Related papers
- IPSynth: Interprocedural Program Synthesis for Software Security Implementation [3.1119394814248253]
We introduce IPSynth, a novel inter-procedural program synthesis approach that automatically learns the specification of the tactic.
Our results show that our approach can accurately locate corresponding spots in the program, synthesize needed code snippets, add them to the program, and outperform ChatGPT in inter-procedural tactic synthesis tasks.
arXiv Detail & Related papers (2024-03-16T07:12:24Z) - Hierarchical Neural Program Synthesis [19.94176152035497]
Program synthesis aims to automatically construct human-readable programs that satisfy given task specifications.
We present a scalable program synthesis framework that instead synthesizes a program by hierarchically composing programs.
We extensively evaluate our proposed framework in a string transformation domain with input/output pairs.
arXiv Detail & Related papers (2023-03-09T18:20:07Z) - Synthesizing Programs with Continuous Optimization [4.457604452495174]
We present a novel formulation of program synthesis as a continuous optimization problem.
We then propose a mapping scheme to convert the continuous formulation into actual programs.
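The shape of such a continuous formulation can be sketched as follows: a real vector holds one score per (program slot, instruction) pair, a decoding step takes the per-slot argmax, and an evolution strategy minimizes the specification error. Note that a plain isotropic-Gaussian (1, λ) evolution strategy stands in here for CMA-ES (which additionally adapts a full covariance matrix), and the three-instruction DSL is my own illustration, not the paper's:

```python
import random

OPS = ["inc", "dec", "double"]   # toy instruction set (illustrative)
LENGTH = 3                       # fixed program length

def decode(vector):
    # Mapping scheme: split the vector into per-slot scores and take the
    # argmax instruction for each slot, turning a continuous point into a
    # discrete program.
    program = []
    for slot in range(LENGTH):
        scores = vector[slot * len(OPS):(slot + 1) * len(OPS)]
        program.append(OPS[scores.index(max(scores))])
    return program

def run(program, x):
    for op in program:
        x = {"inc": x + 1, "dec": x - 1, "double": 2 * x}[op]
    return x

def loss(vector, examples):
    program = decode(vector)
    return sum(abs(run(program, i) - o) for i, o in examples)

def es_synthesize(examples, sigma=0.5, lam=20, iters=300, seed=0):
    rng = random.Random(seed)
    mean = [rng.gauss(0, 1) for _ in range(LENGTH * len(OPS))]
    for _ in range(iters):
        offspring = [[m + rng.gauss(0, sigma) for m in mean]
                     for _ in range(lam)]
        offspring.sort(key=lambda v: loss(v, examples))
        if loss(offspring[0], examples) == 0:
            return decode(offspring[0])
        mean = offspring[0]              # move toward the best sample
    return decode(mean)

examples = [(0, 2), (3, 8)]              # specification for f(x) = 2*x + 2
program = es_synthesize(examples)
```

Because `decode` is a step function, the loss surface is piecewise constant; CMA-ES's covariance adaptation is one reason the full method copes better with such plateaus than this simplified sketch.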
arXiv Detail & Related papers (2022-11-02T02:12:10Z) - A Conversational Paradigm for Program Synthesis [110.94409515865867]
We propose a conversational program synthesis approach via large language models.
We train a family of large language models, called CodeGen, on natural language and programming language data.
Our findings show the emergence of conversational capabilities and the effectiveness of the proposed conversational program synthesis paradigm.
arXiv Detail & Related papers (2022-03-25T06:55:15Z) - Searching for More Efficient Dynamic Programs [61.79535031840558]
We describe a set of program transformations, a simple metric for assessing the efficiency of a transformed program, and a search procedure to improve this metric.
We show that in practice, automated search can find substantial improvements to the initial program.
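The general shape of such a search can be illustrated with a tiny rewrite system: local simplification rules on an expression tree, node count as the efficiency metric, and a greedy loop that keeps rewriting while the metric improves. (This is my own toy illustration; the paper's transformations target dynamic programs and use a richer metric.)

```python
# Expressions are nested tuples: ("add", a, b) / ("mul", a, b), with ints
# and variable names at the leaves.

def cost(expr):
    # Efficiency-metric stand-in: total node count.
    if isinstance(expr, (int, str)):
        return 1
    return 1 + sum(cost(child) for child in expr[1:])

def rewrite(expr):
    # One bottom-up pass of local simplification rules.
    if isinstance(expr, (int, str)):
        return expr
    op, a, b = expr[0], rewrite(expr[1]), rewrite(expr[2])
    if op == "add" and b == 0:
        return a
    if op == "add" and a == 0:
        return b
    if op == "mul" and 0 in (a, b):
        return 0
    if op == "mul" and b == 1:
        return a
    if op == "mul" and a == 1:
        return b
    return (op, a, b)

def improve(expr):
    # Greedy search: keep rewriting while the metric improves.
    while True:
        candidate = rewrite(expr)
        if cost(candidate) >= cost(expr):
            return expr
        expr = candidate

expr = ("add", ("mul", "x", 1), ("mul", "y", 0))   # x*1 + y*0 simplifies to x
```

A real search procedure explores many alternative rewrites rather than applying one greedy pass, but the transform/measure/keep structure is the same.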
arXiv Detail & Related papers (2021-09-14T20:52:55Z) - Latent Execution for Neural Program Synthesis Beyond Domain-Specific Languages [97.58968222942173]
We take the first step to synthesize C programs from input-output examples.
In particular, we propose LaSynth, which learns the latent representation to approximate the execution of partially generated programs.
We show that training on these synthesized programs further improves the prediction performance for both Karel and C program synthesis.
arXiv Detail & Related papers (2021-06-29T02:21:32Z) - Amortized Synthesis of Constrained Configurations Using a Differentiable Surrogate [25.125736560730864]
In design, fabrication, and control problems, we are often faced with the task of synthesis, in which many candidate designs may satisfy a given goal.
This many-to-one map presents challenges to the supervised learning of feed-forward synthesis.
We address both of these problems with a two-stage neural network architecture that we may consider to be an autoencoder.
arXiv Detail & Related papers (2021-06-16T17:59:45Z) - Latent Programmer: Discrete Latent Codes for Program Synthesis [56.37993487589351]
In many sequence learning tasks, such as program synthesis and document summarization, a key problem is searching over a large space of possible output sequences.
We propose to learn representations of the outputs that are specifically meant for search: rich enough to specify the desired output but compact enough to make search more efficient.
We introduce the Latent Programmer, a program synthesis method that first predicts a discrete latent code from input/output examples, and then generates the program in the target language.
arXiv Detail & Related papers (2020-12-01T10:11:35Z) - BUSTLE: Bottom-Up Program Synthesis Through Learning-Guided Exploration [72.88493072196094]
We present a new synthesis approach that leverages learning to guide a bottom-up search over programs.
In particular, we train a model to prioritize compositions of intermediate values during search conditioned on a set of input-output examples.
We show that the combination of learning and bottom-up search is remarkably effective, even with simple supervised learning approaches.
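In miniature, a bottom-up search of this kind grows a bank of values computed from the inputs, deduplicates candidates by their observed outputs, and composes bank entries with the DSL's operations. The sketch below (over a toy integer DSL of my own) replaces the learned prioritizer with a smallest-expression-first heuristic:

```python
import itertools

# Toy integer DSL (illustrative): binary operations over the input x and
# small constants.
OPS = {"add": lambda a, b: a + b, "sub": lambda a, b: a - b, "mul": lambda a, b: a * b}

def bottom_up(examples, constants=(1, 2), rounds=3):
    inputs = tuple(i for i, _ in examples)
    target = tuple(o for _, o in examples)
    # The bank maps each observed value-tuple (one value per example) to one
    # expression that produces it; deduplicating by observational
    # equivalence keeps the search tractable.
    bank = {inputs: "x"}
    for c in constants:
        bank[(c,) * len(examples)] = str(c)
    for _ in range(rounds):
        fresh = {}
        # A learned model would rank candidate compositions here; a
        # smallest-expressions-first heuristic stands in.
        ranked = sorted(bank.items(), key=lambda kv: len(kv[1]))
        for (va, ea), (vb, eb) in itertools.product(ranked, repeat=2):
            for name, fn in OPS.items():
                values = tuple(fn(a, b) for a, b in zip(va, vb))
                expr = f"{name}({ea}, {eb})"
                if values == target:
                    return expr
                if values not in bank and values not in fresh:
                    fresh[values] = expr
        bank.update(fresh)
    return None

examples = [(1, 3), (2, 5), (4, 9)]    # specification for f(x) = 2*x + 1
expr = bottom_up(examples)
```

The value-based deduplication means two syntactically different expressions that agree on all examples occupy one bank slot, which is what keeps the bottom-up enumeration from exploding.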
arXiv Detail & Related papers (2020-07-28T17:46:18Z) - Synthesize, Execute and Debug: Learning to Repair for Neural Program Synthesis [81.54148730967394]
We propose SED, a neural program generation framework that incorporates synthesis, execution, and debug stages.
SED first produces initial programs using the neural program synthesizer component, then utilizes a neural program debugger to iteratively repair the generated programs.
On Karel, a challenging input-output program synthesis benchmark, SED reduces the error rate of the neural program synthesizer itself by a considerable margin, and outperforms the standard beam search for decoding.
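Stripped of its neural components, the synthesize-execute-debug loop has a simple skeleton: draft a program, run it on the examples, and apply repairs while failures remain. The sketch below uses a toy DSL of my own, with random single-token edits standing in for SED's neural debugger:

```python
import random

# Toy DSL (illustrative): a program is a sequence of unary integer ops.
OPS = {"inc": lambda x: x + 1, "dec": lambda x: x - 1, "double": lambda x: 2 * x}

def run(program, x):
    for op in program:
        x = OPS[op](x)
    return x

def failures(program, examples):
    return [(i, o) for i, o in examples if run(program, i) != o]

def sed_loop(examples, length=3, rounds=500, seed=0):
    rng = random.Random(seed)
    ops = list(OPS)
    program = [rng.choice(ops) for _ in range(length)]     # synthesize a draft
    for _ in range(rounds):
        failed = failures(program, examples)               # execute
        if not failed:
            return program
        # debug: propose a one-token repair and keep it if it does not
        # increase the number of failing examples
        candidate = program[:]
        candidate[rng.randrange(length)] = rng.choice(ops)
        if len(failures(candidate, examples)) <= len(failed):
            program = candidate
    return program

examples = [(0, 2), (1, 4), (3, 8)]    # specification for f(x) = 2*x + 2
program = sed_loop(examples)
```

SED's contribution is making the repair step learned and example-aware rather than random, but the execute-and-repair control flow is as above.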
arXiv Detail & Related papers (2020-07-16T04:15:47Z) - Evaluating Sequence-to-Sequence Learning Models for If-Then Program Synthesis [0.0]
If-Then programs are a building block of process automation.
In the consumer space, sites like IFTTT allow users to create automations by defining If-Then programs using a graphical interface.
We find Seq2Seq approaches have high potential (performing strongly on the sequence recipes) and can serve as a promising approach to synthesizing more complex programs.
arXiv Detail & Related papers (2020-02-10T00:45:03Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.