Genetic Algorithms for Searching a Matrix of Metagrammars for Synthesis
- URL: http://arxiv.org/abs/2306.00521v2
- Date: Sun, 4 Jun 2023 11:46:42 GMT
- Title: Genetic Algorithms for Searching a Matrix of Metagrammars for Synthesis
- Authors: Yixuan Li, Federico Mora, Elizabeth Polgreen, Sanjit A. Seshia
- Abstract summary: Syntax-guided synthesis is a paradigm in which the search space of candidate solutions is constrained by a syntactic template in the form of a grammar.
In this work, we frame the space of syntactic templates as a matrix of rules, and demonstrate how this matrix can be searched effectively with little training data.
- Score: 19.044613696320628
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Syntax-guided synthesis is a paradigm in program synthesis in which the
search space of candidate solutions is constrained by a syntactic template in
the form of a grammar. These syntactic constraints serve two purposes:
constraining the language to the space the user desires, and rendering the
search space tractable for the synthesizer. Given a well-written syntactic
template, this is an extremely effective technique. However, this is highly
dependent on the user providing such a template: a syntactic template that is
too large results in a larger search space and slower synthesis, and a
syntactic template that is too small may not contain the solution needed. In
this work, we frame the space of syntactic templates as a matrix of rules, and
demonstrate how this matrix can be searched effectively with little training
data using simple search techniques such as genetic algorithms, giving
improvements in both the number of benchmarks solved and solving time for the
state-of-the-art synthesis solver.
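The idea of searching a matrix of grammar rules with a genetic algorithm can be illustrated with a minimal sketch. Everything here is hypothetical: the rule names, the required-rule set, and the fitness function are stand-ins (the paper's actual fitness would come from running a synthesis solver on benchmarks), but the loop shows the encoding — each candidate template is a bitmask selecting which rules enter the grammar, and evolution balances expressiveness against template size.

```python
import random

# Hypothetical pool of grammar production rules; a candidate template is a
# bitmask over this pool (one row of the "matrix of rules").
RULES = ["ite", "plus", "minus", "and", "or", "not", "bvadd", "bvxor"]

def random_individual():
    """A candidate syntactic template: which rules are included."""
    return [random.randint(0, 1) for _ in RULES]

def fitness(individual):
    # Stand-in objective: reward inclusion of a (hypothetical) core of rules
    # needed by the benchmarks, and penalize template size, mimicking the
    # trade-off between expressiveness and search-space tractability.
    required = {0, 1, 5}  # assumed indices of rules the solutions need
    solved = sum(1 for i in required if individual[i])
    size_penalty = 0.1 * sum(individual)
    return solved - size_penalty

def crossover(a, b):
    point = random.randrange(1, len(a))
    return a[:point] + b[point:]

def mutate(ind, rate=0.1):
    return [bit ^ 1 if random.random() < rate else bit for bit in ind]

def evolve(pop_size=20, generations=30, seed=0):
    random.seed(seed)
    pop = [random_individual() for _ in range(pop_size)]
    for _ in range(generations):
        # Elitist selection: keep the fitter half, breed the rest.
        pop.sort(key=fitness, reverse=True)
        parents = pop[: pop_size // 2]
        children = [
            mutate(crossover(random.choice(parents), random.choice(parents)))
            for _ in range(pop_size - len(parents))
        ]
        pop = parents + children
    return max(pop, key=fitness)

best = evolve()
print(best, fitness(best))
```

In the paper's setting, evaluating `fitness` means invoking the synthesis solver with the candidate template, so the few-training-data claim matters: each evaluation is expensive, and simple operators like the ones above must find good templates in few generations.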
Related papers
- Syntax-Guided Procedural Synthesis of Molecules [26.87587877386068]
Design of synthetically accessible molecules and recommending analogs to unsynthesizable molecules are important problems for accelerating molecular discovery.
We reconceptualize both problems using ideas from program synthesis.
We decouple the syntactic skeleton from the semantics of a synthetic tree to create a bilevel framework for reasoning about the space of synthesis pathways.
arXiv Detail & Related papers (2024-08-24T04:32:36Z)
- Genetic Instruct: Scaling up Synthetic Generation of Coding Instructions for Large Language Models [54.51932175059004]
We introduce a scalable method for generating synthetic instructions to enhance the code generation capability of Large Language Models.
The proposed algorithm, Genetic-Instruct, mimics evolutionary processes, utilizing self-instruction to create numerous synthetic samples from a limited number of seeds.
arXiv Detail & Related papers (2024-07-29T20:42:59Z)
- HYSYNTH: Context-Free LLM Approximation for Guiding Program Synthesis [25.260063704712458]
Large language models (LLMs) often fail to produce fully correct programs in unfamiliar DSLs.
Motivated by these limitations, we introduce a hybrid approach, where LLM completions for a given task are used to learn a task-specific, context-free surrogate model.
We evaluate this hybrid approach on three domains, and show that it outperforms both unguided search and direct sampling from LLMs, as well as existing program synthesizers.
arXiv Detail & Related papers (2024-05-24T18:45:51Z)
- SynthesizRR: Generating Diverse Datasets with Retrieval Augmentation [55.2480439325792]
We study the synthesis of six datasets, covering topic classification, sentiment analysis, tone detection, and humor.
We find that SynthesizRR greatly improves lexical and semantic diversity, similarity to human-written text, and distillation performance.
arXiv Detail & Related papers (2024-05-16T12:22:41Z)
- Guiding Enumerative Program Synthesis with Large Language Models [15.500250058226474]
In this paper, we evaluate the abilities of Large Language Models to solve formal synthesis benchmarks.
When one-shot synthesis fails, we propose a novel enumerative synthesis algorithm.
We find that GPT-3.5 as a stand-alone tool for formal synthesis is easily outperformed by state-of-the-art formal synthesis algorithms.
arXiv Detail & Related papers (2024-03-06T19:13:53Z)
- A Quality-based Syntactic Template Retriever for Syntactically-controlled Paraphrase Generation [67.98367574025797]
Existing syntactically-controlled paraphrase generation models perform promisingly with human-annotated or well-chosen syntactic templates.
The prohibitive cost of annotation makes it infeasible to manually design decent templates for every source sentence.
We propose a novel Quality-based Syntactic Template Retriever (QSTR) to retrieve templates based on the quality of the to-be-generated paraphrases.
arXiv Detail & Related papers (2023-10-20T03:55:39Z)
- Satisfiability and Synthesis Modulo Oracles [7.246701762489972]
Many synthesis algorithms use a white-box oracle based on satisfiability modulo theory (SMT) solvers to provide counterexamples.
We present a framework for solving a general class of oracle-guided synthesis problems which we term synthesis modulo oracles.
We also formalize the problem of satisfiability modulo theories and oracles, and present an algorithm for solving this problem.
arXiv Detail & Related papers (2021-07-28T16:36:26Z)
- Representing Partial Programs with Blended Abstract Semantics [62.20775388513027]
We introduce a technique for representing partially written programs in a program synthesis engine.
We learn an approximate execution model implemented as a modular neural network.
We show that these hybrid neuro-symbolic representations enable execution-guided synthesizers to use more powerful language constructs.
arXiv Detail & Related papers (2020-12-23T20:40:18Z)
- Syntactic representation learning for neural network based TTS with syntactic parse tree traversal [49.05471750563229]
We propose a syntactic representation learning method based on syntactic parse tree to automatically utilize the syntactic structure information.
Experimental results demonstrate the effectiveness of our proposed approach.
For sentences with multiple syntactic parse trees, prosodic differences can be clearly perceived from the synthesized speeches.
arXiv Detail & Related papers (2020-12-13T05:52:07Z)
- Latent Programmer: Discrete Latent Codes for Program Synthesis [56.37993487589351]
In many sequence learning tasks, such as program synthesis and document summarization, a key problem is searching over a large space of possible output sequences.
We propose to learn representations of the outputs that are specifically meant for search: rich enough to specify the desired output but compact enough to make search more efficient.
We introduce the Latent Programmer, a program synthesis method that first predicts a discrete latent code from input/output examples, and then generates the program in the target language.
arXiv Detail & Related papers (2020-12-01T10:11:35Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.