Code Building Genetic Programming
- URL: http://arxiv.org/abs/2008.03649v1
- Date: Sun, 9 Aug 2020 04:33:04 GMT
- Title: Code Building Genetic Programming
- Authors: Edward Pantridge, Lee Spector
- Abstract summary: We introduce Code Building Genetic Programming (CBGP) as a framework for synthesizing programs that use arbitrary data types, data structures, and specifications drawn from existing codebases.
CBGP produces a computational graph that can be executed or translated into source code of a host language.
- Score: 0.0
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: In recent years the field of genetic programming has made significant
advances towards automatic programming. Research and development of
contemporary program synthesis methods, such as PushGP and Grammar Guided
Genetic Programming, can produce programs that solve problems typically
assigned in introductory academic settings. These problems focus on a narrow,
predetermined set of simple data structures, basic control flow patterns, and
primitive, non-overlapping data types (without, for example, inheritance or
composite types). Few, if any, genetic programming methods for program
synthesis have convincingly demonstrated the capability of synthesizing
programs that use arbitrary data types, data structures, and specifications
that are drawn from existing codebases. In this paper, we introduce Code
Building Genetic Programming (CBGP) as a framework within which this can be
done, by leveraging programming language features such as reflection and
first-class specifications. CBGP produces a computational graph that can be
executed or translated into source code of a host language. To demonstrate the
novel capabilities of CBGP, we present results on new benchmarks that use
non-primitive, polymorphic data types as well as some standard program
synthesis benchmarks.
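To make the "computational graph" idea concrete, below is a minimal illustrative sketch, not the authors' implementation: a toy graph whose nodes apply host-language functions (here just reflected Python builtins) and which can either be executed directly or translated into host-language source. The names `Input`, `Apply`, `evaluate`, and `to_source` are hypothetical, chosen only for this example.

```python
# Illustrative sketch only; not the CBGP API from the paper.
from dataclasses import dataclass
from typing import Any, Callable, List


@dataclass
class Input:
    name: str                      # symbolic input of the evolved program


@dataclass
class Apply:
    fn: Callable                   # host-language function (e.g. discovered via reflection)
    args: List[Any]                # child nodes: Input, Apply, or constant leaves


def evaluate(node, env):
    """Execute the graph directly against concrete input values."""
    if isinstance(node, Input):
        return env[node.name]
    if isinstance(node, Apply):
        return node.fn(*(evaluate(a, env) for a in node.args))
    return node                    # constant leaf


def to_source(node):
    """Translate the same graph into host-language (Python) source text."""
    if isinstance(node, Input):
        return node.name
    if isinstance(node, Apply):
        args = ", ".join(to_source(a) for a in node.args)
        return f"{node.fn.__name__}({args})"
    return repr(node)              # constant leaf


# Example: a graph computing len(sorted(xs)) built from host-language builtins.
graph = Apply(len, [Apply(sorted, [Input("xs")])])
print(evaluate(graph, {"xs": [3, 1, 2]}))   # -> 3
print(to_source(graph))                     # -> len(sorted(xs))
```

The point of the sketch is the dual interpretation: the same graph structure supports both direct execution (for fitness evaluation) and translation into source code of the host language.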
Related papers
- Genetic Instruct: Scaling up Synthetic Generation of Coding Instructions for Large Language Models [54.51932175059004]
We introduce a scalable method for generating synthetic instructions to enhance the code generation capability of Large Language Models.
The proposed algorithm, Genetic-Instruct, mimics evolutionary processes, utilizing self-instruction to create numerous synthetic samples from a limited number of seeds.
arXiv Detail & Related papers (2024-07-29T20:42:59Z)
- NoviCode: Generating Programs from Natural Language Utterances by Novices [59.71218039095155]
We present NoviCode, a novel NL Programming task which takes as input an API and a natural language description by a novice non-programmer.
We show that NoviCode is indeed a challenging task in the code synthesis domain, and that generating complex code from non-technical instructions goes beyond the current Text-to-Code paradigm.
arXiv Detail & Related papers (2024-07-15T11:26:03Z)
- CodeGRAG: Bridging the Gap between Natural Language and Programming Language via Graphical Retrieval Augmented Generation [58.84212778960507]
We propose CodeGRAG, a Graphical Retrieval Augmented Code Generation framework to enhance the performance of LLMs.
CodeGRAG builds a graphical view of code blocks from their control flow and data flow to bridge the gap between programming languages and natural language.
Experiments and ablations on four datasets covering both C++ and Python validate the hard meta-graph prompt, the soft prompting technique, and the effectiveness of the objectives for the pretrained GNN expert.
arXiv Detail & Related papers (2024-05-03T02:48:55Z)
- Solving Novel Program Synthesis Problems with Genetic Programming using Parametric Polymorphism [0.0]
We show that Code-building Genetic Programming (CBGP) compiles type-safe programs from linear genomes using stack-based compilation and a formal type system.
CBGP is able to solve problems with these properties, whereas every other GP system we know of has restrictions that prevent it from even considering such problems.
arXiv Detail & Related papers (2023-06-08T00:10:07Z)
- HOTGP -- Higher-Order Typed Genetic Programming [0.0]
HOTGP is a new genetic algorithm that synthesizes pure, typed, and functional programs.
The grammar is based on Haskell's standard base library.
arXiv Detail & Related papers (2023-04-06T16:23:34Z)
- Planning with Large Language Models for Code Generation [100.07232672883897]
Planning-Guided Transformer Decoding (PG-TD) uses a planning algorithm to perform lookahead search and guide the Transformer toward generating better programs.
We empirically evaluate our framework with several large language models as backbones on public coding challenge benchmarks.
arXiv Detail & Related papers (2023-03-09T18:59:47Z)
- Data types as a more ergonomic frontend for Grammar-Guided Genetic Programming [0.0]
We propose to embed the grammar as an internal Domain-Specific Language in the host language of the framework.
This approach has the same expressive power as BNF and EBNF while using the host language's type system.
We also present Meta-Handlers, user-defined overrides of the tree-generation system.
arXiv Detail & Related papers (2022-10-10T16:38:16Z)
- Functional Code Building Genetic Programming [0.0]
Code Building Genetic Programming (CBGP) is a recently introduced GP method for general program synthesis.
We show that a functional programming language and a Hindley-Milner type system can be used to evolve type-safe programs.
arXiv Detail & Related papers (2022-06-09T15:22:33Z)
- A Conversational Paradigm for Program Synthesis [110.94409515865867]
We propose a conversational program synthesis approach via large language models.
We train a family of large language models, called CodeGen, on natural language and programming language data.
Our findings show the emergence of conversational capabilities and the effectiveness of the proposed conversational program synthesis paradigm.
arXiv Detail & Related papers (2022-03-25T06:55:15Z)
- Latent Execution for Neural Program Synthesis Beyond Domain-Specific Languages [97.58968222942173]
We take a first step toward synthesizing C programs from input-output examples.
In particular, we propose LaSynth, which learns a latent representation to approximate the execution of partially generated programs.
We show that training on these synthesized programs further improves the prediction performance for both Karel and C program synthesis.
arXiv Detail & Related papers (2021-06-29T02:21:32Z)
- Type-driven Neural Programming by Example [0.0]
We look into programming by example (PBE), the task of finding a program that maps given inputs to given outputs.
We propose a way to incorporate programming types into a neural program synthesis approach for PBE.
arXiv Detail & Related papers (2020-08-28T12:30:05Z)