AbstractBeam: Enhancing Bottom-Up Program Synthesis using Library Learning
- URL: http://arxiv.org/abs/2405.17514v3
- Date: Thu, 12 Sep 2024 06:11:50 GMT
- Title: AbstractBeam: Enhancing Bottom-Up Program Synthesis using Library Learning
- Authors: Janis Zenkner, Lukas Dierkes, Tobias Sesterhenn, Christian Bartelt
- Abstract summary: AbstractBeam is a novel program synthesis framework designed to enhance LambdaBeam by leveraging Library Learning.
Our experiments demonstrate that AbstractBeam statistically significantly outperforms LambdaBeam in the integer list manipulation domain.
- Score: 0.0
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: LambdaBeam is a state-of-the-art, execution-guided algorithm for program synthesis that utilizes higher-order functions, lambda functions, and iterative loops within a Domain-Specific Language (DSL). LambdaBeam generates each program from scratch but does not take advantage of the frequent recurrence of program blocks or subprograms commonly found in specific domains, such as loops for list traversal. To address this limitation, we introduce AbstractBeam: a novel program synthesis framework designed to enhance LambdaBeam by leveraging Library Learning. AbstractBeam identifies and integrates recurring program structures into the DSL, optimizing the synthesis process. Our experimental evaluations demonstrate that AbstractBeam statistically significantly (p < 0.05) outperforms LambdaBeam in the integer list manipulation domain. Beyond solving more tasks, AbstractBeam's program synthesis is also more efficient, requiring less time and fewer candidate programs to generate a solution. Furthermore, our findings indicate that Library Learning effectively enhances program synthesis in domains that are not explicitly designed to showcase its advantages, thereby highlighting the broader applicability of Library Learning.
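To make the core idea concrete, here is a minimal, self-contained Python sketch of a library-learning loop in the spirit the abstract describes: mine recurring subprograms from solved tasks and promote them to single-step DSL operations. The tuple encoding, helper names, and frequency threshold are illustrative assumptions, not AbstractBeam's actual code.

```python
from collections import Counter

# Toy program representation: nested tuples such as ("map", ("add1",), "xs").
def subtrees(prog):
    """Yield every subprogram (subtree) of a program."""
    yield prog
    if isinstance(prog, tuple):
        for child in prog[1:]:
            yield from subtrees(child)

def learn_abstractions(solutions, dsl_ops, min_count=2):
    """Promote recurring multi-node subprograms to first-class DSL ops."""
    counts = Counter()
    for prog in solutions:
        for sub in subtrees(prog):
            if isinstance(sub, tuple) and len(sub) > 1:  # skip atoms
                counts[sub] += 1
    for sub, count in counts.most_common():
        if count >= min_count and sub not in dsl_ops:
            dsl_ops.append(sub)  # now reachable in a single enumeration step
    return dsl_ops

# Two solved tasks that share a summation loop over a list.
solutions = [
    ("fold", ("lambda", ("add", "acc", "x")), 0, "xs"),
    ("fold", ("lambda", ("add", "acc", "x")), 0, ("map", ("mul2",), "xs")),
]
print(learn_abstractions(solutions, ["fold", "map", "add", "mul2"]))
```

In the full system, the enriched DSL is handed back to LambdaBeam's neural search; the frequency cutoff above merely stands in for the paper's actual abstraction-selection criteria.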
Related papers
- Guiding Enumerative Program Synthesis with Large Language Models [15.500250058226474]
In this paper, we evaluate the abilities of Large Language Models to solve formal synthesis benchmarks.
When one-shot synthesis fails, we propose a novel enumerative synthesis algorithm.
We find that GPT-3.5 as a stand-alone tool for formal synthesis is easily outperformed by state-of-the-art formal synthesis algorithms.
arXiv Detail & Related papers (2024-03-06T19:13:53Z)
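As a rough illustration of the fallback scheme this entry describes, the sketch below tries a one-shot model call first, verifies the guess against the examples, and only then falls back to exhaustive enumeration. The toy DSL of unary integer operations and the ask_llm stub are assumptions for illustration; the paper's algorithm integrates LLM guidance into the enumerative search itself rather than treating it as a separate stage.

```python
from itertools import product

# Toy DSL: a program is a sequence of unary integer operations.
UNARY_OPS = {"inc": lambda x: x + 1, "dec": lambda x: x - 1,
             "dbl": lambda x: x * 2}

def run(prog, x):
    for op in prog:
        x = UNARY_OPS[op](x)
    return x

def satisfies(prog, examples):
    return all(run(prog, i) == o for i, o in examples)

def enumerate_programs(examples, max_len=4):
    """Enumerate op sequences, shortest first, until one fits all examples."""
    for length in range(1, max_len + 1):
        for prog in product(UNARY_OPS, repeat=length):
            if satisfies(prog, examples):
                return list(prog)
    return None

def synthesize(examples, ask_llm=lambda ex: None):
    guess = ask_llm(examples)              # one-shot attempt (stubbed here)
    if guess is not None and satisfies(guess, examples):
        return guess                       # accept only verified guesses
    return enumerate_programs(examples)    # fall back to enumeration

print(synthesize([(1, 4), (2, 6)]))        # -> ['inc', 'dbl']
```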
- LILO: Learning Interpretable Libraries by Compressing and Documenting Code [71.55208585024198]
We introduce LILO, a neurosymbolic framework that iteratively synthesizes, compresses, and documents code.
LILO combines LLM-guided program synthesis with recent algorithmic advances in automated refactoring from Stitch.
We find that AutoDoc, an LLM-based step that names and documents learned abstractions, boosts performance by helping LILO's synthesizer to interpret and deploy them.
arXiv Detail & Related papers (2023-10-30T17:55:02Z)
- LambdaBeam: Neural Program Search with Higher-Order Functions and Lambdas [44.919087759546635]
We design a search algorithm called LambdaBeam that can construct arbitrary functions that compose operations within a given DSL.
We train a neural policy network to choose which lambdas to construct during search, and pass them as arguments to higher-order functions.
arXiv Detail & Related papers (2023-06-03T08:24:53Z)
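A toy rendition of the execution-guided, bottom-up search that this entry builds on: grow a pool of (value, program) pairs by applying DSL operations to values already computed, deduplicating by observed value. The fixed list operations and breadth-first loop below are stand-ins; in LambdaBeam itself a trained policy network decides which lambdas to construct and where to pass them.

```python
def bottom_up_search(inputs, target, max_rounds=3):
    """Grow a pool mapping computed values to the programs producing them."""
    pool = {tuple(inputs): "xs"}
    ops = {
        "reverse": lambda v: list(reversed(v)),
        "sort":    lambda v: sorted(v),
        "double":  lambda v: [2 * x for x in v],  # stand-in for map(lambda x: 2*x)
    }
    for _ in range(max_rounds):
        for value, prog in list(pool.items()):
            for name, fn in ops.items():
                new_val = tuple(fn(list(value)))
                new_prog = f"{name}({prog})"
                if new_val == tuple(target):
                    return new_prog
                pool.setdefault(new_val, new_prog)  # keep first (shorter) program per value
    return None

print(bottom_up_search([3, 1, 2], [2, 4, 6]))   # -> 'double(sort(xs))'
```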
- Top-Down Synthesis for Library Learning [46.285220926554345]
Corpus-guided top-down synthesis is a mechanism for synthesizing library functions that capture common functionality from a corpus of programs.
We present an implementation of the approach in a tool called Stitch and evaluate it against the state-of-the-art deductive library learning algorithm from DreamCoder.
arXiv Detail & Related papers (2022-11-29T21:57:42Z)
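The compression objective behind this style of library learning can be made concrete with a small sketch: score a candidate abstraction by how much it shrinks the whole corpus, not just by how often it occurs. The gain formula below is a simplification that ignores argument rewriting; Stitch's actual utility function and top-down search are more involved.

```python
def size(prog):
    """Number of nodes in a toy tuple-encoded program."""
    if not isinstance(prog, tuple):
        return 1
    return 1 + sum(size(c) for c in prog[1:])

def occurrences(prog, pattern):
    """Count exact matches of `pattern` among all subtrees of `prog`."""
    hits = 1 if prog == pattern else 0
    if isinstance(prog, tuple):
        hits += sum(occurrences(c, pattern) for c in prog[1:])
    return hits

def compression_gain(corpus, pattern):
    """Corpus size saved if every use of `pattern` collapses to one symbol."""
    uses = sum(occurrences(p, pattern) for p in corpus)
    return uses * (size(pattern) - 1)

corpus = [("map", ("add", "x", 1), "xs"),
          ("fold", ("add", "x", 1), 0, ("map", ("add", "x", 1), "ys"))]
candidates = [("add", "x", 1), ("map", ("add", "x", 1), "xs")]
best = max(candidates, key=lambda c: compression_gain(corpus, c))
print(best, compression_gain(corpus, best))   # ('add', 'x', 1) 6
```

Note how the smaller but more frequent pattern wins: compression, not raw size, decides which abstraction enters the library.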
- Summarization Programs: Interpretable Abstractive Summarization with Neural Modular Trees [89.60269205320431]
Current abstractive summarization models either suffer from a lack of clear interpretability or provide incomplete rationales.
We propose the Summarization Program (SP), an interpretable modular framework consisting of an (ordered) list of binary trees.
A Summarization Program contains one root node per summary sentence, and a distinct tree connects each summary sentence to the document sentences.
arXiv Detail & Related papers (2022-09-21T16:50:22Z)
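The data structure this entry describes is easy to picture: one binary tree per summary sentence, document sentences at the leaves, and learned modules at internal nodes. The sketch below stubs the neural modules with trivial string functions; the module names (compress, fuse) follow the paper's framing, but the implementations are placeholders.

```python
from dataclasses import dataclass
from typing import Callable, Optional

def compress(a):
    """Stand-in for a learned compression module: keep the main clause."""
    return a.split(",")[0] + "."

def fuse(a, b):
    """Stand-in for a learned fusion module: join two sentences."""
    return a.rstrip(".") + " and " + b[0].lower() + b[1:]

@dataclass
class Node:
    module: Optional[Callable] = None      # None marks a leaf
    left: Optional["Node"] = None
    right: Optional["Node"] = None
    sentence: str = ""                     # document sentence, on leaves only

    def evaluate(self):
        if self.module is None:
            return self.sentence
        args = [self.left.evaluate()]
        if self.right is not None:
            args.append(self.right.evaluate())
        return self.module(*args)

doc = ["The model was trained on lists, achieving strong results.",
       "It also runs faster than the baseline."]
# One tree per summary sentence; this toy program has a single root.
tree = Node(fuse,
            Node(compress, Node(sentence=doc[0])),
            Node(sentence=doc[1]))
print(tree.evaluate())
# -> "The model was trained on lists and it also runs faster than the baseline."
```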
- Leveraging Language to Learn Program Abstractions and Search Heuristics [66.28391181268645]
We introduce LAPS (Language for Abstraction and Program Search), a technique for using natural language annotations to guide joint learning of libraries and neurally-guided search models for synthesis.
When integrated into a state-of-the-art library learning system (DreamCoder), LAPS produces higher-quality libraries and improves search efficiency and generalization.
arXiv Detail & Related papers (2021-06-18T15:08:47Z)
- Representing Partial Programs with Blended Abstract Semantics [62.20775388513027]
We introduce a technique for representing partially written programs in a program synthesis engine.
We learn an approximate execution model implemented as a modular neural network.
We show that these hybrid neuro-symbolic representations enable execution-guided synthesizers to use more powerful language constructs.
arXiv Detail & Related papers (2020-12-23T20:40:18Z)
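A miniature version of the blending idea: evaluate a partial program concretely wherever all arguments are known, and switch to an approximate (in the paper, learned) representation as soon as a hole is involved. The placeholder tuples below stand in for neural embeddings.

```python
HOLE = "?"  # an unfinished part of a partial program

def approx(op, args):
    """Stand-in for the learned execution module: returns an abstract value."""
    return ("abstract", op)

def blended_eval(expr, env):
    """Concrete execution where possible, approximate semantics at holes."""
    if expr == HOLE:
        return ("abstract", "hole")            # embedding of the hole
    if isinstance(expr, str):
        return env[expr]                       # variable lookup
    op, *args = expr
    vals = [blended_eval(a, env) for a in args]
    if any(isinstance(v, tuple) and v[0] == "abstract" for v in vals):
        return approx(op, vals)                # neural module in the paper
    concrete = {"add": lambda a, b: a + b, "neg": lambda a: -a}
    return concrete[op](*vals)

# The prefix "add x (neg ?)" still yields a usable representation:
print(blended_eval(("add", "x", ("neg", HOLE)), {"x": 3}))  # ('abstract', 'add')
print(blended_eval(("add", "x", ("neg", "x")), {"x": 3}))   # 0
```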
- Latent Programmer: Discrete Latent Codes for Program Synthesis [56.37993487589351]
In many sequence learning tasks, such as program synthesis and document summarization, a key problem is searching over a large space of possible output sequences.
We propose to learn representations of the outputs that are specifically meant for search: rich enough to specify the desired output but compact enough to make search more efficient.
We introduce the Latent Programmer, a program synthesis method that first predicts a discrete latent code from input/output examples, and then generates the program in the target language.
arXiv Detail & Related papers (2020-12-01T10:11:35Z)
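The two-stage decoding can be mimicked in a few lines: a first stage maps I/O examples to a short discrete code (a coarse plan), and a second stage expands code plus examples into a program. Both stages are lookup-table stubs here; in the paper they are learned sequence models, and searching over short codes before full programs is what makes the search more efficient.

```python
def predict_latent_code(examples):
    """Stage 1 (stub): map I/O behavior to a short discrete code."""
    if all(o == sorted(i) for i, o in examples):
        return ("SORT",)
    if all(o == [x * 2 for x in i] for i, o in examples):
        return ("SCALE",)
    return ("UNKNOWN",)

def decode_program(code):
    """Stage 2 (stub): expand the coarse plan into target-language source."""
    templates = {("SORT",): "lambda xs: sorted(xs)",
                 ("SCALE",): "lambda xs: [x * 2 for x in xs]"}
    return templates.get(code)

examples = [([3, 1], [1, 3]), ([2, 0, 5], [0, 2, 5])]
code = predict_latent_code(examples)
print(code, "->", decode_program(code))   # ('SORT',) -> lambda xs: sorted(xs)
```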