LambdaBeam: Neural Program Search with Higher-Order Functions and Lambdas
- URL: http://arxiv.org/abs/2306.02049v2
- Date: Sat, 28 Oct 2023 09:07:27 GMT
- Title: LambdaBeam: Neural Program Search with Higher-Order Functions and Lambdas
- Authors: Kensen Shi, Hanjun Dai, Wen-Ding Li, Kevin Ellis, Charles Sutton
- Abstract summary: We design a search algorithm called LambdaBeam that can construct arbitrary functions that compose operations within a given DSL.
We train a neural policy network to choose which lambdas to construct during search, and pass them as arguments to higher-order functions.
- Score: 44.919087759546635
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Search is an important technique in program synthesis that allows for
adaptive strategies such as focusing on particular search directions based on
execution results. Several prior works have demonstrated that neural models are
effective at guiding program synthesis searches. However, a common drawback of
those approaches is the inability to handle iterative loops, higher-order
functions, or lambda functions, thus limiting prior neural searches from
synthesizing longer and more general programs. We address this gap by designing
a search algorithm called LambdaBeam that can construct arbitrary lambda
functions that compose operations within a given DSL. We create semantic vector
representations of the execution behavior of the lambda functions and train a
neural policy network to choose which lambdas to construct during search, and
pass them as arguments to higher-order functions to perform looping
computations. Our experiments show that LambdaBeam outperforms neural,
symbolic, and LLM-based techniques in an integer list manipulation domain.
Related papers
- AbstractBeam: Enhancing Bottom-Up Program Synthesis using Library Learning [0.0]
AbstractBeam is a novel program synthesis framework designed to enhance LambdaBeam by leveraging Library Learning.
Our experiments demonstrate that AbstractBeam statistically significantly outperforms LambdaBeam in the integer list manipulation domain.
arXiv Detail & Related papers (2024-05-27T08:31:12Z)
- Self-Attention Based Semantic Decomposition in Vector Symbolic Architectures [6.473177443214531]
We introduce a new variant of the resonator network based on self-attention update rules for iterative search problems.
Our algorithm provides a larger capacity for associative memory, enabling applications in tasks like perception-based pattern recognition, scene decomposition, and object reasoning.
arXiv Detail & Related papers (2024-03-20T00:37:19Z)
- Use Your INSTINCT: INSTruction optimization for LLMs usIng Neural bandits Coupled with Transformers [66.823588073584]
Large language models (LLMs) have shown remarkable instruction-following capabilities and achieved impressive performance in various applications.
Recent work has used the query-efficient Bayesian optimization (BO) algorithm to automatically optimize the instructions given to black-box LLMs.
We propose a neural bandit algorithm that replaces the Gaussian process (GP) in BO with a neural network surrogate to optimize instructions for black-box LLMs.
arXiv Detail & Related papers (2023-10-02T02:01:16Z)
- Parameter-free version of Adaptive Gradient Methods for Strongly-Convex Functions [0.0]
In this paper, we adapt a universal algorithm along the lines of MetaGrad to remove the dependence on the parameters lambda and eta.
The main idea is to run multiple experts concurrently and combine their predictions with a master algorithm.
arXiv Detail & Related papers (2023-06-11T07:46:22Z)
- Efficient Non-Parametric Optimizer Search for Diverse Tasks [93.64739408827604]
We present the first efficient, scalable, and general framework that can directly search on the tasks of interest.
Inspired by the innate tree structure of the underlying math expressions, we re-arrange the search spaces into a super-tree.
We adopt an adaptation of the Monte Carlo method to tree search, equipped with rejection sampling and equivalent-form detection.
arXiv Detail & Related papers (2022-09-27T17:51:31Z)
- Efficient Neural Network Analysis with Sum-of-Infeasibilities [64.31536828511021]
Inspired by sum-of-infeasibilities methods in convex optimization, we propose a novel procedure for analyzing verification queries on networks with extensive branching functions.
An extension to a canonical case-analysis-based complete search procedure can be achieved by replacing the convex procedure executed at each search state with DeepSoI.
arXiv Detail & Related papers (2022-03-19T15:05:09Z)
- Bayesian Optimization over Permutation Spaces [30.650753803587794]
We propose and evaluate two algorithms for BO over permutation spaces (BOPS).
We theoretically analyze the performance of BOPS-T to show that its regret grows sub-linearly.
Our experiments on multiple synthetic and real-world benchmarks show that both BOPS-T and BOPS-H perform better than the state-of-the-art BO algorithm for combinatorial spaces.
arXiv Detail & Related papers (2021-12-02T08:20:50Z)
- Representing Partial Programs with Blended Abstract Semantics [62.20775388513027]
We introduce a technique for representing partially written programs in a program synthesis engine.
We learn an approximate execution model implemented as a modular neural network.
We show that these hybrid neuro-symbolic representations enable execution-guided synthesizers to use more powerful language constructs.
arXiv Detail & Related papers (2020-12-23T20:40:18Z)
- Differentiable Top-k Operator with Optimal Transport [135.36099648554054]
The SOFT top-k operator approximates the output of the top-k operation as the solution of an Entropic Optimal Transport (EOT) problem.
We apply the proposed operator to the k-nearest neighbors and beam search algorithms, and demonstrate improved performance.
arXiv Detail & Related papers (2020-02-16T04:57:52Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site makes no guarantees about the quality of the information it provides and accepts no responsibility for any consequences of its use.