Searching for More Efficient Dynamic Programs
- URL: http://arxiv.org/abs/2109.06966v1
- Date: Tue, 14 Sep 2021 20:52:55 GMT
- Title: Searching for More Efficient Dynamic Programs
- Authors: Tim Vieira and Ryan Cotterell and Jason Eisner
- Abstract summary: We describe a set of program transformations, a simple metric for assessing the efficiency of a transformed program, and a search procedure to improve this metric.
We show that in practice, automated search can find substantial improvements to the initial program.
- Score: 61.79535031840558
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Computational models of human language often involve combinatorial problems.
For instance, a probabilistic parser may marginalize over exponentially many
trees to make predictions. Algorithms for such problems often employ dynamic
programming and are not always unique. Finding one with optimal asymptotic
runtime can be unintuitive, time-consuming, and error-prone. Our work aims to
automate this laborious process. Given an initial correct declarative program,
we search for a sequence of semantics-preserving transformations to improve its
running time as much as possible. To this end, we describe a set of program
transformations, a simple metric for assessing the efficiency of a transformed
program, and a heuristic search procedure to improve this metric. We show that
in practice, automated search -- like the mental search performed by human
programmers -- can find substantial improvements to the initial program.
Empirically, we show that many common speed-ups described in the NLP literature
could have been discovered automatically by our system.
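To make the idea concrete, here is a minimal, self-contained sketch of the general approach (not the authors' Dyna-based system; the rule encoding, the cost metric, and all names such as `fold`, `degree`, and `greedy_search` are illustrative assumptions). It models each rule of a declarative dynamic program as a head plus a list of subgoals, scores a program by the largest number of distinct variables in any rule (a crude proxy for the degree of its polynomial runtime), and greedily applies a single "fold" transformation that introduces an intermediate item so some variables are summed out early.

```python
# Minimal sketch, NOT the authors' system: illustrative names and a toy cost
# metric only. A rule is (head_name, head_vars, body_subgoals), where each
# subgoal is the frozenset of variables it mentions. The cost of a program is
# the largest number of distinct variables in any rule -- a rough proxy for
# the degree of the polynomial runtime of naive bottom-up evaluation.
from itertools import combinations
from typing import FrozenSet, Iterator, List, Tuple

Subgoal = FrozenSet[str]
Rule = Tuple[str, Subgoal, List[Subgoal]]  # (name, head variables, body)


def degree(rule: Rule) -> int:
    """Distinct variables mentioned by a rule (head plus body)."""
    _, head, body = rule
    return len(head.union(*body))


def cost(program: List[Rule]) -> int:
    """Worst rule degree: the quantity the search tries to drive down."""
    return max(degree(r) for r in program)


def folds(rule: Rule) -> Iterator[List[Rule]]:
    """Enumerate 'fold' transformations: replace a strict subset of subgoals
    by an intermediate item that keeps only the variables shared with the
    rest of the rule or its head, so the other variables are summed out early."""
    name, head, body = rule
    for k in range(2, len(body)):
        for group in combinations(range(len(body)), k):
            inside = [body[i] for i in group]
            outside = [body[i] for i in range(len(body)) if i not in group]
            inside_vars = frozenset().union(*inside)
            keep = inside_vars & frozenset().union(head, *outside)
            tmp = f"tmp_{name}_" + "".join(sorted(keep))
            yield [(tmp, keep, inside), (name, head, outside + [keep])]


def greedy_search(program: List[Rule]) -> List[Rule]:
    """Repeatedly apply the fold that most reduces the cost metric."""
    while True:
        best, best_cost = program, cost(program)
        for i, rule in enumerate(program):
            for replacement in folds(rule):
                candidate = program[:i] + replacement + program[i + 1:]
                if cost(candidate) < best_cost:
                    best, best_cost = candidate, cost(candidate)
        if best is program:  # no fold improved the metric; stop
            return program
        program = best


if __name__ == "__main__":
    # CKY-like rule: goal(I,K) += left(I,J,X) * right(J,K,Y) * w(X,Y)
    # mentions 5 variables; folding two subgoals into an intermediate item
    # sums out one variable early and lowers the worst rule to 4 variables.
    cky: List[Rule] = [("goal", frozenset("IK"),
                        [frozenset("IJX"), frozenset("JKY"), frozenset("XY")])]
    improved = greedy_search(cky)
    print("cost before:", cost(cky), "after:", cost(improved))
    for name, head, body in improved:
        print(name, sorted(head), [sorted(s) for s in body])
```

On this toy CKY-like rule the fold drops the worst-case degree from 5 to 4, the flavor of speed-up the paper's system is meant to discover automatically; the real system works over a richer set of transformations and uses heuristic search rather than pure greedy improvement.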
Related papers
- Searching Latent Program Spaces [0.0]
We propose an algorithm for program induction that learns a distribution over latent programs in a continuous space, enabling efficient search and test-time adaptation.
We show that it can generalize beyond its training distribution and adapt to unseen tasks by utilizing test-time adaptation mechanisms.
arXiv Detail & Related papers (2024-11-13T15:50:32Z)
- Efficient Bottom-Up Synthesis for Programs with Local Variables [7.387053183440393]
Our algorithm can efficiently search programs with local variables.
Lifted interpretation provides a mechanism to enumerate all binding contexts for local variables.
Our ideas are instantiated in the domain of web automation.
arXiv Detail & Related papers (2023-11-07T04:02:52Z)
- Improved Tree Search for Automatic Program Synthesis [91.3755431537592]
A key element is being able to perform an efficient search in the space of valid programs.
Here, we suggest a variant of MCTS that leads to state of the art results on two vastly different DSLs.
arXiv Detail & Related papers (2023-03-13T15:09:52Z)
- Learning to compile smartly for program size reduction [35.58682369218936]
We propose a novel approach that learns a policy to select passes for program size reduction.
Our approach uses a search mechanism that helps identify useful pass sequences and a GNN with customized attention that selects the optimal sequence to use.
arXiv Detail & Related papers (2023-01-09T16:42:35Z)
- Efficient Non-Parametric Optimizer Search for Diverse Tasks [93.64739408827604]
We present the first efficient, scalable, and general framework that can directly search on the tasks of interest.
Inspired by the innate tree structure of the underlying math expressions, we re-arrange the spaces into a super-tree.
We adopt an adaptation of the Monte Carlo method to tree search, equipped with rejection sampling and equivalent-form detection.
arXiv Detail & Related papers (2022-09-27T17:51:31Z)
- Recent Developments in Program Synthesis with Evolutionary Algorithms [1.8047694351309207]
We identify the relevant evolutionary program synthesis approaches and provide an in-depth analysis of their performance.
The most influential approaches we identify are stack-based, grammar-guided, and linear genetic programming.
For future work, we encourage researchers to assess the quality of a solution not only by a program's output but also by the way the solution is reached.
arXiv Detail & Related papers (2021-08-27T11:38:27Z)
- Towards Optimally Efficient Tree Search with Deep Learning [76.64632985696237]
This paper investigates the classical integer least-squares problem, which estimates integer signals from linear models.
The problem is NP-hard and often arises in diverse applications such as signal processing, bioinformatics, communications and machine learning.
We propose a general hyper-accelerated tree search (HATS) algorithm by employing a deep neural network to estimate the optimal heuristic for the underlying simplified memory-bounded A* algorithm.
arXiv Detail & Related papers (2021-01-07T08:00:02Z)
- Latent Programmer: Discrete Latent Codes for Program Synthesis [56.37993487589351]
In many sequence learning tasks, such as program synthesis and document summarization, a key problem is searching over a large space of possible output sequences.
We propose to learn representations of the outputs that are specifically meant for search: rich enough to specify the desired output but compact enough to make search more efficient.
We introduce the Latent Programmer, a program synthesis method that first predicts a discrete latent code from input/output examples, and then generates the program in the target language.
arXiv Detail & Related papers (2020-12-01T10:11:35Z)
- Learning Differentiable Programs with Admissible Neural Heuristics [43.54820901841979]
We study the problem of learning differentiable functions expressed as programs in a domain-specific language.
We frame this optimization problem as a search in a weighted graph whose paths encode top-down derivations of program syntax.
Our key innovation is to view various classes of neural networks as continuous relaxations over the space of programs.
arXiv Detail & Related papers (2020-07-23T16:07:39Z)
- Strong Generalization and Efficiency in Neural Programs [69.18742158883869]
We study the problem of learning efficient algorithms that strongly generalize in the framework of neural program induction.
By carefully designing the input/output interfaces of the neural model and through imitation, we are able to learn models that produce correct results for arbitrary input sizes.
arXiv Detail & Related papers (2020-07-07T17:03:02Z)