Geometry of Program Synthesis
- URL: http://arxiv.org/abs/2103.16080v1
- Date: Tue, 30 Mar 2021 05:14:15 GMT
- Title: Geometry of Program Synthesis
- Authors: James Clift, Daniel Murfet, James Wallbridge
- Abstract summary: We re-evaluate universal computation based on the synthesis of Turing machines.
This leads to a view of programs as singularities of analytic varieties or, equivalently, as phases of the Bayesian posterior of a synthesis problem.
- Score: 4.83420384410068
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: We re-evaluate universal computation based on the synthesis of Turing
machines. This leads to a view of programs as singularities of analytic
varieties or, equivalently, as phases of the Bayesian posterior of a synthesis
problem. This new point of view reveals unexplored directions of research in
program synthesis, of which neural networks are a subset, for example in
relation to phase transitions, complexity and generalisation. We also lay the
empirical foundations for these new directions by reporting on our
implementation in code of some simple experiments.
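The abstract's identification of programs with singularities and with posterior phases can be made concrete with a short sketch in the standard notation of singular learning theory; the symbols below ($W$, $\varphi$, $K$, $Z_n$) are assumptions made here for exposition, not quotations from the paper. Given input-output examples $D_n = \{(x_i, y_i)\}_{i=1}^{n}$ and a parameter $w \in W$ encoding a (smoothly relaxed) Turing machine, the Bayesian posterior of the synthesis problem is
\[
p(w \mid D_n) \;=\; \frac{1}{Z_n}\, \varphi(w) \prod_{i=1}^{n} p(y_i \mid x_i, w),
\qquad
Z_n \;=\; \int_{W} \varphi(w) \prod_{i=1}^{n} p(y_i \mid x_i, w)\, dw,
\]
where $\varphi$ is a prior on $W$. Writing $K(w)$ for the Kullback-Leibler divergence between the true input-output behaviour and the machine parametrised by $w$, the solution set
\[
W_0 \;=\; \{\, w \in W : K(w) = 0 \,\}
\]
is, when $K$ is analytic, an analytic variety: programs correspond to its singular points, and as $n \to \infty$ the posterior concentrates near $W_0$, with distinct phases associated to distinct singularities.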
Related papers
- Re-evaluating Retrosynthesis Algorithms with Syntheseus [13.384695742156152]
We present a synthesis planning library with an extensive benchmarking framework, called syntheseus.
We demonstrate the capabilities of syntheseus by re-evaluating several previous retrosynthesis algorithms.
We end with guidance for future work in this area, and call on the community to engage in the discussion of how to improve benchmarks for synthesis planning.
arXiv Detail & Related papers (2023-10-30T17:59:04Z)
- ExeDec: Execution Decomposition for Compositional Generalization in Neural Program Synthesis [54.18659323181771]
We characterize several different forms of compositional generalization that are desirable in program synthesis.
We propose ExeDec, a novel decomposition-based strategy that predicts execution subgoals to solve problems step-by-step informed by program execution at each step.
arXiv Detail & Related papers (2023-07-26T01:07:52Z)
- A Divide-Align-Conquer Strategy for Program Synthesis [8.595181704811889]
We show that compositional segmentation can be applied in the programming-by-example setting to divide the search for large programs across multiple smaller program synthesis problems.
A structural alignment of the constituent parts of the input and output leads to pairwise correspondences used to guide the program search (an illustrative sketch of this divide-align-conquer idea appears after this list).
arXiv Detail & Related papers (2023-01-08T19:10:55Z)
- FusionRetro: Molecule Representation Fusion via In-Context Learning for Retrosynthetic Planning [58.47265392465442]
Retrosynthetic planning aims to devise a complete multi-step synthetic route from starting materials to a target molecule.
Current strategies use a decoupled approach of single-step retrosynthesis models and search algorithms.
We propose a novel framework that utilizes context information for improved retrosynthetic planning.
arXiv Detail & Related papers (2022-09-30T08:44:58Z)
- Compositional Generalization and Decomposition in Neural Program Synthesis [59.356261137313275]
In this paper, we focus on measuring the ability of learned program synthesizers to compositionally generalize.
We first characterize several different axes along which program synthesis methods are desired to generalize.
We introduce a benchmark suite of tasks to assess these abilities based on two popular existing datasets.
arXiv Detail & Related papers (2022-04-07T22:16:05Z)
- Scaling Neural Program Synthesis with Distribution-based Search [7.137293485620867]
We introduce two new search algorithms: Heap Search and SQRT Sampling.
We show how they integrate with probabilistic and neural techniques, and demonstrate how they can operate at scale across parallel compute environments.
arXiv Detail & Related papers (2021-10-24T16:46:01Z)
- Latent Execution for Neural Program Synthesis Beyond Domain-Specific Languages [97.58968222942173]
We take the first step to synthesize C programs from input-output examples.
In particular, we propose LaSynth, which learns a latent representation to approximate the execution of partially generated programs.
We show that training on these synthesized programs further improves the prediction performance for both Karel and C program synthesis.
arXiv Detail & Related papers (2021-06-29T02:21:32Z)
- Toward Neural-Network-Guided Program Synthesis and Verification [26.706421573322952]
We propose a novel framework for program and invariant synthesis called neural-network-guided synthesis.
We first show that, by designing and training neural networks, we can extract logical formulas over integers from the weights and biases of the trained neural networks.
Based on this idea, we have implemented a tool to synthesize formulas from positive/negative examples and implication constraints.
arXiv Detail & Related papers (2021-03-17T03:09:05Z)
- RetroXpert: Decompose Retrosynthesis Prediction like a Chemist [60.463900712314754]
We devise a novel template-free algorithm for automatic retrosynthetic expansion.
Our method disassembles retrosynthesis into two steps.
While outperforming the state-of-the-art baselines, our model also provides chemically reasonable interpretation.
arXiv Detail & Related papers (2020-11-04T04:35:34Z)
- Creating Synthetic Datasets via Evolution for Neural Program Synthesis [77.34726150561087]
We show that some program synthesis approaches generalize poorly to data distributions different from that of the randomly generated examples.
We propose a new, adversarial approach to control the bias of synthetic data distributions and show that it outperforms current approaches.
arXiv Detail & Related papers (2020-03-23T18:34:15Z)
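As referenced in the Divide-Align-Conquer entry above, here is a minimal, hypothetical sketch of the divide-align-conquer idea in a programming-by-example setting. It is not that paper's algorithm: the toy DSL, the delimiter-based segmentation, and the position-based alignment below are illustrative assumptions only.

```python
# Illustrative sketch only: divide input/output examples into segments,
# align segments by position, and solve each aligned sub-problem against a
# tiny DSL. The DSL, segmentation, and alignment are simplifying assumptions.

from typing import Callable, Dict, List, Optional, Tuple

# Toy DSL of per-segment string transformations.
DSL: Dict[str, Callable[[str], str]] = {
    "identity": lambda s: s,
    "upper": lambda s: s.upper(),
    "lower": lambda s: s.lower(),
    "reverse": lambda s: s[::-1],
    "first_char": lambda s: s[:1],
}

def segment(s: str, delimiter: str = " ") -> List[str]:
    """Divide: split a string into constituent parts."""
    return s.split(delimiter)

def align(inp: str, out: str) -> Optional[List[Tuple[str, str]]]:
    """Align: pair up input and output segments (here simply by position)."""
    xs, ys = segment(inp), segment(out)
    if len(xs) != len(ys):
        return None  # no structural alignment under this naive scheme
    return list(zip(xs, ys))

def solve_segment(pairs: List[Tuple[str, str]]) -> Optional[str]:
    """Conquer: search the DSL for one primitive consistent with all pairs."""
    for name, fn in DSL.items():
        if all(fn(x) == y for x, y in pairs):
            return name
    return None

def synthesize(examples: List[Tuple[str, str]]) -> Optional[List[str]]:
    """Compose per-segment programs found for each aligned sub-problem."""
    aligned = [align(i, o) for i, o in examples]
    if not aligned or any(a is None for a in aligned):
        return None
    n_parts = len(aligned[0])
    if any(len(a) != n_parts for a in aligned):
        return None
    program = []
    for k in range(n_parts):
        sub = solve_segment([a[k] for a in aligned])
        if sub is None:
            return None  # a real system would fall back to monolithic search
        program.append(sub)
    return program

if __name__ == "__main__":
    examples = [("john smith", "JOHN s"), ("ada lovelace", "ADA l")]
    print(synthesize(examples))  # ['upper', 'first_char']
```

Running the script prints ['upper', 'first_char'], i.e. a per-segment program found by solving each aligned sub-problem independently; a realistic system would use richer segmentation and alignment and revert to searching for a single monolithic program when decomposition fails.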
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences of its use.