Toward Neural-Network-Guided Program Synthesis and Verification
- URL: http://arxiv.org/abs/2103.09414v1
- Date: Wed, 17 Mar 2021 03:09:05 GMT
- Title: Toward Neural-Network-Guided Program Synthesis and Verification
- Authors: Naoki Kobayashi, Taro Sekiyama, Issei Sato and Hiroshi Unno
- Abstract summary: We propose a novel framework of program and invariant synthesis called neural network-guided synthesis.
We first show that, by designing and training neural networks, we can extract logical formulas over integers from the weights and biases of the trained neural networks.
Based on this idea, we have implemented a tool to synthesize formulas from positive/negative examples and implication constraints.
- Score: 26.706421573322952
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: We propose a novel framework of program and invariant synthesis called neural
network-guided synthesis. We first show that, by suitably designing and
training neural networks, we can extract logical formulas over integers from
the weights and biases of the trained neural networks. Based on this idea, we
have implemented a tool to synthesize formulas from positive/negative examples
and implication constraints, and obtained promising experimental results. We
also discuss two applications of our synthesis method. One is the use of our
tool for qualifier discovery in the framework of ICE-learning-based CHC
solving, which can in turn be applied to program verification and inductive
invariant synthesis. Another application is to a new program development
framework called oracle-based programming, which is a neural-network-guided
variation of Solar-Lezama's program synthesis by sketching.
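As a hedged illustration of the weight-extraction idea, the sketch below fits a linear separator to positive/negative integer points and rounds its weight and bias into a candidate formula; the perceptron loop, the scaling factor, and the function names are illustrative assumptions, not the authors' tool.

```python
# Minimal sketch (not the paper's tool): learn a separating hyperplane from
# positive/negative integer examples, then round its weight and bias into a
# candidate logical formula over integers.
import numpy as np

def fit_linear_separator(pos, neg, lr=0.1, epochs=100):
    """Perceptron-style training of w, b with w.x + b > 0 on pos, < 0 on neg."""
    X = np.array(pos + neg, dtype=float)
    y = np.array([1.0] * len(pos) + [-1.0] * len(neg))
    w, b = np.zeros(X.shape[1]), 0.0
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            if yi * (w @ xi + b) <= 0:      # misclassified: nudge the hyperplane
                w, b = w + lr * yi * xi, b + lr * yi
    return w, b

def extract_formula(w, b, scale=10):
    """Round the hyperplane to integer coefficients, yielding a qualifier."""
    coeffs = np.round(w * scale).astype(int)
    const = int(round(b * scale))
    terms = " + ".join(f"{c}*x{i}" for i, c in enumerate(coeffs) if c != 0)
    return f"{terms} + {const} >= 0"

# Positive points satisfy x0 >= x1; negative points do not.
pos = [(3, 1), (5, 2), (2, 2), (7, 0)]
neg = [(0, 4), (1, 3), (2, 6)]
print(extract_formula(*fit_linear_separator(pos, neg)))  # 7*x0 + -3*x1 + 1 >= 0
```

A real qualifier-discovery pass would, per the abstract, also have to respect implication constraints, which this toy ignores.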
Related papers
- Mechanistic Neural Networks for Scientific Machine Learning [58.99592521721158]
We present Mechanistic Neural Networks, a neural network design for machine learning applications in the sciences.
It incorporates a new Mechanistic Block in standard architectures to explicitly learn governing differential equations as representations.
Central to our approach is a novel Relaxed Linear Programming solver (NeuRLP) inspired by a technique that reduces solving linear ODEs to solving linear programs.
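As a hypothetical aside on the reduction mentioned above: a linear ODE discretized by forward Euler becomes a set of linear equality constraints, i.e. a (here trivial, feasibility-only) linear program. The actual NeuRLP solver is a relaxed, differentiable variant, so the snippet below is only a caricature.

```python
# Hypothetical caricature of "linear ODE -> linear program": encode the
# forward-Euler recurrence for dy/dt = a*y as equality constraints and hand
# them to an off-the-shelf LP solver as a feasibility problem.
import numpy as np
from scipy.optimize import linprog

a, h, n = -0.5, 0.1, 20                      # rate, step size, number of steps
A_eq = np.zeros((n + 1, n + 1))              # variables y_0 .. y_n
b_eq = np.zeros(n + 1)
A_eq[0, 0], b_eq[0] = 1.0, 1.0               # initial condition y_0 = 1
for k in range(n):                           # y_{k+1} - (1 + h*a) * y_k = 0
    A_eq[k + 1, k + 1] = 1.0
    A_eq[k + 1, k] = -(1.0 + h * a)
res = linprog(c=np.zeros(n + 1), A_eq=A_eq, b_eq=b_eq,
              bounds=[(None, None)] * (n + 1))
print(res.x[:4])                             # [1.0, 0.95, 0.9025, 0.857375]
```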
arXiv Detail & Related papers (2024-02-20T15:23:24Z)
- Permutation Equivariant Neural Functionals [92.0667671999604]
This work studies the design of neural networks that can process the weights or gradients of other neural networks.
We focus on the permutation symmetries that arise in the weights of deep feedforward networks because hidden layer neurons have no inherent order.
In our experiments, we find that permutation equivariant neural functionals are effective on a diverse set of tasks.
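The weight-space symmetry that motivates this line of work can be checked numerically in a few lines; the snippet below is an illustrative check only (the paper's contribution, equivariant architectures over weights, is not reproduced here).

```python
# Small numerical check of the symmetry above: permuting the hidden units of a
# two-layer MLP, with weights rearranged accordingly, leaves its function unchanged.
import numpy as np

rng = np.random.default_rng(0)
W1, b1 = rng.normal(size=(16, 4)), rng.normal(size=16)   # input -> hidden
W2, b2 = rng.normal(size=(3, 16)), rng.normal(size=3)    # hidden -> output

def mlp(x, W1, b1, W2, b2):
    h = np.maximum(W1 @ x + b1, 0.0)   # ReLU hidden layer
    return W2 @ h + b2

perm = rng.permutation(16)             # reorder the hidden neurons
x = rng.normal(size=4)
original = mlp(x, W1, b1, W2, b2)
permuted = mlp(x, W1[perm], b1[perm], W2[:, perm], b2)
print(np.allclose(original, permuted))  # True: the permutation is a symmetry
```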
arXiv Detail & Related papers (2023-02-27T18:52:38Z)
- The Predictive Forward-Forward Algorithm [79.07468367923619]
We propose the predictive forward-forward (PFF) algorithm for conducting credit assignment in neural systems.
We design a novel, dynamic recurrent neural system that learns a directed generative circuit jointly and simultaneously with a representation circuit.
PFF efficiently learns to propagate learning signals and updates synapses with forward passes only.
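For orientation, here is a hedged sketch of a single layer-local "goodness" update in the style of the forward-forward scheme that PFF extends; the predictive/generative circuit that distinguishes PFF is omitted, and every hyperparameter below is invented.

```python
# Sketch of a forward-only, layer-local update: raise the layer's "goodness"
# (sum of squared activities) above a threshold for positive data and push it
# below the threshold for negative data, with no backward pass across layers.
import numpy as np

rng = np.random.default_rng(1)
W = rng.normal(scale=0.1, size=(8, 4))   # one layer: 4 inputs -> 8 hidden units
theta, lr = 2.0, 0.05                    # goodness threshold, learning rate

def ff_step(x, positive):
    global W
    h = np.maximum(W @ x, 0.0)           # forward pass only
    g = np.sum(h ** 2)                   # layer "goodness"
    sign = 1.0 if positive else -1.0
    # gradient of softplus(sign * (theta - g)) with respect to g
    dL_dg = -sign / (1.0 + np.exp(-sign * (theta - g)))
    dL_dh = dL_dg * 2.0 * h              # chain through g = sum(h^2)
    dL_dW = np.outer(dL_dh * (h > 0), x) # chain through the ReLU, locally
    W -= lr * dL_dW
    return g

for _ in range(200):
    ff_step(np.array([1.0, 0.5, -0.5, 1.0]), positive=True)   # goodness up
    ff_step(np.array([-1.0, 1.0, 0.5, -0.5]), positive=False) # goodness down
```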
arXiv Detail & Related papers (2023-01-04T05:34:48Z)
- Latent Execution for Neural Program Synthesis Beyond Domain-Specific Languages [97.58968222942173]
We take the first step to synthesize C programs from input-output examples.
In particular, we propose LaSynth, which learns a latent representation to approximate the execution of partially generated programs.
We show that training on these synthesized programs further improves the prediction performance for both Karel and C program synthesis.
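The role of (latent) execution can be made concrete with a toy interpreter: the intermediate states of a partially generated program carry signal about how close it is to the goal. The snippet is purely illustrative; LaSynth learns latent approximations of such states rather than computing them concretely.

```python
# Illustrative toy (not LaSynth itself): executing a *partial* straight-line
# program yields intermediate states that can guide the choice of the next
# instruction during synthesis.
def run_partial(program, x):
    """Return the intermediate value after each instruction of the prefix."""
    ops = {"inc": lambda v: v + 1, "double": lambda v: v * 2, "neg": lambda v: -v}
    states = []
    for op in program:
        x = ops[op](x)
        states.append(x)
    return states

# The trace of a half-finished candidate already tells a synthesizer how far
# it is from the target output.
print(run_partial(["inc", "double"], 3))   # [4, 8]: partial trace toward a target
```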
arXiv Detail & Related papers (2021-06-29T02:21:32Z)
- Geometry of Program Synthesis [4.83420384410068]
We re-evaluate universal computation based on the synthesis of Turing machines.
This leads to a view of programs as singularities of analytic varieties or, equivalently, as phases of the Bayesian posterior of a synthesis problem.
arXiv Detail & Related papers (2021-03-30T05:14:15Z)
- Representing Partial Programs with Blended Abstract Semantics [62.20775388513027]
We introduce a technique for representing partially written programs in a program synthesis engine.
We learn an approximate execution model implemented as a modular neural network.
We show that these hybrid neuro-symbolic representations enable execution-guided synthesizers to use more powerful language constructs.
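A purely symbolic caricature of "blending": evaluate concretely wherever subexpressions are known and propagate an abstract placeholder through holes. In the paper the placeholder role is played by learned neural representations, so everything here is an invented simplification.

```python
# Toy blended evaluation: concrete execution where possible, an abstract
# UNKNOWN value where a subprogram is still unwritten (a "hole").
UNKNOWN = object()   # stands in for an unfinished subprogram

def blended_eval(expr):
    """expr is a nested tuple like ("add", 1, ("mul", 2, UNKNOWN))."""
    if expr is UNKNOWN:
        return UNKNOWN
    if isinstance(expr, int):
        return expr
    op, lhs, rhs = expr
    a, b = blended_eval(lhs), blended_eval(rhs)
    if a is UNKNOWN or b is UNKNOWN:
        return UNKNOWN                      # abstract result propagates upward
    return a + b if op == "add" else a * b  # concrete execution where possible

print(blended_eval(("add", 1, ("mul", 2, 3))))                   # 7: fully concrete
print(blended_eval(("add", 1, ("mul", 2, UNKNOWN))) is UNKNOWN)  # True: hole propagates
```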
arXiv Detail & Related papers (2020-12-23T20:40:18Z)
- Optimal Neural Program Synthesis from Multimodal Specifications [45.35689345004124]
Multimodal program synthesis is an attractive way to scale program synthesis to challenging settings.
This paper proposes an optimal neural synthesis approach where the goal is to find a program that satisfies user-provided constraints.
arXiv Detail & Related papers (2020-10-04T20:51:21Z)
- Type-driven Neural Programming by Example [0.0]
We look into programming by example (PBE), the task of finding a program that maps given inputs to given outputs.
We propose a way to incorporate programming types into a neural program synthesis approach for PBE.
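A minimal, invented example of type-directed PBE in the spirit of this entry: enumerate pipelines of typed primitives, prune compositions whose types do not chain, and return the first pipeline consistent with all examples. (The paper integrates types into a neural synthesizer; nothing neural appears below.)

```python
# Tiny typed PBE enumerator; the primitive set and type tags are invented.
from itertools import product

# Each primitive: (name, input type, output type, implementation).
PRIMS = [
    ("inc",    int, int, lambda v: v + 1),
    ("double", int, int, lambda v: v * 2),
    ("show",   int, str, lambda v: str(v)),
]

def run(pipeline, v):
    for _, _, _, f in pipeline:
        v = f(v)
    return v

def synthesize(examples, max_len=3):
    """Return the first well-typed pipeline consistent with the examples."""
    in_ty, out_ty = type(examples[0][0]), type(examples[0][1])
    for n in range(1, max_len + 1):
        for pipeline in product(PRIMS, repeat=n):
            # Type check: stages must chain and end in the output type.
            tys = [in_ty] + [p[2] for p in pipeline]
            if any(p[1] != t for p, t in zip(pipeline, tys)) or tys[-1] != out_ty:
                continue
            if all(run(pipeline, i) == o for i, o in examples):
                return [p[0] for p in pipeline]
    return None

print(synthesize([(3, "8"), (0, "2")]))   # ['inc', 'double', 'show']
```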
arXiv Detail & Related papers (2020-08-28T12:30:05Z)
- Synthesize, Execute and Debug: Learning to Repair for Neural Program Synthesis [81.54148730967394]
We propose SED, a neural program generation framework that incorporates synthesis, execution, and debug stages.
SED first produces initial programs using the neural program synthesizer component, then utilizes a neural program debugger to iteratively repair the generated programs.
On Karel, a challenging input-output program synthesis benchmark, SED reduces the error rate of the neural program synthesizer itself by a considerable margin, and outperforms the standard beam search for decoding.
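The synthesize-execute-debug control flow is easy to sketch with non-neural stand-ins for both components; the snippet below is such a stand-in, with an enumerative single-edit "debugger" in place of SED's neural one.

```python
# Hypothetical miniature of a synthesize-execute-debug loop over straight-line
# integer programs; both the initial guess and the repair step are trivial.
OPS = {"inc": lambda v: v + 1, "dec": lambda v: v - 1, "double": lambda v: v * 2}

def execute(prog, x):
    for op in prog:
        x = OPS[op](x)
    return x

def repair(prog, tests, rounds=10):
    """Debug stage: while some test fails, try single-instruction edits and
    keep the first edit that satisfies strictly more tests."""
    def score(p):
        return sum(execute(p, i) == o for i, o in tests)
    for _ in range(rounds):
        if score(prog) == len(tests):
            return prog                       # all tests pass: repaired
        for pos in range(len(prog)):
            for op in OPS:
                cand = prog[:pos] + [op] + prog[pos + 1:]
                if score(cand) > score(prog):
                    prog = cand
                    break
            else:
                continue
            break
    return prog

tests = [(1, 4), (3, 8)]                      # target behavior: (x + 1) * 2
print(repair(["inc", "inc"], tests))          # ['inc', 'double']
```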
arXiv Detail & Related papers (2020-07-16T04:15:47Z)
- CounterExample Guided Neural Synthesis [12.742347465894586]
Program synthesis is difficult, and methods that provide formal guarantees suffer from scalability issues.
We combine neural networks with formal reasoning to convert a logical specification into a sequence of examples that guides the neural network towards a correct solution.
Our results show that the formal reasoning based guidance improves the performance of the neural network substantially.
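The counterexample-guided loop itself is conventional and can be sketched as below; in the paper the candidate generator is a neural network, whereas this illustration uses a fixed candidate list and exhaustive checking over a finite domain.

```python
# Hedged sketch of a counterexample-guided loop, with a fixed candidate list
# in place of a neural generator and exhaustive checking in place of a prover.
def cegis(spec, candidates, domain):
    cex_inputs = []
    for f in candidates:
        # Guidance: cheaply reject candidates failing a known counterexample.
        if any(not spec(x, f(x)) for x in cex_inputs):
            continue
        # Verification: search the finite domain for a counterexample.
        bad = next((x for x in domain if not spec(x, f(x))), None)
        if bad is None:
            return f                  # spec holds on the whole domain
        cex_inputs.append(bad)        # feed the counterexample back as guidance
    return None

spec = lambda x, y: y == abs(x)       # specification: output is |input|
candidates = [lambda x: x, lambda x: -x, lambda x: max(x, -x)]
found = cegis(spec, candidates, range(-5, 6))
print(found(-7))                      # 7: the found program computes abs
```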
arXiv Detail & Related papers (2020-01-25T01:11:53Z)
This list is automatically generated from the titles and abstracts of the papers on this site.