Neural Combinatorial Logic Circuit Synthesis from Input-Output Examples
- URL: http://arxiv.org/abs/2210.16606v1
- Date: Sat, 29 Oct 2022 14:06:42 GMT
- Title: Neural Combinatorial Logic Circuit Synthesis from Input-Output Examples
- Authors: Peter Belcak, Roger Wattenhofer
- Abstract summary: We propose a novel, fully explainable neural approach to synthesis of logic circuits from input-output examples.
Our method can be employed for a virtually arbitrary choice of atoms.
- Score: 10.482805367361818
- License: http://creativecommons.org/licenses/by-sa/4.0/
- Abstract: We propose a novel, fully explainable neural approach to synthesis of
combinatorial logic circuits from input-output examples. The carrying advantage
of our method is that it readily extends to inductive scenarios, where the set
of examples is incomplete but still indicative of the desired behaviour. Our
method can be employed for a virtually arbitrary choice of atoms - from logic
gates to FPGA blocks - as long as they can be formulated in a differentiable
fashion, and consistently yields good results for synthesis of practical
circuits of increasing size. In particular, we succeed in learning a number of
arithmetic, bitwise, and signal-routing operations, and even generalise towards
the correct behaviour in inductive scenarios. Our method, attacking a discrete
logical synthesis problem with an explainable neural approach, hints at a wider
promise for synthesis and reasoning-related tasks.
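The abstract describes the approach only at a high level. As a rough, hypothetical illustration of what formulating an atom "in a differentiable fashion" can look like (a minimal sketch, not the authors' architecture; PyTorch and the four-gate library are assumptions), a single two-input cell can hold a softmax over the truth tables of a gate library and be fitted to input-output examples by gradient descent:
```python
# Illustrative sketch only: a relaxed two-input logic cell that learns which gate
# it should be from input-output examples. PyTorch is assumed for convenience.
import torch

# Truth tables of a hypothetical two-input gate library, indexed by (a, b) in {0, 1}.
GATES = torch.tensor([
    [0., 0., 0., 1.],  # AND
    [0., 1., 1., 1.],  # OR
    [0., 1., 1., 0.],  # XOR
    [1., 1., 1., 0.],  # NAND
])

class SoftGate(torch.nn.Module):
    """A single two-input cell that softly selects one gate from the library."""
    def __init__(self):
        super().__init__()
        self.logits = torch.nn.Parameter(torch.zeros(len(GATES)))

    def forward(self, a, b):
        # Relaxed truth-table lookup: mix the library tables by softmax weights,
        # then interpolate the four entries bilinearly in the inputs a, b in [0, 1].
        w = torch.softmax(self.logits, dim=0)   # gate-selection weights
        table = w @ GATES                        # mixed truth table, shape (4,)
        t00, t01, t10, t11 = table
        return ((1 - a) * (1 - b) * t00 + (1 - a) * b * t01
                + a * (1 - b) * t10 + a * b * t11)

# Fit a single cell to the complete truth table of XOR (the "input-output examples").
cell = SoftGate()
opt = torch.optim.Adam(cell.parameters(), lr=0.1)
a = torch.tensor([0., 0., 1., 1.])
b = torch.tensor([0., 1., 0., 1.])
target = torch.tensor([0., 1., 1., 0.])
for _ in range(200):
    opt.zero_grad()
    loss = ((cell(a, b) - target) ** 2).mean()
    loss.backward()
    opt.step()
print(GATES[cell.logits.argmax()])  # hardened choice; should match the XOR row
```
Hardening the softmax to its argmax recovers a discrete gate choice, which is one way such a relaxation can remain fully explainable; a realistic circuit would stack many such cells and also learn their interconnections.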
Related papers
- INVICTUS: Optimizing Boolean Logic Circuit Synthesis via Synergistic Learning and Search [18.558280701880136]
State-of-the-art logic synthesis algorithms apply a large number of logic minimization steps.
INVICTUS generates a sequence of logic minimizations based on a training dataset of previously seen designs.
arXiv Detail & Related papers (2023-05-22T15:50:42Z)
- Differentiable Inductive Logic Programming in High-Dimensional Space [6.21540494241516]
We propose extending the δILP approach to inductive synthesis with large-scale predicate invention.
We show that large-scale predicate invention benefits differentiable inductive synthesis through gradient descent.
arXiv Detail & Related papers (2022-08-13T13:46:13Z)
- Rethinking Reinforcement Learning based Logic Synthesis [13.18408482571087]
We develop a new RL-based method that can automatically recognize critical operators and generate common operator sequences generalizable to unseen circuits.
Our algorithm is verified on the EPFL benchmark, a private dataset, and a circuit at industrial scale.
arXiv Detail & Related papers (2022-05-16T12:15:32Z)
- Neuro-Symbolic Inductive Logic Programming with Logical Neural Networks [65.23508422635862]
We propose learning rules with the recently proposed Logical Neural Networks (LNNs).
Compared to other approaches, LNNs offer a strong connection to classical Boolean logic.
Our experiments on standard benchmarking tasks confirm that LNN rules are highly interpretable.
arXiv Detail & Related papers (2021-12-06T19:38:30Z)
- Statistically Meaningful Approximation: a Case Study on Approximating Turing Machines with Transformers [50.85524803885483]
This work proposes a formal definition of statistically meaningful (SM) approximation which requires the approximating network to exhibit good statistical learnability.
We study SM approximation for two function classes: circuits and Turing machines.
arXiv Detail & Related papers (2021-07-28T04:28:55Z)
- Reinforcement Learning with External Knowledge by using Logical Neural Networks [67.46162586940905]
A recent neuro-symbolic framework called Logical Neural Networks (LNNs) can simultaneously provide key properties of both neural networks and symbolic logic.
We propose an integrated method that enables model-free reinforcement learning from external knowledge sources.
arXiv Detail & Related papers (2021-03-03T12:34:59Z)
- Activation Relaxation: A Local Dynamical Approximation to Backpropagation in the Brain [62.997667081978825]
Activation Relaxation (AR) is motivated by constructing the backpropagation gradient as the equilibrium point of a dynamical system.
Our algorithm converges rapidly and robustly to the correct backpropagation gradients, requires only a single type of computational unit, and can operate on arbitrary computation graphs.
arXiv Detail & Related papers (2020-09-11T11:56:34Z)
- Creating Synthetic Datasets via Evolution for Neural Program Synthesis [77.34726150561087]
We show that some program synthesis approaches generalize poorly to data distributions different from that of the randomly generated examples.
We propose a new, adversarial approach to control the bias of synthetic data distributions and show that it outperforms current approaches.
arXiv Detail & Related papers (2020-03-23T18:34:15Z)
- Formal Synthesis of Lyapunov Neural Networks [61.79595926825511]
We propose an automatic and formally sound method for synthesising Lyapunov functions.
We employ a counterexample-guided approach where a numerical learner and a symbolic verifier interact to construct provably correct Lyapunov neural networks.
Our method synthesises Lyapunov functions faster and over wider spatial domains than the alternatives, while providing stronger or equal guarantees.
arXiv Detail & Related papers (2020-03-19T17:21:02Z)
- CounterExample Guided Neural Synthesis [12.742347465894586]
Program synthesis is difficult, and methods that provide formal guarantees suffer from scalability issues.
We combine neural networks with formal reasoning to convert a logical specification into a sequence of examples that guides the neural network towards a correct solution (a generic sketch of this counterexample-guided loop appears after this list).
Our results show that the formal reasoning based guidance improves the performance of the neural network substantially.
arXiv Detail & Related papers (2020-01-25T01:11:53Z)
- Towards Neural-Guided Program Synthesis for Linear Temporal Logic Specifications [26.547133495699093]
We use a neural network to learn a Q-function that is then used to guide search, and to construct programs that are subsequently verified for correctness.
Our method is unique in combining search with deep learning to realize synthesis.
arXiv Detail & Related papers (2019-12-31T17:09:49Z)
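Several entries above ("Formal Synthesis of Lyapunov Neural Networks", "CounterExample Guided Neural Synthesis") revolve around a counterexample-guided loop between a learner and a verifier. The following is a generic, hypothetical sketch of that loop, not code from either paper; the learn/verify callables and the toy XOR specification are illustrative stand-ins:
```python
from typing import Callable, List, Optional, Tuple

Example = Tuple[tuple, int]  # (input bits, expected output bit)

def cegis(learn: Callable[[List[Example]], Callable],
          verify: Callable[[Callable], Optional[Example]],
          seed: List[Example],
          max_rounds: int = 50) -> Optional[Callable]:
    """Alternate between fitting a candidate on the current examples and asking a
    verifier for a counterexample; stop when the verifier cannot find one."""
    examples = list(seed)
    for _ in range(max_rounds):
        candidate = learn(examples)          # e.g. train a neural learner on examples
        counterexample = verify(candidate)   # e.g. formal check against the spec
        if counterexample is None:
            return candidate                 # verified: candidate meets the specification
        examples.append(counterexample)      # otherwise, refine with the new example
    return None                              # budget exhausted without a verified candidate

# Toy usage: pick a 2-input Boolean function that matches an XOR specification.
SPEC = {(0, 0): 0, (0, 1): 1, (1, 0): 1, (1, 1): 0}
CANDIDATES = [lambda a, b: a & b, lambda a, b: a | b, lambda a, b: a ^ b]

def learn(examples):
    # Stand-in learner: exhaustive search instead of neural training.
    return next(f for f in CANDIDATES if all(f(*x) == y for x, y in examples))

def verify(f):
    # Stand-in verifier: exhaustive check instead of a formal engine.
    return next((((a, b), y) for (a, b), y in SPEC.items() if f(a, b) != y), None)

print(cegis(learn, verify, seed=[((0, 0), 0)]) is CANDIDATES[2])  # True: XOR is found
```
In the papers above, learn would train a neural network on the accumulated examples and verify would invoke a formal engine (for example an SMT solver) against the specification; the exhaustive toy versions here only show the control flow.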