A Normal Form Characterization for Efficient Boolean Skolem Function
Synthesis
- URL: http://arxiv.org/abs/2104.14098v1
- Date: Thu, 29 Apr 2021 04:16:41 GMT
- Title: A Normal Form Characterization for Efficient Boolean Skolem Function
Synthesis
- Authors: Preey Shah, Aman Bansal, S. Akshay and Supratik Chakraborty
- Abstract summary: We present a normal form representation called SAUNF that precisely characterizes tractable Boolean Skolem function synthesis.
It is exponentially more succinct than well-established normal forms like BDDs and DNNFs, used in the context of AI problems, and strictly subsumes other more recently proposed forms like SynNNF.
- Score: 2.093287944284448
- License: http://creativecommons.org/licenses/by-sa/4.0/
- Abstract: Boolean Skolem function synthesis concerns synthesizing outputs as Boolean
functions of inputs such that a relational specification between inputs and
outputs is satisfied. This problem, also known as Boolean functional synthesis,
has several applications, including design of safe controllers for autonomous
systems, certified QBF solving, and cryptanalysis. Recently, complexity-theoretic
hardness results have been shown for the problem, although several
algorithms proposed in the literature are known to work well in practice. This
dichotomy between theoretical hardness and practical efficacy has motivated the
research into normal forms or representations of input specifications that
permit efficient synthesis, thus perhaps explaining the efficacy of these
algorithms.
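Concretely, the problem admits a standard formal statement; the rendering below is our paraphrase, not notation taken from the paper:

```latex
% Boolean Skolem function synthesis: given a relational specification
% F(X, Y) over inputs X = (x_1, ..., x_n) and outputs Y = (y_1, ..., y_m),
% synthesize Boolean functions f_1, ..., f_m (the Skolem functions) with
\forall X \, \big( \exists Y \, F(X, Y) \;\leftrightarrow\; F(X, f_1(X), \ldots, f_m(X)) \big)
```

That is, whenever some output satisfies the specification for a given input, the synthesized outputs must satisfy it as well.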
In this paper we go one step beyond this and ask if there exists a normal
form representation that can in fact precisely characterize "efficient"
synthesis. We present a normal form called SAUNF that precisely characterizes
tractable synthesis in the following sense: a specification is polynomial time
synthesizable iff it can be compiled to SAUNF in polynomial time. Additionally,
a specification admits a polynomial-sized functional solution iff there exists
a semantically equivalent polynomial-sized SAUNF representation. SAUNF is
exponentially more succinct than well-established normal forms like BDDs and
DNNFs, used in the context of AI problems, and strictly subsumes other more
recently proposed forms like SynNNF. It enjoys compositional properties that
are similar to those of DNNF. Thus, SAUNF provides the right trade-off in
knowledge representation for Boolean functional synthesis.
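As a toy illustration of the synthesis problem itself (not of SAUNF compilation, which the paper develops), the sketch below brute-force checks a candidate Skolem function against a two-input specification; all names are ours and hypothetical:

```python
from itertools import product

# Relational specification F(x1, x2, y): the output must equal x1 AND x2.
def spec(x1: bool, x2: bool, y: bool) -> bool:
    return y == (x1 and x2)

# Candidate Skolem function expressing the output in terms of the inputs.
def skolem_y(x1: bool, x2: bool) -> bool:
    return x1 and x2

# Correctness condition: for every input on which SOME output satisfies
# the spec, the synthesized output must satisfy it too.
for x1, x2 in product([False, True], repeat=2):
    realizable = any(spec(x1, x2, y) for y in (False, True))
    if realizable:
        assert spec(x1, x2, skolem_y(x1, x2)), (x1, x2)

print("skolem_y is a correct Skolem function for spec")
```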
Related papers
- Efficient Reactive Synthesis Using Mode Decomposition [0.0]
We propose a novel decomposition algorithm based on modes.
The input to our algorithm is the original specification and the description of the modes.
We show how to generate sub-specifications automatically, and we prove that if all sub-problems are realizable, then the full specification is realizable.
arXiv Detail & Related papers (2023-12-14T08:01:35Z)
- Neural Combinatorial Logic Circuit Synthesis from Input-Output Examples [10.482805367361818]
We propose a novel, fully explainable neural approach to synthesis of logic circuits from input-output examples.
Our method can be employed for a virtually arbitrary choice of atoms.
arXiv Detail & Related papers (2022-10-29T14:06:42Z)
- Linear Temporal Logic Modulo Theories over Finite Traces (Extended Version) [72.38188258853155]
This paper studies Linear Temporal Logic over Finite Traces (LTLf) in which proposition letters are replaced with first-order formulas interpreted over arbitrary theories.
The resulting logic, called LTLf Modulo Theories (LTLfMT), is semi-decidable.
arXiv Detail & Related papers (2022-04-28T17:57:33Z)
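For intuition, an LTLfMT formula combines temporal operators with theory atoms; the example below is ours (over linear integer arithmetic), not taken from the paper:

```latex
% Illustrative LTLfMT formula; x' denotes the value of x at the next instant.
% "x starts at 0, is incremented at every step, and eventually exceeds b."
(x = 0) \;\wedge\; \mathsf{G}\,(x' = x + 1) \;\wedge\; \mathsf{F}\,(x > b)
```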
- Adaptive n-ary Activation Functions for Probabilistic Boolean Logic [2.294014185517203]
We show that we can learn arbitrary logic in a single layer using an activation function of matching or greater arity.
We represent belief tables using a basis that directly associates the number of nonzero parameters to the effective arity of the belief function.
This opens up optimization approaches that reduce logical complexity by inducing parameter sparsity.
arXiv Detail & Related papers (2022-03-16T22:47:53Z)
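To make the link between nonzero parameters and effective arity concrete, here is a small sketch; it assumes a multilinear basis as the representation, which is our illustrative choice rather than necessarily the paper's exact construction:

```python
import numpy as np
from itertools import product

def multilinear_coeffs(table):
    """Fit f(x1, x2) = w0 + w1*x1 + w2*x2 + w12*x1*x2 exactly to a
    2-input belief table (values ordered for inputs 00, 01, 10, 11)."""
    X = np.array([[1, x1, x2, x1 * x2]
                  for x1, x2 in product([0, 1], repeat=2)])
    return np.linalg.solve(X, np.array(table, dtype=float))

# XOR needs the degree-2 basis element, so its effective arity is 2.
print(multilinear_coeffs([0, 1, 1, 0]))  # -> [ 0.  1.  1. -2.]
# A function of x1 alone puts zero weight on every x2-dependent element:
# the nonzero parameters reveal an effective arity of 1.
print(multilinear_coeffs([0, 0, 1, 1]))  # -> [0. 1. 0. 0.]
```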
- Statistically Meaningful Approximation: a Case Study on Approximating Turing Machines with Transformers [50.85524803885483]
This work proposes a formal definition of statistically meaningful (SM) approximation which requires the approximating network to exhibit good statistical learnability.
We study SM approximation for two function classes: circuits and Turing machines.
arXiv Detail & Related papers (2021-07-28T04:28:55Z)
- On Function Approximation in Reinforcement Learning: Optimism in the Face of Large State Spaces [208.67848059021915]
We study the exploration-exploitation tradeoff at the core of reinforcement learning.
In particular, we prove that the complexity of the function class $\mathcal{F}$ characterizes the complexity of the learning problem.
Our regret bounds are independent of the number of episodes.
arXiv Detail & Related papers (2020-11-09T18:32:22Z)
- Sinkhorn Natural Gradient for Generative Models [125.89871274202439]
We propose a novel Sinkhorn Natural Gradient (SiNG) algorithm which acts as a steepest descent method on the probability space endowed with the Sinkhorn divergence.
We show that the Sinkhorn information matrix (SIM), a key component of SiNG, has an explicit expression and can be evaluated accurately in complexity that scales logarithmically with respect to the desired accuracy.
In our experiments, we quantitatively compare SiNG with state-of-the-art SGD-type solvers on generative tasks to demonstrate the efficiency and efficacy of our method.
arXiv Detail & Related papers (2020-11-09T02:51:17Z)
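Schematically, SiNG takes the usual natural-gradient form, with the SIM in the role the Fisher information matrix plays classically; this generic rendering is our paraphrase, not the paper's exact notation:

```latex
% Preconditioned (natural) gradient step with the Sinkhorn information
% matrix as preconditioner; \eta is a step size, L the training objective.
\theta_{t+1} \;=\; \theta_t \;-\; \eta \, \mathrm{SIM}(\theta_t)^{-1} \, \nabla_{\theta} L(\theta_t)
```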
- Automated and Sound Synthesis of Lyapunov Functions with SMT Solvers [70.70479436076238]
We synthesise Lyapunov functions for linear, non-linear (polynomial) and parametric models.
We exploit an inductive framework to synthesise Lyapunov functions, starting from parametric templates.
arXiv Detail & Related papers (2020-07-21T14:45:23Z)
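For the linear case, the conditions such a procedure must certify can be checked directly; the sketch below (ours, a plain numerical check rather than the paper's SMT-based inductive loop) produces a quadratic certificate V(x) = x^T P x for a stable linear system:

```python
import numpy as np
from scipy.linalg import solve_continuous_lyapunov

# Illustrative stable linear dynamics dx/dt = A x (not from the paper).
A = np.array([[0.0, 1.0],
              [-2.0, -3.0]])

# Solve A^T P + P A = -Q; for Hurwitz A and Q > 0, V(x) = x^T P x
# is a Lyapunov function for the system.
Q = np.eye(2)
P = solve_continuous_lyapunov(A.T, -Q)

# Certificate conditions: V > 0 away from the origin and dV/dt < 0,
# i.e. both P and -(A^T P + P A) are positive definite.
assert np.all(np.linalg.eigvalsh(P) > 0)
assert np.all(np.linalg.eigvalsh(-(A.T @ P + P @ A)) > 0)
print("V(x) = x^T P x certifies stability; P =\n", P)
```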
- Formal Synthesis of Lyapunov Neural Networks [61.79595926825511]
We propose an automatic and formally sound method for synthesising Lyapunov functions.
We employ a counterexample-guided approach where a numerical learner and a symbolic verifier interact to construct provably correct Lyapunov neural networks.
Our method synthesises Lyapunov functions faster and over wider spatial domains than the alternatives, while providing stronger or equal guarantees.
arXiv Detail & Related papers (2020-03-19T17:21:02Z)
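The counterexample-guided interaction has a simple generic shape; the skeleton below is our sketch, with the learner and verifier passed in as stubs (in the paper the verifier is symbolic and yields proofs, which plain sampling would not):

```python
def cegis(train_step, find_counterexample, seed_points, max_rounds=50):
    """Generic counterexample-guided synthesis loop (illustrative skeleton).

    train_step(dataset)        -- learner: fit a candidate on current points
    find_counterexample(cand)  -- verifier: return a violating state or None
    """
    dataset = list(seed_points)
    for _ in range(max_rounds):
        candidate = train_step(dataset)        # numerical learner
        cex = find_counterexample(candidate)   # symbolic verifier
        if cex is None:
            return candidate                   # verified candidate
        dataset.append(cex)                    # learn from the violation
    raise RuntimeError("no certificate found within the round budget")
```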
- Towards Neural-Guided Program Synthesis for Linear Temporal Logic Specifications [26.547133495699093]
We use a neural network to learn a Q-function that is then used to guide search, and to construct programs that are subsequently verified for correctness.
Our method is unique in combining search with deep learning to realize synthesis.
arXiv Detail & Related papers (2019-12-31T17:09:49Z)
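The recipe of learning a Q-function and then using it to guide search can be pictured as best-first search over partial programs; in this sketch of ours the learned network is stubbed by a caller-supplied scorer, and every name is hypothetical:

```python
import heapq

def guided_search(initial, expand, q_score, is_complete, verify, budget=10000):
    """Best-first search over partial programs ordered by a learned Q-score.

    expand(p)   -- candidate one-step extensions of partial program p
    q_score(p)  -- learned estimate of how promising p is (higher is better)
    verify(p)   -- independent correctness check on complete programs
    """
    counter = 0                                  # tie-breaker for the heap
    frontier = [(-q_score(initial), counter, initial)]
    while frontier and budget > 0:
        budget -= 1
        _, _, prog = heapq.heappop(frontier)
        if is_complete(prog):
            if verify(prog):                     # programs are verified, not trusted
                return prog
            continue
        for child in expand(prog):
            counter += 1
            heapq.heappush(frontier, (-q_score(child), counter, child))
    return None
```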
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the information and is not responsible for any consequences of its use.