Sequential composition of propositional logic programs
- URL: http://arxiv.org/abs/2009.05774v7
- Date: Wed, 11 Oct 2023 15:00:01 GMT
- Title: Sequential composition of propositional logic programs
- Authors: Christian Antic
- Abstract summary: We show that acyclic programs can be decomposed into single-rule programs and provide a general decomposition result for arbitrary programs.
We show that the immediate consequence operator of a program can be represented via composition which allows us to compute its least model without any explicit reference to operators.
- Score: 0.0
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: This paper introduces and studies the sequential composition and
decomposition of propositional logic programs. We show that acyclic programs
can be decomposed into single-rule programs and provide a general decomposition
result for arbitrary programs. We show that the immediate consequence operator
of a program can be represented via composition which allows us to compute its
least model without any explicit reference to operators. This bridges the
conceptual gap between the syntax and semantics of a propositional logic
program in a mathematically satisfactory way.
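To make the abstract's central claim concrete, here is a minimal Python sketch (not code from the paper): a positive propositional program is a set of rules (head, body); `compose` resolves each body atom of a rule in P against a matching rule head in Q; the immediate consequence operator T_P(I) is then read off as the facts occurring in P composed with I viewed as a program of facts, so the least model is obtained by iterating composition alone. The precise definition of composition in the paper may differ in detail; the names `compose`, `tp`, and `least_model` are illustrative.

```python
from itertools import product

# A rule is a pair (head, frozenset of body atoms); a program is a set of rules.

def compose(p, q):
    """Sequential composition of positive programs (illustrative reading):
    every body atom of a rule in P is resolved against some rule in Q whose
    head matches it; the new rule keeps P's head and takes the union of the
    chosen Q-rules' bodies."""
    result = set()
    for head, body in p:
        # For each body atom, the Q-rules that could derive it.
        choices = [[rule for rule in q if rule[0] == atom] for atom in body]
        if any(not c for c in choices):
            continue  # some body atom cannot be resolved against Q
        for picked in product(*choices):
            new_body = frozenset().union(*(r[1] for r in picked)) if picked else frozenset()
            result.add((head, new_body))
    return result

def as_facts(interpretation):
    """View an interpretation (a set of atoms) as a program of facts."""
    return {(atom, frozenset()) for atom in interpretation}

def tp(p, interpretation):
    """Immediate consequence operator T_P computed purely via composition:
    the facts among P composed with I are exactly the one-step consequences."""
    return {head for head, body in compose(p, as_facts(interpretation)) if not body}

def least_model(p):
    """Least model of a positive program as the least fixpoint of T_P."""
    model = set()
    while True:
        step = tp(p, model)
        if step <= model:
            return model
        model |= step

# Example program:  a <- b, c.   b.   c <- b.
program = {("a", frozenset({"b", "c"})), ("b", frozenset()), ("c", frozenset({"b"}))}
print(least_model(program))  # {'a', 'b', 'c'}
```

Running the sketch prints the set {'a', 'b', 'c'} (element order may vary): the least model of the three-rule example falls out of repeated composition, with no consequence operator defined beyond a projection onto facts.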
Related papers
- Sequential decomposition of propositional logic programs [0.0]
This paper studies the sequential decomposition of programs via Green's relations between programs.
In a broader sense, this paper is a further step towards an algebraic theory of logic programming.
arXiv Detail & Related papers (2023-02-21T16:14:57Z)
- ProTo: Program-Guided Transformer for Program-Guided Tasks [59.34258016795216]
We formulate program-guided tasks which require learning to execute a given program on the observed task specification.
We propose the Program-guided Transformer (ProTo), which integrates both semantic and structural guidance of a program.
ProTo executes a program in a learned latent space and enjoys stronger representation ability than previous neural-symbolic approaches.
arXiv Detail & Related papers (2021-10-02T13:46:32Z)
- Enforcing Consistency in Weakly Supervised Semantic Parsing [68.2211621631765]
We explore the use of consistency between the output programs for related inputs to reduce the impact of spurious programs.
We find that a more consistent formalism leads to improved model performance even without consistency-based training.
arXiv Detail & Related papers (2021-07-13T03:48:04Z)
- Latent Execution for Neural Program Synthesis Beyond Domain-Specific Languages [97.58968222942173]
We take the first step to synthesize C programs from input-output examples.
In particular, we propose LaSynth, which learns a latent representation to approximate the execution of partially generated programs.
We show that training on these synthesized programs further improves the prediction performance for both Karel and C program synthesis.
arXiv Detail & Related papers (2021-06-29T02:21:32Z)
- Sequential composition of answer set programs [0.0]
This paper contributes to the mathematical foundations of logic programming by introducing and studying the sequential composition of answer set programs.
In a broader sense, this paper is a first step towards an algebra of answer set programs, and in the future we plan to lift its methods to wider classes of programs.
arXiv Detail & Related papers (2021-04-25T13:27:22Z)
- Differentiable Inductive Logic Programming for Structured Examples [6.8774606688738995]
We propose a new framework to learn logic programs from noisy and structured examples.
We show that our new framework can learn logic programs from noisy and structured examples, such as sequences or trees.
Our framework can be scaled to deal with complex programs that consist of several clauses with function symbols.
arXiv Detail & Related papers (2021-03-02T13:47:33Z)
- Representing Partial Programs with Blended Abstract Semantics [62.20775388513027]
We introduce a technique for representing partially written programs in a program synthesis engine.
We learn an approximate execution model implemented as a modular neural network.
We show that these hybrid neuro-symbolic representations enable execution-guided synthesizers to use more powerful language constructs.
arXiv Detail & Related papers (2020-12-23T20:40:18Z)
- Latent Programmer: Discrete Latent Codes for Program Synthesis [56.37993487589351]
In many sequence learning tasks, such as program synthesis and document summarization, a key problem is searching over a large space of possible output sequences.
We propose to learn representations of the outputs that are specifically meant for search: rich enough to specify the desired output but compact enough to make search more efficient.
We introduce the Latent Programmer, a program synthesis method that first predicts a discrete latent code from input/output examples, and then generates the program in the target language.
arXiv Detail & Related papers (2020-12-01T10:11:35Z)
- Program Synthesis with Pragmatic Communication [28.24612900419843]
This work introduces a new inductive bias derived by modeling the program synthesis task as rational communication.
A user study finds that end-user participants communicate more effectively with the pragmatic program synthesizer than with a non-pragmatic one.
arXiv Detail & Related papers (2020-07-09T20:55:44Z)
- LogicalFactChecker: Leveraging Logical Operations for Fact Checking with Graph Module Network [111.24773949467567]
We propose LogicalFactChecker, a neural network approach capable of leveraging logical operations for fact checking.
It achieves state-of-the-art performance on TABFACT, a large-scale benchmark dataset.
arXiv Detail & Related papers (2020-04-28T17:04:19Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information it provides and is not responsible for any consequences of its use.