Declarative Probabilistic Logic Programming in Discrete-Continuous Domains
- URL: http://arxiv.org/abs/2302.10674v2
- Date: Sun, 8 Sep 2024 16:20:03 GMT
- Title: Declarative Probabilistic Logic Programming in Discrete-Continuous Domains
- Authors: Pedro Zuidberg Dos Martires, Luc De Raedt, Angelika Kimmig
- Abstract summary: We contribute the measure semantics together with the hybrid PLP language DC-ProbLog and its inference engine infinitesimal algebraic likelihood weighting (IALW).
We generalize the state of the art of PLP towards hybrid PLP in three different aspects: semantics, language and inference.
IALW is the first inference algorithm for hybrid probabilistic programming based on knowledge compilation.
- Score: 16.153683223016973
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Over the past three decades, the logic programming paradigm has been successfully expanded to support probabilistic modeling, inference and learning. The resulting paradigm of probabilistic logic programming (PLP) and its programming languages owe much of their success to a declarative semantics, the so-called distribution semantics. However, the distribution semantics is limited to discrete random variables only. While PLP has been extended in various ways for supporting hybrid, that is, mixed discrete and continuous random variables, we are still lacking a declarative semantics for hybrid PLP that not only generalizes the distribution semantics and the modeling language but also the standard inference algorithm that is based on knowledge compilation. We contribute the measure semantics together with the hybrid PLP language DC-ProbLog (where DC stands for distributional clauses) and its inference engine infinitesimal algebraic likelihood weighting (IALW). These have the original distribution semantics, standard PLP languages such as ProbLog, and standard inference engines for PLP based on knowledge compilation as special cases. Thus, we generalize the state of the art of PLP towards hybrid PLP in three different aspects: semantics, language and inference. Furthermore, IALW is the first inference algorithm for hybrid probabilistic programming based on knowledge compilation.
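To make the hybrid setting concrete, here is a minimal Python sketch of plain likelihood weighting on a toy model with one discrete and one continuous random variable. The model, parameter values, and variable names are illustrative assumptions, not taken from the paper, and the sketch implements neither IALW nor knowledge compilation; it only shows why evidence on a continuous variable is handled with densities rather than probabilities.

```python
import math
import random

def normal_pdf(x, mu, sigma):
    """Density of a Normal(mu, sigma) distribution at x."""
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2.0 * math.pi))

def likelihood_weighting(num_samples=100_000, observed_temp=27.0):
    """Estimate P(works | temperature = observed_temp) by likelihood weighting.

    Toy hybrid model (an assumption for illustration):
      works       ~ Bernoulli(0.8)                              # discrete
      temperature ~ Normal(20, 5) if works else Normal(30, 5)   # continuous
    The continuous evidence has probability zero, so each sample is
    weighted by the density of the observation instead.
    """
    weighted_true = 0.0
    total_weight = 0.0
    for _ in range(num_samples):
        works = random.random() < 0.8                  # sample the discrete variable
        mu = 20.0 if works else 30.0
        weight = normal_pdf(observed_temp, mu, 5.0)    # likelihood of the evidence
        total_weight += weight
        if works:
            weighted_true += weight
    return weighted_true / total_weight

if __name__ == "__main__":
    print("P(works | temperature = 27) ~", likelihood_weighting())
```

Roughly speaking, IALW addresses the same zero-probability-evidence issue, but in the knowledge-compilation setting the abstract describes.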
Related papers
- Semirings for Probabilistic and Neuro-Symbolic Logic Programming [15.747744148181829]
We show that many extensions of probabilistic logic programming can be cast within a common algebraic logic programming framework.
This holds not only for the PLP variations themselves but also for the underlying execution mechanism, which is based on (algebraic) model counting.
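As a rough, brute-force illustration of the algebraic model counting mentioned above (not the paper's actual framework), the Python sketch below combines per-literal labels over the models of a tiny formula using caller-supplied semiring operations; instantiating the semiring with ordinary addition and multiplication recovers ProbLog-style weighted model counting. The example program and all names are assumptions made for illustration.

```python
from dataclasses import dataclass
from itertools import product
from typing import Callable

@dataclass
class Semiring:
    """A commutative semiring given by its two operations and their neutral elements."""
    plus: Callable
    times: Callable
    zero: object
    one: object

def algebraic_model_count(variables, formula, label, semiring):
    """Sum (in the semiring) over all models of `formula` of the product of literal labels.

    `label(var, value)` returns the semiring label of the literal var=value.
    Brute-force enumeration for clarity; real systems evaluate a compiled circuit.
    """
    total = semiring.zero
    for values in product([False, True], repeat=len(variables)):
        world = dict(zip(variables, values))
        if not formula(world):
            continue
        weight = semiring.one
        for var, val in world.items():
            weight = semiring.times(weight, label(var, val))
        total = semiring.plus(total, weight)
    return total

# Probability semiring: recovers weighted model counting, i.e. ProbLog-style inference.
prob_semiring = Semiring(plus=lambda a, b: a + b, times=lambda a, b: a * b, zero=0.0, one=1.0)

probs = {"burglary": 0.1, "earthquake": 0.2}
label = lambda var, val: probs[var] if val else 1.0 - probs[var]
alarm = lambda w: w["burglary"] or w["earthquake"]   # alarm :- burglary.  alarm :- earthquake.

print("P(alarm) =", algebraic_model_count(list(probs), alarm, label, prob_semiring))  # 0.28
```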
arXiv Detail & Related papers (2024-02-21T13:06:52Z)
- smProbLog: Stable Model Semantics in ProbLog for Probabilistic Argumentation [19.46250467634934]
We show that the programs representing probabilistic argumentation frameworks do not satisfy a common assumption in probabilistic logic programming (PLP) semantics.
The second contribution is then a novel PLP semantics for programs where a choice of probabilistic facts does not uniquely determine the truth assignment of the logical atoms.
The third contribution is the implementation of a PLP system supporting this semantics: smProbLog.
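To see why one choice of probabilistic facts need not determine a unique truth assignment, consider the classic negation loop b :- not c. c :- not b., which has two stable models. The brute-force Python check below only illustrates that phenomenon; it is not smProbLog, and the program is an assumed toy example.

```python
from itertools import chain, combinations

# A normal rule is (head, positive_body, negative_body); atoms are strings.
rules = [
    ("b", (), ("c",)),   # b :- not c.
    ("c", (), ("b",)),   # c :- not b.
]
atoms = ["b", "c"]

def least_model(definite_rules):
    """Least model of a negation-free program, computed by fixpoint iteration."""
    model = set()
    changed = True
    while changed:
        changed = False
        for head, pos, _ in definite_rules:
            if set(pos) <= model and head not in model:
                model.add(head)
                changed = True
    return model

def is_stable(interpretation, rules):
    """Gelfond-Lifschitz check: is `interpretation` the least model of its reduct?"""
    reduct = [(h, pos, ()) for h, pos, neg in rules
              if not (set(neg) & interpretation)]
    return least_model(reduct) == interpretation

def powerset(items):
    return chain.from_iterable(combinations(items, r) for r in range(len(items) + 1))

stable_models = [set(i) for i in powerset(atoms) if is_stable(set(i), rules)]
print(stable_models)   # two stable models: {'b'} and {'c'}
```

If a probabilistic fact guards this fragment, the corresponding total choice admits two models instead of one, which is exactly the situation the usual two-valued assumption rules out.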
arXiv Detail & Related papers (2023-04-03T10:59:25Z)
- $\omega$PAP Spaces: Reasoning Denotationally About Higher-Order, Recursive Probabilistic and Differentiable Programs [64.25762042361839]
$\omega$PAP spaces provide a setting for reasoning denotationally about expressive differentiable and probabilistic programming languages.
Our semantics is general enough to assign meanings to most practical probabilistic and differentiable programs.
We establish the almost-everywhere differentiability of probabilistic programs' trace density functions.
arXiv Detail & Related papers (2023-02-21T12:50:05Z)
- Hybrid Probabilistic Logic Programming: Inference and Learning [1.14219428942199]
This thesis focuses on advancing probabilistic logic programming (PLP), which combines probability theory for uncertainty and logic programming for relations.
The first contribution is the introduction of context-specific likelihood weighting (CS-LW), a new sampling algorithm that exploits context-specific independencies for computational gains.
Next, a new hybrid PLP, DC#, is introduced, which integrates the syntax of Distributional Clauses with Bayesian logic programs and represents three types of independencies.
The scalable inference algorithm FO-CS-LW is introduced for DC#.
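A hedged sketch of the context-specific independence that CS-LW exploits, under assumed toy names and distributions (this is not the paper's algorithm): once a sampled discrete switch fixes the context, variables that are irrelevant in that context never have to be sampled at all.

```python
import random

def sample_world():
    """One forward sample from a toy model with a context-specific independence.

    Assumed illustrative model:
      switch ~ Bernoulli(0.5)
      if switch: x1 ~ Normal(0, 1) and y ~ Normal(x1, 1)
      else:      x2 ~ Normal(5, 1) and y ~ Normal(x2, 1)
    Given the context switch=True, y is independent of x2, so x2 (and anything
    hanging off it) is skipped entirely instead of being sampled and discarded.
    """
    world = {"switch": random.random() < 0.5}
    if world["switch"]:
        world["x1"] = random.gauss(0.0, 1.0)          # only the relevant branch is expanded
        world["y"] = random.gauss(world["x1"], 1.0)
    else:
        world["x2"] = random.gauss(5.0, 1.0)
        world["y"] = random.gauss(world["x2"], 1.0)
    return world

print(sample_world())
```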
arXiv Detail & Related papers (2023-02-01T15:07:36Z)
- Machine Learning with Probabilistic Law Discovery: A Concise Introduction [77.34726150561087]
Probabilistic Law Discovery (PLD) is a logic based Machine Learning method, which implements a variant of probabilistic rule learning.
PLD is close to Decision Tree/Random Forest methods, but it differs significantly in how relevant rules are defined.
This paper outlines the main principles of PLD, highlights its benefits and limitations, and provides some application guidelines.
arXiv Detail & Related papers (2022-12-22T17:40:13Z)
- Differentially-Private Bayes Consistency [70.92545332158217]
We construct a Bayes consistent learning rule that satisfies differential privacy (DP).
We prove that any VC class can be privately learned in a semi-supervised setting with a near-optimal sample complexity.
arXiv Detail & Related papers (2022-12-08T11:57:30Z)
- Learning versus Refutation in Noninteractive Local Differential Privacy [133.80204506727526]
We study two basic statistical tasks in non-interactive local differential privacy (LDP): learning and refutation.
Our main result is a complete characterization of the sample complexity of PAC learning for non-interactive LDP protocols.
arXiv Detail & Related papers (2022-10-26T03:19:24Z)
- Matching Normalizing Flows and Probability Paths on Manifolds [57.95251557443005]
Continuous Normalizing Flows (CNFs) are generative models that transform a prior distribution to a model distribution by solving an ordinary differential equation (ODE).
We propose to train CNFs by minimizing probability path divergence (PPD), a novel family of divergences between the probability density path generated by the CNF and a target probability density path.
We show that CNFs learned by minimizing PPD achieve state-of-the-art results in likelihoods and sample quality on existing low-dimensional manifold benchmarks.
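For intuition about the CNF mechanics summarized above (not about the PPD objective itself), here is a minimal NumPy sketch that Euler-integrates the flow ODE and tracks the log-density with the instantaneous change-of-variables formula; the linear vector field is an assumption chosen so that the Jacobian trace is available in closed form.

```python
import numpy as np

# Assumed linear, time-independent vector field f(x) = A @ x (for illustration only).
A = np.array([[-0.5, 0.2],
              [0.0, -0.3]])

def f(x):
    return A @ x

def cnf_transform(x0, log_p0, steps=1000, t1=1.0):
    """Push a prior sample and its log-density through the flow with Euler steps.

    dx/dt    = f(x)
    dlogp/dt = -trace(df/dx)        # instantaneous change of variables
    For the linear field above, trace(df/dx) = trace(A) is constant.
    """
    dt = t1 / steps
    x, log_p = x0.copy(), log_p0
    for _ in range(steps):
        x = x + dt * f(x)
        log_p = log_p - dt * np.trace(A)
    return x, log_p

rng = np.random.default_rng(0)
x0 = rng.standard_normal(2)                           # sample from the 2D standard-normal prior
log_p0 = -0.5 * float(x0 @ x0) - np.log(2.0 * np.pi)  # log N(x0; 0, I) in two dimensions
x1, log_p1 = cnf_transform(x0, log_p0)
print("transformed sample:", x1, "log-density:", log_p1)
```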
arXiv Detail & Related papers (2022-07-11T08:50:19Z)
- plingo: A system for probabilistic reasoning in clingo based on lpmln [2.7742922296398738]
We present plingo, an extension of the ASP system clingo with various probabilistic reasoning modes.
Plingo is centered upon LPMLN, a probabilistic extension of ASP based on a weight scheme from Markov Logic.
We evaluate plingo's performance empirically by comparing it to other probabilistic systems.
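As a hedged illustration of the Markov Logic-style weight scheme that LPMLN builds on (the brute-force sketch below ignores LPMLN's stable-model requirement and is not plingo's implementation; the toy rules are assumptions), each interpretation is weighted by the exponentiated sum of the weights of the soft rules it satisfies, and probabilities are obtained by normalizing.

```python
import math
from itertools import product

# Assumed toy weighted propositional theory in the spirit of Markov Logic.
atoms = ["bird", "flies"]
soft_rules = [
    (2.0, lambda w: (not w["bird"]) or w["flies"]),   # weight 2.0 : flies :- bird.
    (1.0, lambda w: w["bird"]),                       # weight 1.0 : bird.
]

def unnormalized_weight(world):
    """exp of the total weight of the soft rules satisfied by `world`."""
    return math.exp(sum(weight for weight, rule in soft_rules if rule(world)))

worlds = [dict(zip(atoms, values)) for values in product([False, True], repeat=len(atoms))]
Z = sum(unnormalized_weight(w) for w in worlds)       # partition function

p_flies = sum(unnormalized_weight(w) for w in worlds if w["flies"]) / Z
print("P(flies) =", p_flies)
```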
arXiv Detail & Related papers (2022-06-23T07:51:10Z)
- First-Order Context-Specific Likelihood Weighting in Hybrid Probabilistic Logic Programs [24.503581751619787]
Three types of independencies are important to represent and exploit for scalable inference in hybrid models.
This paper introduces a hybrid probabilistic logic programming language, DC#, which integrates the syntax of distributional clauses with the semantic principles of Bayesian logic programs.
We also introduce the scalable inference algorithm FO-CS-LW for DC#.
arXiv Detail & Related papers (2022-01-26T20:06:02Z)
- Probabilistic Generating Circuits [50.98473654244851]
We propose probabilistic generating circuits (PGCs) for the efficient representation of probability generating functions.
PGCs are not only a theoretical framework that unifies vastly different existing models, but they also show huge potential in modeling realistic data.
We exhibit a simple class of PGCs that are not trivially subsumed by simple combinations of PCs and DPPs, and obtain competitive performance on a suite of density estimation benchmarks.
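The generating-function view behind PGCs can be illustrated with plain dictionaries: the probability generating polynomial of a distribution over binary variables has one monomial per state, and for independent variables it factorizes into a product of small polynomials. The sketch below shows only this representation idea under assumed toy parameters; it is not the paper's circuit construction.

```python
from itertools import product

def poly_multiply(p, q):
    """Multiply two polynomials represented as {exponent_tuple: coefficient} dicts."""
    result = {}
    for exps1, coef1 in p.items():
        for exps2, coef2 in q.items():
            exps = tuple(a + b for a, b in zip(exps1, exps2))
            result[exps] = result.get(exps, 0.0) + coef1 * coef2
    return result

def bernoulli_poly(index, prob, num_vars):
    """Generating polynomial (1 - p) + p * z_index of a single Bernoulli variable."""
    zero = tuple(0 for _ in range(num_vars))
    one = tuple(1 if i == index else 0 for i in range(num_vars))
    return {zero: 1.0 - prob, one: prob}

# The joint generating polynomial of two independent Bernoullis is the product.
n = 2
joint = poly_multiply(bernoulli_poly(0, 0.3, n), bernoulli_poly(1, 0.6, n))

# Each coefficient is the probability of the corresponding 0/1 assignment.
for assignment in product([0, 1], repeat=n):
    print(assignment, joint.get(assignment, 0.0))
```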
arXiv Detail & Related papers (2021-02-19T07:06:53Z)
This list is automatically generated from the titles and abstracts of the papers in this site.