Probabilistic Answer Set Programming with Discrete and Continuous Random Variables
- URL: http://arxiv.org/abs/2409.20274v1
- Date: Mon, 30 Sep 2024 13:24:42 GMT
- Title: Probabilistic Answer Set Programming with Discrete and Continuous Random Variables
- Authors: Damiano Azzolini, Fabrizio Riguzzi
- Abstract summary: Probabilistic Answer Set Programming (PASP) extends Answer Set Programming with probabilistic facts that represent uncertain information.
We propose Hybrid Probabilistic Answer Set Programming (HPASP), which adds support for continuous random variables.
We discuss, implement, and assess the performance of two exact algorithms based on projected answer set enumeration and knowledge compilation.
- Score: 0.18416014644193066
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Probabilistic Answer Set Programming under the credal semantics (PASP) extends Answer Set Programming with probabilistic facts that represent uncertain information. The probabilistic facts are discrete with Bernoulli distributions. However, several real-world scenarios require a combination of both discrete and continuous random variables. In this paper, we extend the PASP framework to support continuous random variables and propose Hybrid Probabilistic Answer Set Programming (HPASP). Moreover, we discuss, implement, and assess the performance of two exact algorithms based on projected answer set enumeration and knowledge compilation and two approximate algorithms based on sampling. Empirical results, also in line with known theoretical results, show that exact inference is feasible only for small instances, but knowledge compilation has a huge positive impact on the performance. Sampling allows handling larger instances, but sometimes requires an increasing amount of memory. Under consideration in Theory and Practice of Logic Programming (TPLP).
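To make the setting concrete, below is a minimal sketch of sampling-based approximate inference for a hypothetical hybrid program under the credal semantics. The program, its probabilities, the gaussian variable, and the 0.3 threshold are invented for illustration, and the answer sets of each sampled world are hard-coded rather than computed by an ASP solver, so this is not the paper's implementation.

```python
# A minimal sketch (not the authors' implementation) of sampling-based approximate
# inference for a toy hybrid probabilistic answer set program under the credal
# semantics. The program below, its 0.6 probability, the gaussian(0, 1) variable,
# and the 0.3 threshold are hypothetical choices made only for illustration:
#
#   0.6::a.                 % discrete (Bernoulli) probabilistic fact
#   x ~ gaussian(0, 1).     % continuous random variable
#   q :- a, x > 0.3.
#   q ; r :- not a.         % when 'a' is false there are two answer sets: {q} and {r}
#
# For each sampled world the answer sets are enumerated by hand (a real system would
# call an ASP solver); the query contributes to the lower probability if it holds in
# every answer set of the world and to the upper probability if it holds in at least one.

import random


def answer_sets(a, x):
    """Hand-coded answer sets of the toy program for a fixed world (a, x)."""
    if a:
        return [{"a", "q"}] if x > 0.3 else [{"a"}]
    # 'a' is false: the disjunctive rule yields two answer sets.
    return [{"q"}, {"r"}]


def estimate_bounds(query, n_samples=100_000, seed=0):
    """Monte Carlo estimates of the lower and upper probability of `query`."""
    rng = random.Random(seed)
    lower_hits = upper_hits = 0
    for _ in range(n_samples):
        a = rng.random() < 0.6       # sample the Bernoulli fact
        x = rng.gauss(0.0, 1.0)      # sample the continuous variable
        holds = [query in s for s in answer_sets(a, x)]
        lower_hits += all(holds)     # cautious: true in every answer set
        upper_hits += any(holds)     # brave: true in at least one answer set
    return lower_hits / n_samples, upper_hits / n_samples


if __name__ == "__main__":
    lo, up = estimate_bounds("q")
    print(f"P_lower(q) ~ {lo:.3f}, P_upper(q) ~ {up:.3f}")
```

For this toy program the exact bounds are P_lower(q) = 0.6 · P(x > 0.3) ≈ 0.229 and P_upper(q) = P_lower(q) + 0.4 ≈ 0.629; the Monte Carlo estimates approach these values as the number of samples grows.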
Related papers
- Symbolic Parameter Learning in Probabilistic Answer Set Programming [0.16385815610837165]
We propose two algorithms for parameter learning in Probabilistic Answer Set Programming.
The first solves the task using an off-the-shelf constrained optimization solver.
The second is based on an implementation of the Expectation Maximization algorithm.
arXiv Detail & Related papers (2024-08-16T13:32:47Z) - Batch Bayesian Optimization for Replicable Experimental Design [56.64902148159355]
Many real-world design problems evaluate multiple experimental conditions in parallel and replicate each condition multiple times due to large and heteroscedastic observation noise.
We propose the Batch Thompson Sampling for Replicable Experimental Design framework, which encompasses three algorithms.
We show the effectiveness of our algorithms in two practical real-world applications: precision agriculture and AutoML.
arXiv Detail & Related papers (2023-11-02T12:46:03Z) - smProbLog: Stable Model Semantics in ProbLog for Probabilistic Argumentation [19.46250467634934]
The first contribution is showing that the programs representing probabilistic argumentation frameworks do not satisfy a common assumption in probabilistic logic programming (PLP) semantics.
The second contribution is a novel PLP semantics for programs where a choice of probabilistic facts does not uniquely determine the truth assignment of the logical atoms.
The third contribution is the implementation of a PLP system supporting this semantics: smProbLog.
arXiv Detail & Related papers (2023-04-03T10:59:25Z) - $\omega$PAP Spaces: Reasoning Denotationally About Higher-Order, Recursive Probabilistic and Differentiable Programs [64.25762042361839]
$\omega$PAP spaces provide a setting for reasoning denotationally about expressive differentiable and probabilistic programming languages.
Our semantics is general enough to assign meanings to most practical probabilistic and differentiable programs.
We establish the almost-everywhere differentiability of probabilistic programs' trace density functions.
arXiv Detail & Related papers (2023-02-21T12:50:05Z) - SMProbLog: Stable Model Semantics in ProbLog and its Applications in Argumentation [17.71804768917815]
SMProbLog is a generalization of the probabilistic logic programming language ProbLog.
We show how SMProbLog can be used to reason about probabilistic argumentation problems.
arXiv Detail & Related papers (2021-10-05T12:29:22Z) - Multivariate Probabilistic Regression with Natural Gradient Boosting [63.58097881421937]
We propose a Natural Gradient Boosting (NGBoost) approach based on nonparametrically modeling the conditional parameters of the multivariate predictive distribution.
Our method is robust, works out-of-the-box without extensive tuning, is modular with respect to the assumed target distribution, and performs competitively in comparison to existing approaches.
arXiv Detail & Related papers (2021-06-07T17:44:49Z) - Probabilistic Gradient Boosting Machines for Large-Scale Probabilistic Regression [51.770998056563094]
Probabilistic Gradient Boosting Machines (PGBM) is a method to create probabilistic predictions with a single ensemble of decision trees.
We empirically demonstrate the advantages of PGBM compared to existing state-of-the-art methods.
arXiv Detail & Related papers (2021-06-03T08:32:13Z) - pRSL: Interpretable Multi-label Stacking by Learning Probabilistic Rules [0.0]
We present the probabilistic rule stacking (pRSL) which uses probabilistic propositional logic rules and belief propagation to combine the predictions of several underlying classifiers.
We derive algorithms for exact and approximate inference and learning, and show that pRSL reaches state-of-the-art performance on various benchmark datasets.
arXiv Detail & Related papers (2021-05-28T14:06:21Z) - Probabilistic Generating Circuits [50.98473654244851]
We propose probabilistic generating circuits (PGCs) as an efficient representation of probability generating functions.
PGCs are not just a theoretical framework that unifies vastly different existing models, but also show huge potential in modeling realistic data.
We exhibit a simple class of PGCs that are not trivially subsumed by simple combinations of PCs and DPPs, and obtain competitive performance on a suite of density estimation benchmarks.
arXiv Detail & Related papers (2021-02-19T07:06:53Z) - Tractable Inference in Credal Sentential Decision Diagrams [116.6516175350871]
Probabilistic sentential decision diagrams are logic circuits where the inputs of disjunctive gates are annotated by probability values.
We develop credal sentential decision diagrams, a generalisation of their probabilistic counterpart in which the local probabilities are replaced with credal sets of mass functions.
For a first empirical validation, we consider a simple application based on noisy seven-segment display images.
arXiv Detail & Related papers (2020-08-19T16:04:34Z) - MAP Inference for Probabilistic Logic Programming [0.30586855806896046]
We consider the Maximum-A-Posteriori (MAP) inference task and the Most Probable Explanation (MPE) task (a brute-force sketch of both tasks appears after this list).
We present a novel algorithm, included in the PITA reasoner, which tackles these tasks by representing each problem as a Binary Decision Diagram.
We compare our algorithm with the version of ProbLog that admits annotated disjunctions and can perform MAP and MPE inference.
arXiv Detail & Related papers (2020-08-04T08:10:51Z)
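As a companion to the MAP Inference for Probabilistic Logic Programming entry above, here is a brute-force sketch of the MPE and MAP tasks on a hypothetical two-fact program. It enumerates all worlds directly instead of using the Binary Decision Diagram representation inside the PITA reasoner described in that paper, and the program, probabilities, and evidence are invented for illustration.

```python
# A brute-force sketch of MPE and MAP inference on a hypothetical toy program
# (not the paper's BDD-based algorithm):
#
#   0.4::burglary.  0.2::earthquake.
#   alarm :- burglary.
#   alarm :- earthquake.
#
# Evidence: alarm is true. MPE asks for the most probable total assignment to the
# probabilistic facts consistent with the evidence; MAP asks for the most probable
# assignment to a chosen subset of facts, summing out the remaining ones.

from itertools import product

FACTS = {"burglary": 0.4, "earthquake": 0.2}


def world_prob(world):
    """Probability of a total truth assignment to the probabilistic facts."""
    p = 1.0
    for fact, prob in FACTS.items():
        p *= prob if world[fact] else 1.0 - prob
    return p


def alarm(world):
    """Truth of 'alarm' as determined by the deterministic rules."""
    return world["burglary"] or world["earthquake"]


def all_worlds():
    """Every total truth assignment to the probabilistic facts."""
    return [dict(zip(FACTS, vals)) for vals in product([True, False], repeat=len(FACTS))]


def mpe(evidence_alarm=True):
    """Most probable total world consistent with the evidence."""
    consistent = [w for w in all_worlds() if alarm(w) == evidence_alarm]
    return max(consistent, key=world_prob)


def map_query(query_facts=("burglary",), evidence_alarm=True):
    """MAP over a subset of facts: sum out the others, then take the argmax."""
    scores = {}
    for w in all_worlds():
        if alarm(w) != evidence_alarm:
            continue
        key = tuple(w[f] for f in query_facts)
        scores[key] = scores.get(key, 0.0) + world_prob(w)
    return max(scores, key=scores.get)


if __name__ == "__main__":
    print("MPE world:", mpe())                              # burglary=True, earthquake=False
    print("MAP assignment for ('burglary',):", map_query())  # (True,)
```

For this toy program the enumeration returns the world {burglary, not earthquake} with probability 0.32 as the MPE, and burglary = true (marginal 0.40 versus 0.12) as the MAP assignment.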