Hybrid Probabilistic Logic Programming: Inference and Learning
- URL: http://arxiv.org/abs/2302.00496v2
- Date: Thu, 2 Feb 2023 12:10:51 GMT
- Title: Hybrid Probabilistic Logic Programming: Inference and Learning
- Authors: Nitesh Kumar
- Abstract summary: This thesis focuses on advancing probabilistic logic programming (PLP), which combines probability theory for uncertainty and logic programming for relations.
The first contribution is the introduction of context-specific likelihood weighting (CS-LW), a new sampling algorithm that exploits context-specific independencies for computational gains.
Next, a new hybrid PLP, DC#, is introduced, which integrates the syntax of Distributional Clauses with Bayesian logic programs and represents three types of independencies.
The scalable inference algorithm FO-CS-LW is introduced for DC#.
- Score: 1.14219428942199
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: This thesis focuses on advancing probabilistic logic programming (PLP), which
combines probability theory for uncertainty and logic programming for
relations. The thesis aims to extend PLP to support both discrete and
continuous random variables, which is necessary for applications with numeric
data. The first contribution is the introduction of context-specific likelihood
weighting (CS-LW), a new sampling algorithm that exploits context-specific
independencies for computational gains. Next, a new hybrid PLP, DC#, is
introduced, which integrates the syntax of Distributional Clauses with Bayesian
logic programs and represents three types of independencies: i) conditional
independencies (CIs) modeled in Bayesian networks; ii) context-specific
independencies (CSIs) represented by logical rules, and iii) independencies
amongst attributes of related objects in relational models expressed by
combining rules. The scalable inference algorithm FO-CS-LW is introduced for
DC#. Finally, the thesis addresses the lack of approaches for learning hybrid
PLP from relational data with missing values and (probabilistic) background
knowledge with the introduction of DiceML, which learns the structure and
parameters of hybrid PLP and tackles the relational autocompletion problem. The
conclusion discusses future directions and open challenges for hybrid PLP.
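To make the sampling idea concrete, below is a minimal Python sketch of plain likelihood weighting, the baseline scheme that CS-LW refines by exploiting context-specific independencies. The toy model (one Boolean variable with a Gaussian child), the variable names, and the numbers are illustrative assumptions, not examples from the thesis.

```python
# Minimal sketch: plain likelihood weighting on a toy hybrid model.
# Model (made up for illustration):
#   machine_ok ~ Bernoulli(0.9)
#   temperature ~ Normal(20, 2) if machine_ok else Normal(30, 5)
# Evidence: temperature = 26.0.  Query: P(machine_ok | evidence).
import math
import random


def normal_pdf(x, mu, sigma):
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))


def likelihood_weighting(evidence_temp, num_samples=100_000):
    weighted_true = 0.0
    total_weight = 0.0
    for _ in range(num_samples):
        # Sample the unobserved discrete variable from its prior.
        machine_ok = random.random() < 0.9
        # The continuous child is observed: instead of sampling it, weight
        # the sample by the likelihood of the observed value given the parent.
        mu, sigma = (20.0, 2.0) if machine_ok else (30.0, 5.0)
        weight = normal_pdf(evidence_temp, mu, sigma)
        total_weight += weight
        if machine_ok:
            weighted_true += weight
    return weighted_true / total_weight


print(likelihood_weighting(26.0))  # estimated posterior P(machine_ok | temperature = 26)
```

Roughly speaking, CS-LW improves on this baseline by exploiting context-specific independencies so that, in each sample, only the variables that are relevant in the current context need to be sampled and weighted.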
Related papers
- Symbolic Parameter Learning in Probabilistic Answer Set Programming [0.16385815610837165]
We propose two algorithms to solve the parameter learning task in the formalism of Probabilistic Answer Set Programming.
The first solves the task using an off-the-shelf constrained optimization solver.
The second is based on an implementation of the Expectation Maximization algorithm.
arXiv Detail & Related papers (2024-08-16T13:32:47Z) - Semirings for Probabilistic and Neuro-Symbolic Logic Programming [15.747744148181829]
We show that many extensions of probabilistic logic programming can be cast within a common algebraic logic programming framework.
This holds not only for the PLP variations themselves but also for the underlying execution mechanism, which is based on (algebraic) model counting; a toy model-counting sketch is given after this list.
arXiv Detail & Related papers (2024-02-21T13:06:52Z) - Sample Complexity Characterization for Linear Contextual MDPs [67.79455646673762]
Contextual Markov decision processes (CMDPs) describe a class of reinforcement learning problems in which the transition kernels and reward functions can change over time, with different MDPs indexed by a context variable.
CMDPs serve as an important framework to model many real-world applications with time-varying environments.
We study CMDPs under two linear function approximation models: Model I with context-varying representations and common linear weights for all contexts; and Model II with common representations for all contexts and context-varying linear weights.
arXiv Detail & Related papers (2024-02-05T03:25:04Z) - Modeling Hierarchical Reasoning Chains by Linking Discourse Units and
Key Phrases for Reading Comprehension [80.99865844249106]
We propose a holistic graph network (HGN) which deals with context at both discourse level and word level, as the basis for logical reasoning.
Specifically, node-level and type-level relations, which can be interpreted as bridges in the reasoning process, are modeled by a hierarchical interaction mechanism.
arXiv Detail & Related papers (2023-06-21T07:34:27Z) - Declarative Probabilistic Logic Programming in Discrete-Continuous Domains [16.153683223016973]
We contribute the measure semantics together with the hybrid PLP language DC-ProbLog and its inference engine, infinitesimal algebraic likelihood weighting (IALW).
We generalize the state of the art of PLP towards hybrid PLP in three different aspects: semantics, language and inference.
IALW is the first inference algorithm for hybrid probabilistic programming based on knowledge compilation.
arXiv Detail & Related papers (2023-02-21T13:50:38Z) - Towards Realistic Low-resource Relation Extraction: A Benchmark with
Empirical Baseline Study [51.33182775762785]
This paper presents an empirical study to build relation extraction systems in low-resource settings.
We investigate three schemes to evaluate the performance in low-resource settings: (i) different types of prompt-based methods with few-shot labeled data; (ii) diverse balancing methods to address the long-tailed distribution issue; and (iii) data augmentation technologies and self-training to generate more labeled in-domain data.
arXiv Detail & Related papers (2022-10-19T15:46:37Z) - First-Order Context-Specific Likelihood Weighting in Hybrid
Probabilistic Logic Programs [24.503581751619787]
Three types of independencies are important to represent and exploit for scalable inference in hybrid models.
This paper introduces a hybrid probabilistic logic programming language, DC#, which integrates the syntax of Distributional Clauses with the semantic principles of Bayesian logic programs.
We also introduce the scalable inference algorithm FO-CS-LW for DC#.
arXiv Detail & Related papers (2022-01-26T20:06:02Z) - Structural Learning of Probabilistic Sentential Decision Diagrams under
Partial Closed-World Assumption [127.439030701253]
Probabilistic sentential decision diagrams are a class of structured-decomposable circuits.
We propose a new scheme based on a partial closed-world assumption: data implicitly provide the logical base of the circuit.
Preliminary experiments show that the proposed approach might properly fit training data, and generalize well to test data, provided that these remain consistent with the underlying logical base.
arXiv Detail & Related papers (2021-07-26T12:01:56Z) - Online Learning Probabilistic Event Calculus Theories in Answer Set
Programming [70.06301658267125]
Complex Event Recognition (CER) systems detect event occurrences in streaming, time-stamped datasets using predefined event patterns.
We present a system based on Answer Set Programming (ASP), capable of probabilistic reasoning with complex event patterns in the form of rules weighted in the Event Calculus.
Our results demonstrate the superiority of this novel approach in terms of both efficiency and predictive accuracy.
arXiv Detail & Related papers (2021-03-31T23:16:29Z) - Conditional independence by typing [30.194205448457385]
A central goal of probabilistic programming languages (PPLs) is to separate modelling from inference.
Conditional independence (CI) relationships among parameters are a crucial aspect of probabilistic models.
We show that for a well-typed program in our system, the distribution it implements is guaranteed to have certain CI-relationships.
arXiv Detail & Related papers (2020-10-22T17:27:22Z) - Implicit Distributional Reinforcement Learning [61.166030238490634]
We propose an implicit distributional actor-critic (IDAC) built on two deep generator networks (DGNs) and a semi-implicit actor (SIA) powered by a flexible policy distribution.
We observe IDAC outperforms state-of-the-art algorithms on representative OpenAI Gym environments.
arXiv Detail & Related papers (2020-07-13T02:52:18Z)
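As a companion to the semirings entry above, the following toy Python sketch shows weighted model counting with the probability semiring: summing, over the models of a propositional formula, the product of the literal weights. The formula and the weights are made up for illustration and are not taken from that paper.

```python
# Toy weighted model counting with the probability semiring (made-up example).
from itertools import product

# P(var = True) for each propositional variable.
weights = {"burglary": 0.1, "earthquake": 0.2, "hears_alarm": 0.7}


def calls(v):
    # alarm :- burglary.   alarm :- earthquake.   calls :- alarm, hears_alarm.
    return (v["burglary"] or v["earthquake"]) and v["hears_alarm"]


def weighted_model_count(formula, weights):
    names = list(weights)
    total = 0.0                                   # semiring zero
    for values in product([True, False], repeat=len(names)):
        v = dict(zip(names, values))
        if formula(v):                            # only models of the formula count
            w = 1.0                               # semiring one
            for name, val in v.items():
                w *= weights[name] if val else 1.0 - weights[name]
            total += w                            # semiring addition
    return total


print(weighted_model_count(calls, weights))  # probability of the query "calls"
```

Swapping the addition and multiplication used here for the operations of a different semiring yields other inference tasks on the same formula, which is the generality the algebraic framing provides.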
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed content (including all information) and is not responsible for any consequences.