Structural Learning of Probabilistic Sentential Decision Diagrams under
Partial Closed-World Assumption
- URL: http://arxiv.org/abs/2107.12130v1
- Date: Mon, 26 Jul 2021 12:01:56 GMT
- Title: Structural Learning of Probabilistic Sentential Decision Diagrams under
Partial Closed-World Assumption
- Authors: Alessandro Antonucci and Alessandro Facchini and Lilith Mattei
- Abstract summary: Probabilistic sentential decision diagrams are a class of structured-decomposable circuits.
We propose a new scheme based on a partial closed-world assumption: data implicitly provide the logical base of the circuit.
Preliminary experiments show that the proposed approach might properly fit training data, and generalize well to test data, provided that these remain consistent with the underlying logical base.
- Score: 127.439030701253
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Probabilistic sentential decision diagrams are a class of
structured-decomposable probabilistic circuits especially designed to embed
logical constraints. To adapt the classical LearnSPN scheme to learn the
structure of these models, we propose a new scheme based on a partial
closed-world assumption: data implicitly provide the logical base of the
circuit. Sum nodes are thus learned by recursively clustering batches in the
initial data base, while the partitioning of the variables obeys a given input
vtree. Preliminary experiments show that the proposed approach might properly
fit training data and generalize well to test data, provided that these remain
consistent with the underlying logical base, which is a relaxation of the
training data base.
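The scheme described in the abstract, sum nodes obtained by recursively clustering batches of the data, product nodes partitioning the variables according to an input vtree, can be illustrated with a minimal sketch. Everything below is an illustrative assumption, not the authors' implementation: the naive split on a single pivot variable stands in for a proper clustering step, and `learn_node`, `leftmost_var`, and the dict-based node representation are hypothetical names.

```python
from collections import Counter

def leftmost_var(vtree):
    """Leftmost variable index under a vtree node (nested tuple or int leaf)."""
    return vtree if isinstance(vtree, int) else leftmost_var(vtree[0])

def learn_node(rows, vtree):
    """Recursively learn a circuit node from a batch of binary rows.

    Under the partial closed-world assumption, only value combinations
    observed in `rows` are kept: empty batches are simply dropped, so the
    data implicitly provide the logical base of the circuit.
    """
    if isinstance(vtree, int):
        # Vtree leaf: fit a Bernoulli parameter for a single variable.
        counts = Counter(r[vtree] for r in rows)
        return {"type": "leaf", "var": vtree, "theta": counts[1] / len(rows)}
    left, right = vtree
    # Sum node: cluster the batch. A naive split on the leftmost variable
    # of the left vtree stands in for a real clustering algorithm.
    pivot = leftmost_var(left)
    children, weights = [], []
    for value in (0, 1):
        batch = [r for r in rows if r[pivot] == value]
        if not batch:
            continue  # never observed: excluded by the logical base
        # Product node: partition the variables as dictated by the vtree.
        children.append({"type": "product",
                         "left": learn_node(batch, left),
                         "right": learn_node(batch, right)})
        weights.append(len(batch) / len(rows))
    return {"type": "sum", "weights": weights, "children": children}

if __name__ == "__main__":
    rows = [(0, 0, 1), (0, 1, 1), (1, 0, 0), (1, 1, 0)]
    vtree = (0, (1, 2))  # variable 0 on the left, {1, 2} on the right
    circuit = learn_node(rows, vtree)
    print(circuit["type"], circuit["weights"])
```

On this toy data the root becomes a sum node over two batches (rows with variable 0 equal to 0 or 1), each child a product node respecting the vtree partition.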
Related papers
- Learning Prescriptive ReLU Networks [3.092691764363848]
We study the problem of learning optimal policy from a set of discrete treatment options using observational data.
We propose a piecewise linear neural network model that can balance strong prescriptive performance and interpretability.
arXiv Detail & Related papers (2023-06-01T13:17:29Z)
- Compositional Probabilistic and Causal Inference using Tractable Circuit Models [20.07977560803858]
We introduce md-vtrees, a novel structural formulation of (marginal) determinism in structured decomposable PCs.
We derive the first polytime algorithms for causal inference queries such as backdoor adjustment on PCs.
arXiv Detail & Related papers (2023-04-17T13:48:16Z)
- A Recursively Recurrent Neural Network (R2N2) Architecture for Learning Iterative Algorithms [64.3064050603721]
We generalize Runge-Kutta neural network to a recurrent neural network (R2N2) superstructure for the design of customized iterative algorithms.
We demonstrate that regular training of the weight parameters inside the proposed superstructure on input/output data of various computational problem classes yields similar iterations to Krylov solvers for linear equation systems, Newton-Krylov solvers for nonlinear equation systems, and Runge-Kutta solvers for ordinary differential equations.
arXiv Detail & Related papers (2022-11-22T16:30:33Z)
- Semantic Probabilistic Layers for Neuro-Symbolic Learning [83.25785999205932]
We design a predictive layer for structured-output prediction (SOP).
It can be plugged into any neural network, guaranteeing that its predictions are consistent with a set of predefined symbolic constraints.
Our Semantic Probabilistic Layer (SPL) can model intricate correlations, and hard constraints, over a structured output space.
arXiv Detail & Related papers (2022-06-01T12:02:38Z)
- FF-NSL: Feed-Forward Neural-Symbolic Learner [70.978007919101]
This paper introduces a neural-symbolic learning framework called Feed-Forward Neural-Symbolic Learner (FF-NSL).
FF-NSL integrates state-of-the-art ILP systems based on Answer Set semantics with neural networks, in order to learn interpretable hypotheses from labelled unstructured data.
arXiv Detail & Related papers (2021-06-24T15:38:34Z)
- Parsimonious Inference [0.0]
Parsimonious inference is an information-theoretic formulation of inference over arbitrary architectures.
Our approaches combine efficient encodings with prudent sampling strategies to construct predictive ensembles without cross-validation.
arXiv Detail & Related papers (2021-03-03T04:13:14Z)
- Model Fusion with Kullback--Leibler Divergence [58.20269014662046]
We propose a method to fuse posterior distributions learned from heterogeneous datasets.
Our algorithm relies on a mean field assumption for both the fused model and the individual dataset posteriors.
arXiv Detail & Related papers (2020-07-13T03:27:45Z)
- Symbolic Querying of Vector Spaces: Probabilistic Databases Meets Relational Embeddings [35.877591735510734]
We formalize a probabilistic database model with respect to which all queries are done.
The lack of a well-defined joint probability distribution causes simple query problems to become provably hard.
We introduce TO, a relational embedding model designed to be a tractable probabilistic database.
arXiv Detail & Related papers (2020-02-24T01:17:25Z)
- Learning Likelihoods with Conditional Normalizing Flows [54.60456010771409]
Conditional normalizing flows (CNFs) are efficient in sampling and inference.
We present a study of CNFs where the base-density-to-output-space mapping is conditioned on an input x, to model conditional densities p(y|x).
arXiv Detail & Related papers (2019-11-29T19:17:58Z)
- Bayesian stochastic blockmodeling [0.0]
This chapter provides a self-contained introduction to the use of Bayesian inference to extract large-scale modular structures from network data.
We focus on nonparametric formulations that allow their inference in a manner that prevents overfitting, and enables model selection.
We show how inferring the blockmodel can be used to predict missing and spurious links, and shed light on the fundamental limitations of the detectability of modular structures in networks.
arXiv Detail & Related papers (2017-05-29T14:53:42Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences arising from its use.