Domain-Liftability of Relational Marginal Polytopes
- URL: http://arxiv.org/abs/2001.05198v1
- Date: Wed, 15 Jan 2020 09:45:48 GMT
- Title: Domain-Liftability of Relational Marginal Polytopes
- Authors: Ondrej Kuzelka, Yuyi Wang
- Abstract summary: We study computational aspects of relational marginal polytopes.
In particular, we show that weight learning of MLNs is domain-liftable whenever the computation of the partition function of the respective MLNs is domain-liftable.
- Score: 18.320433800967
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: We study computational aspects of relational marginal polytopes which are
statistical relational learning counterparts of marginal polytopes, well-known
from probabilistic graphical models. Here, given some first-order logic
formula, we can define its relational marginal statistic to be the fraction of
groundings that make this formula true in a given possible world. For a list of
first-order logic formulas, the relational marginal polytope is the set of all
points that correspond to the expected values of the relational marginal
statistics that are realizable. In this paper, we study the following two
problems: (i) Do domain-liftability results for the partition functions of
Markov logic networks (MLNs) carry over to the problem of relational marginal
polytope construction? (ii) Is the relational marginal polytope containment
problem hard under some plausible complexity-theoretic assumptions? Our
positive results have consequences for lifted weight learning of MLNs. In
particular, we show that weight learning of MLNs is domain-liftable whenever
the computation of the partition function of the respective MLNs is
domain-liftable (this result has not been rigorously proven before).
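To make the definitions above concrete, here is a minimal Python sketch (not the authors' code; the domain, predicates, and formulas are illustrative assumptions). It computes the relational marginal statistic of two formulas, i.e. the fraction of their groundings that are true in a given possible world, and for a toy domain brute-forces the set of realizable statistic vectors whose convex hull is the relational marginal polytope.

```python
from itertools import product

DOMAIN = ("a", "b")  # toy domain; real domains make brute force infeasible

# Ground atoms of two illustrative predicates: smokes(x) and friends(x, y).
SMOKES = [("smokes", (x,)) for x in DOMAIN]
FRIENDS = [("friends", (x, y)) for x in DOMAIN for y in DOMAIN]
ATOMS = SMOKES + FRIENDS  # 2 + 4 = 6 ground atoms

def stat_smokes(world):
    """Relational marginal statistic of smokes(x): fraction of true groundings."""
    return sum(("smokes", (x,)) in world for x in DOMAIN) / len(DOMAIN)

def stat_friends_implies_smokes(world):
    """Statistic of friends(x, y) -> smokes(x): fraction of satisfied groundings."""
    pairs = [(x, y) for x in DOMAIN for y in DOMAIN]
    satisfied = sum(
        (("friends", (x, y)) not in world) or (("smokes", (x,)) in world)
        for x, y in pairs
    )
    return satisfied / len(pairs)

# Enumerate all 2^|ATOMS| possible worlds (purely for illustration);
# each possible world contributes one realizable statistic vector.
points = set()
for bits in product((0, 1), repeat=len(ATOMS)):
    world = {atom for atom, b in zip(ATOMS, bits) if b}
    points.add((stat_smokes(world), stat_friends_implies_smokes(world)))

# The relational marginal polytope of (smokes(x), friends(x,y) -> smokes(x))
# is the convex hull of these realizable points.
print(sorted(points))
```

For realistic domain sizes this enumeration is infeasible; the point of domain-liftability results is that the polytope, and the partition-function computations needed for weight learning, can be handled in time polynomial in the domain size.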
Related papers
- The polygon relation and subadditivity of entropic measures for discrete and continuous multipartite entanglement [0.6759148939470331]
We study the relationship between the polygon relation and the subadditivity of entropy.
Our work provides a better understanding of the rich structure of multipartite states.
arXiv Detail & Related papers (2024-01-04T05:09:37Z) - Kernel-based off-policy estimation without overlap: Instance optimality beyond semiparametric efficiency [53.90687548731265]
We study optimal procedures for estimating a linear functional based on observational data.
For any convex and symmetric function class $\mathcal{F}$, we derive a non-asymptotic local minimax bound on the mean-squared error.
arXiv Detail & Related papers (2023-01-16T02:57:37Z) - Learning Graphical Factor Models with Riemannian Optimization [70.13748170371889]
This paper proposes a flexible algorithmic framework for graph learning under low-rank structural constraints.
The problem is expressed as penalized maximum likelihood estimation of an elliptical distribution.
We leverage geometries of positive definite matrices and positive semi-definite matrices of fixed rank that are well suited to elliptical models.
arXiv Detail & Related papers (2022-10-21T13:19:45Z) - Data-Driven Influence Functions for Optimization-Based Causal Inference [105.5385525290466]
We study a constructive algorithm that approximates Gateaux derivatives for statistical functionals by finite differencing.
We study the case where probability distributions are not known a priori but need to be estimated from data.
arXiv Detail & Related papers (2022-08-29T16:16:22Z) - On Projectivity in Markov Logic Networks [7.766921168069532]
Markov Logic Networks (MLNs) define a probability distribution on structures over varying domain sizes.
Projective models potentially allow efficient and consistent parameter learning from sub-sampled domains.
arXiv Detail & Related papers (2022-04-08T11:37:53Z) - Distribution Regression with Sliced Wasserstein Kernels [45.916342378789174]
We propose the first OT-based estimator for distribution regression.
We study the theoretical properties of a kernel ridge regression estimator based on such a representation.
arXiv Detail & Related papers (2022-02-08T15:21:56Z) - Partial Counterfactual Identification from Observational and Experimental Data [83.798237968683]
We develop effective Monte Carlo algorithms to approximate the optimal bounds from an arbitrary combination of observational and experimental data.
Our algorithms are validated extensively on synthetic and real-world datasets.
arXiv Detail & Related papers (2021-10-12T02:21:30Z) - Recovery of Joint Probability Distribution from one-way marginals: Low rank Tensors and Random Projections [2.9929093132587763]
Joint probability mass function (PMF) estimation is a fundamental machine learning problem.
In this work, we link random projections of data to the problem of PMF estimation using ideas from tomography.
We provide a novel algorithm for recovering factors of the tensor from one-way marginals, test it across a variety of synthetic and real-world datasets, and also perform MAP inference on the estimated model for classification.
arXiv Detail & Related papers (2021-03-22T14:00:57Z) - Bayesian Quadrature on Riemannian Data Manifolds [79.71142807798284]
Riemannian manifolds provide a principled way to model nonlinear geometric structure inherent in data; however, the associated operations are typically computationally demanding.
In particular, we focus on Bayesian quadrature (BQ) to numerically compute integrals over normal laws.
We show that by leveraging both prior knowledge and an active exploration scheme, BQ significantly reduces the number of required evaluations.
arXiv Detail & Related papers (2021-02-12T17:38:04Z) - Finite-Function-Encoding Quantum States [52.77024349608834]
We introduce finite-function-encoding (FFE) states which encode arbitrary $d$-valued logic functions.
We investigate some of their structural properties.
arXiv Detail & Related papers (2020-12-01T13:53:23Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the content (including all information) and is not responsible for any consequences of its use.