A Complete Characterization of Projectivity for Statistical Relational
Models
- URL: http://arxiv.org/abs/2004.10984v2
- Date: Mon, 22 Jun 2020 11:44:16 GMT
- Title: A Complete Characterization of Projectivity for Statistical Relational
Models
- Authors: Manfred Jaeger and Oliver Schulte
- Abstract summary: We introduce a class of directed latent graphical variable models that precisely correspond to the class of projective relational models.
We also obtain a characterization for when a given distribution over size-$k$ structures is the statistical frequency distribution of size-$k$ sub-structures in much larger size-$n$ structures.
- Score: 20.833623839057097
- License: http://creativecommons.org/licenses/by-nc-sa/4.0/
- Abstract: A generative probabilistic model for relational data consists of a family of
probability distributions for relational structures over domains of different
sizes. In most existing statistical relational learning (SRL) frameworks, these
models are not projective: the marginal of the distribution for size-$n$
structures on induced sub-structures of size $k<n$ need not equal the given
distribution for size-$k$ structures. Projectivity is very beneficial in
that it directly enables lifted inference and statistically consistent learning
from sub-sampled relational structures. In earlier work some simple fragments
of SRL languages have been identified that represent projective models.
However, no complete characterization of, and representation framework for
projective models has been given. In this paper we fill this gap: exploiting
representation theorems for infinite exchangeable arrays we introduce a class
of directed graphical latent variable models that precisely correspond to the
class of projective relational models. As a by-product we also obtain a
characterization for when a given distribution over size-$k$ structures is the
statistical frequency distribution of size-$k$ sub-structures in much larger
size-$n$ structures. These results shed new light onto the old open problem of
how to apply Halpern et al.'s "random worlds approach" for probabilistic
inference to general relational signatures.
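To make the projectivity condition concrete, here is a minimal sketch (not from the paper) using the Erdős–Rényi model $G(n,p)$, a classic projective family: the marginal of the size-$n$ distribution on any induced $k$-vertex substructure is exactly $G(k,p)$. The function names and the Monte Carlo check are illustrative assumptions, not part of the paper's framework.

```python
import itertools
import random

def sample_er(n, p, rng):
    """Sample an Erdos-Renyi graph G(n, p) as a set of edges over vertices 0..n-1."""
    return {(i, j) for i, j in itertools.combinations(range(n), 2)
            if rng.random() < p}

def induced(edges, vertices):
    """Induced sub-structure: keep only the edges inside the given vertex set."""
    vs = set(vertices)
    return {(i, j) for (i, j) in edges if i in vs and j in vs}

rng = random.Random(0)
p, trials = 0.3, 20000

# Marginal of the size-3 distribution on the induced sub-structure {0, 1}:
# how often does the edge (0, 1) appear in a sampled G(3, p)?
marginal_edge_freq = sum(
    (0, 1) in induced(sample_er(3, p, rng), [0, 1]) for _ in range(trials)
) / trials

# For a projective family this matches the direct size-2 distribution,
# where the single edge is present with probability p.
print(marginal_edge_freq)  # close to p = 0.3
```

Non-projective SRL models (e.g. typical Markov logic networks) fail exactly this check: the empirical marginal over small sub-structures drifts as the domain size $n$ grows.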
Related papers
- Model-free Estimation of Latent Structure via Multiscale Nonparametric Maximum Likelihood [13.175343048302697]
We propose a model-free approach for estimating such latent structures whenever they are present, without assuming they exist a priori.
As an application, we design a clustering algorithm based on the proposed procedure and demonstrate its effectiveness in capturing a wide range of latent structures.
arXiv Detail & Related papers (2024-10-29T17:11:33Z)
- Adapting to Unknown Low-Dimensional Structures in Score-Based Diffusion Models [6.76974373198208]
We find that the dependency of the error incurred within each denoising step on the ambient dimension $d$ is in general unavoidable.
This represents the first theoretical demonstration that the DDPM sampler can adapt to unknown low-dimensional structures in the target distribution.
arXiv Detail & Related papers (2024-05-23T17:59:10Z)
- Computational-Statistical Gaps in Gaussian Single-Index Models [77.1473134227844]
Single-Index Models are high-dimensional regression problems with planted structure.
We show that computationally efficient algorithms, both within the Statistical Query (SQ) and the Low-Degree Polynomial (LDP) framework, necessarily require $\Omega(d^{k^\star/2})$ samples.
arXiv Detail & Related papers (2024-03-08T18:50:19Z)
- Cyclic Directed Probabilistic Graphical Model: A Proposal Based on Structured Outcomes [0.0]
We describe a probabilistic graphical model - probabilistic relation network - that allows the direct capture of directional cyclic dependencies.
This model does not violate the probability axioms, and it supports learning from observed data.
Notably, it supports probabilistic inference, making it a prospective tool in data analysis and in expert and decision-making applications.
arXiv Detail & Related papers (2023-10-25T10:19:03Z)
- Projectivity revisited [0.0]
We extend the notion of projectivity from families of distributions indexed by domain size to functors taking extensional data from a database.
This makes projectivity available for the large range of applications taking structured input.
arXiv Detail & Related papers (2022-07-01T18:54:36Z)
- Low-Rank Constraints for Fast Inference in Structured Models [110.38427965904266]
This work demonstrates a simple approach to reduce the computational and memory complexity of a large class of structured models.
Experiments with neural parameterized structured models for language modeling, polyphonic music modeling, unsupervised grammar induction, and video modeling show that our approach matches the accuracy of standard models at large state spaces.
arXiv Detail & Related papers (2022-01-08T00:47:50Z)
- Sampling from Arbitrary Functions via PSD Models [55.41644538483948]
We take a two-step approach by first modeling the probability distribution and then sampling from that model.
We show that these models can approximate a large class of densities concisely using few evaluations, and present a simple algorithm to effectively sample from these models.
arXiv Detail & Related papers (2021-10-20T12:25:22Z)
- Probabilistic Circuits for Variational Inference in Discrete Graphical Models [101.28528515775842]
Inference in discrete graphical models with variational methods is difficult.
Many sampling-based methods have been proposed for estimating the Evidence Lower Bound (ELBO).
We propose a new approach that leverages the tractability of probabilistic circuit models, such as Sum Product Networks (SPNs).
We show that selective SPNs are suitable as an expressive variational distribution, and prove that when the log-density of the target model is a polynomial the corresponding ELBO can be computed analytically.
arXiv Detail & Related papers (2020-10-22T05:04:38Z)
- Generalization Properties of Optimal Transport GANs with Latent Distribution Learning [52.25145141639159]
We study how the interplay between the latent distribution and the complexity of the pushforward map affects performance.
Motivated by our analysis, we advocate learning the latent distribution as well as the pushforward map within the GAN paradigm.
arXiv Detail & Related papers (2020-07-29T07:31:33Z)
- GANs with Conditional Independence Graphs: On Subadditivity of Probability Divergences [70.30467057209405]
Generative Adversarial Networks (GANs) are modern methods to learn the underlying distribution of a data set.
GANs are designed in a model-free fashion where no additional information about the underlying distribution is available.
We propose a principled design of a model-based GAN that uses a set of simple discriminators on the neighborhoods of the Bayes-net/MRF.
arXiv Detail & Related papers (2020-03-02T04:31:22Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.