Projectivity revisited
- URL: http://arxiv.org/abs/2207.00625v4
- Date: Tue, 20 Aug 2024 12:55:45 GMT
- Title: Projectivity revisited
- Authors: Felix Weitkämper
- Abstract summary: We extend the notion of projectivity from families of distributions indexed by domain size to functors taking extensional data from a database.
This makes projectivity available for the large range of applications taking structured input.
- Score: 0.0
- License: http://creativecommons.org/licenses/by-nc-nd/4.0/
- Abstract: The behaviour of statistical relational representations across differently sized domains has become a focal area of research from both a modelling and a complexity viewpoint. Recently, projectivity of a family of distributions emerged as a key property, ensuring that marginal probabilities are independent of the domain size. However, the formalisation used currently assumes that the domain is characterised only by its size. This contribution extends the notion of projectivity from families of distributions indexed by domain size to functors taking extensional data from a database. This makes projectivity available for the large range of applications taking structured input. We transfer key known results on projective families of distributions to the new setting. This includes a characterisation of projective fragments in different statistical relational formalisms as well as a general representation theorem for projective families of distributions. Furthermore, we prove a correspondence between projectivity and distributions on countably infinite domains, which we use to unify and generalise earlier work on statistical relational representations in infinite domains. Finally, we use the extended notion of projectivity to define a further strengthening, which we call $\sigma$-projectivity, and which allows the use of the same representation in different modes while retaining projectivity.
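For orientation, a minimal sketch of the classical notion that this paper generalises (the standard formulation from prior work on projective families, not the functorial definition introduced here): a family $(P_n)_{n \geq 1}$, where each $P_n$ is a distribution over relational structures on the domain $\{1,\dots,n\}$, is projective if restricting a larger domain recovers the smaller member of the family, i.e. for all $m \leq n$,
$$P_n\!\upharpoonright_{\{1,\dots,m\}} \;=\; P_m,$$
so that the marginal probability of a query over a fixed set of individuals does not depend on the domain size.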
Related papers
- Scalable Counterfactual Distribution Estimation in Multivariate Causal Models [12.88471300865496]
We consider the problem of estimating the counterfactual joint distribution of multiple quantities of interest in a multivariate causal model.
We propose a method that alleviates both issues simultaneously by leveraging a robust latent one-dimensional subspace.
We demonstrate the advantages of our approach over existing methods on both synthetic and real-world data.
arXiv Detail & Related papers (2023-11-02T01:45:44Z)
- Multi-Domain Causal Representation Learning via Weak Distributional Invariances [27.72497122405241]
Causal representation learning has emerged as the center of action in causal machine learning research.
We show that autoencoders that incorporate such invariances can provably identify the stable set of latents from the rest across different settings.
arXiv Detail & Related papers (2023-10-04T14:41:41Z)
- Flow Factorized Representation Learning [109.51947536586677]
We introduce a generative model which specifies a distinct set of latent probability paths that define different input transformations.
We show that our model achieves higher likelihoods on standard representation learning benchmarks while simultaneously being closer to approximately equivariant models.
arXiv Detail & Related papers (2023-09-22T20:15:37Z)
- On Projectivity in Markov Logic Networks [7.766921168069532]
Markov Logic Networks (MLNs) define a probability distribution on structures over varying domain sizes.
Projective models potentially allow efficient and consistent parameter learning from sub-sampled domains.
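For context, the standard MLN semantics on a fixed finite domain (general background, not specific to this paper): given weighted first-order formulas $(F_i, w_i)$, the probability of a world $\omega$ is
$$P(\omega) = \frac{1}{Z}\exp\Big(\sum_i w_i\, n_i(\omega)\Big),$$
where $n_i(\omega)$ counts the true groundings of $F_i$ in $\omega$ and $Z$ is the normalising constant; projectivity then asks whether the marginals of these per-domain distributions remain consistent as the domain grows.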
arXiv Detail & Related papers (2022-04-08T11:37:53Z)
- Variational Disentanglement for Domain Generalization [68.85458536180437]
We propose to tackle the problem of domain generalization by delivering an effective framework named Variational Disentanglement Network (VDN).
VDN is capable of disentangling the domain-specific features and task-specific features, where the task-specific features are expected to be better generalized to unseen but related test data.
arXiv Detail & Related papers (2021-09-13T09:55:32Z)
- Interaction Models and Generalized Score Matching for Compositional Data [9.797319790710713]
We propose a class of exponential family models that accommodate general patterns of pairwise interaction while being supported on the probability simplex.
Special cases include the family of Dirichlet distributions as well as Aitchison's additive logistic normal distributions.
A high-dimensional analysis of our estimation methods shows that the simplex domain is handled as efficiently as previously studied full-dimensional domains.
arXiv Detail & Related papers (2021-09-10T05:29:41Z)
- Attentional Prototype Inference for Few-Shot Segmentation [128.45753577331422]
We propose attentional prototype inference (API), a probabilistic latent variable framework for few-shot segmentation.
We define a global latent variable to represent the prototype of each object category, which we model as a probabilistic distribution.
We conduct extensive experiments on four benchmarks, where our proposal obtains at least competitive and often better performance than state-of-the-art prototype-based methods.
arXiv Detail & Related papers (2021-05-14T06:58:44Z)
- A Bit More Bayesian: Domain-Invariant Learning with Uncertainty [111.22588110362705]
Domain generalization is challenging due to the domain shift and the uncertainty caused by the inaccessibility of target domain data.
In this paper, we address both challenges with a probabilistic framework based on variational Bayesian inference.
We derive domain-invariant representations and classifiers, which are jointly established in a two-layer Bayesian neural network.
arXiv Detail & Related papers (2021-05-09T21:33:27Z)
- Generalization Properties of Optimal Transport GANs with Latent Distribution Learning [52.25145141639159]
We study how the interplay between the latent distribution and the complexity of the pushforward map affects performance.
Motivated by our analysis, we advocate learning the latent distribution as well as the pushforward map within the GAN paradigm.
arXiv Detail & Related papers (2020-07-29T07:31:33Z)
- A Complete Characterization of Projectivity for Statistical Relational Models [20.833623839057097]
We introduce a class of directed latent graphical variable models that precisely correspond to the class of projective relational models.
We also obtain a characterization for when a given distribution over size-$k$ structures is the statistical frequency distribution of size-$k$ sub-structures in much larger size-$n$ structures.
arXiv Detail & Related papers (2020-04-23T05:58:27Z)
- GANs with Conditional Independence Graphs: On Subadditivity of Probability Divergences [70.30467057209405]
Generative Adversarial Networks (GANs) are modern methods to learn the underlying distribution of a data set.
GANs are designed in a model-free fashion where no additional information about the underlying distribution is available.
We propose a principled design of a model-based GAN that uses a set of simple discriminators on the neighborhoods of the Bayes-net/MRF.
arXiv Detail & Related papers (2020-03-02T04:31:22Z)