Projectivity revisited
- URL: http://arxiv.org/abs/2207.00625v1
- Date: Fri, 1 Jul 2022 18:54:36 GMT
- Title: Projectivity revisited
- Authors: Felix Weitkämper
- Abstract summary: In 2018, Jaeger and Schulte suggested projectivity of a family of distributions as a key property.
This contribution extends the notion of projectivity from families of distributions indexed by domain size to functors taking extensional data from a database.
- Score: 0.0
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: The behaviour of statistical relational representations across differently
sized domains has become a focal area of research from both a modelling and a
complexity viewpoint. In 2018, Jaeger and Schulte suggested projectivity of a
family of distributions as a key property, ensuring that marginal inference is
independent of the domain size. However, Jaeger and Schulte assume that the
domain is characterised only by its size. This contribution extends the notion
of projectivity from families of distributions indexed by domain size to
functors taking extensional data from a database. This makes projectivity
available for the large range of applications taking structured input. We
transfer the known attractive properties of projective families of
distributions to the new setting. Furthermore, we prove a correspondence
between projectivity and distributions on countably infinite domains, which we
use to unify and generalise earlier work on statistical relational
representations in infinite domains. Finally, we use the extended notion of
projectivity to define a further strengthening, which we call
$\sigma$-projectivity, and which allows the use of the same representation in
different modes while retaining projectivity.
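For orientation, here is a minimal sketch of the original Jaeger-Schulte notion that the paper generalises (the notation below is chosen for illustration and is not taken from the paper): a family $(P_n)_{n \geq 1}$, where $P_n$ is a distribution over relational structures (possible worlds) on the domain $\{1, \dots, n\}$, is projective if marginalising $P_n$ onto a smaller initial segment of the domain recovers the distribution for that domain size:
$$ P_n\big(\, \omega' \upharpoonright_{\{1,\dots,m\}} = \omega \,\big) \;=\; P_m(\omega) \qquad \text{for all } m \le n \text{ and all worlds } \omega \text{ on } \{1,\dots,m\}. $$
In particular, the probability of a query mentioning only $m$ named individuals is the same under every member of the family with domain size at least $m$, which is the sense in which marginal inference is independent of the domain size; this contribution lifts the condition from families indexed by the size $n$ to functors whose input is the extensional data of a database.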
Related papers
- Measuring Orthogonality in Representations of Generative Models [81.13466637365553]
In unsupervised representation learning, models aim to distill essential features from high-dimensional data into lower-dimensional learned representations.
Disentanglement of independent generative processes has long been credited with producing high-quality representations.
We propose two novel metrics: Importance-Weighted Orthogonality (IWO) and Importance-Weighted Rank (IWR).
arXiv Detail & Related papers (2024-07-04T08:21:54Z)
- Bridging Domains with Approximately Shared Features [26.096779584142986]
Multi-source domain adaptation aims to reduce performance degradation when applying machine learning models to unseen domains.
Some advocate for learning invariant features from source domains, while others favor more diverse features.
We propose a statistical framework that distinguishes the utilities of features based on the variance of their correlation to label $y$ across domains.
arXiv Detail & Related papers (2024-03-11T04:25:41Z)
- Scalable Counterfactual Distribution Estimation in Multivariate Causal Models [12.88471300865496]
We consider the problem of estimating the counterfactual joint distribution of multiple quantities of interests in a multivariate causal model.
We propose a method that alleviates both issues simultaneously by leveraging a robust latent one-dimensional subspace.
We demonstrate the advantages of our approach over existing methods on both synthetic and real-world data.
arXiv Detail & Related papers (2023-11-02T01:45:44Z)
- Multi-Domain Causal Representation Learning via Weak Distributional Invariances [27.72497122405241]
Causal representation learning has emerged as the center of action in causal machine learning research.
We show that autoencoders that incorporate such invariances can provably identify the stable set of latents from the rest across different settings.
arXiv Detail & Related papers (2023-10-04T14:41:41Z)
- Flow Factorized Representation Learning [109.51947536586677]
We introduce a generative model which specifies a distinct set of latent probability paths that define different input transformations.
We show that our model achieves higher likelihoods on standard representation learning benchmarks while simultaneously being closer to approximately equivariant models.
arXiv Detail & Related papers (2023-09-22T20:15:37Z)
- On Projectivity in Markov Logic Networks [7.766921168069532]
Markov Logic Networks (MLNs) define a probability distribution on structures over varying domain sizes.
Projective models potentially allow efficient and consistent parameter learning from sub-sampled domains.
arXiv Detail & Related papers (2022-04-08T11:37:53Z)
- Attentional Prototype Inference for Few-Shot Segmentation [128.45753577331422]
We propose attentional prototype inference (API), a probabilistic latent variable framework for few-shot segmentation.
We define a global latent variable to represent the prototype of each object category, which we model as a probabilistic distribution.
We conduct extensive experiments on four benchmarks, where our proposal obtains at least competitive and often better performance than state-of-the-art prototype-based methods.
arXiv Detail & Related papers (2021-05-14T06:58:44Z)
- Few-shot Image Generation via Cross-domain Correspondence [98.2263458153041]
Training generative models, such as GANs, on a target domain containing limited examples can easily result in overfitting.
In this work, we seek to utilize a large source domain for pretraining and transfer the diversity information from source to target.
To further reduce overfitting, we present an anchor-based strategy to encourage different levels of realism over different regions in the latent space.
arXiv Detail & Related papers (2021-04-13T17:59:35Z)
- Generalization Properties of Optimal Transport GANs with Latent Distribution Learning [52.25145141639159]
We study how the interplay between the latent distribution and the complexity of the pushforward map affects performance.
Motivated by our analysis, we advocate learning the latent distribution as well as the pushforward map within the GAN paradigm.
arXiv Detail & Related papers (2020-07-29T07:31:33Z)
- A Complete Characterization of Projectivity for Statistical Relational Models [20.833623839057097]
We introduce a class of directed latent graphical variable models that precisely correspond to the class of projective relational models.
We also obtain a characterization for when a given distribution over size-$k$ structures is the statistical frequency distribution of size-$k$ sub-structures in much larger size-$n$ structures.
arXiv Detail & Related papers (2020-04-23T05:58:27Z)
- GANs with Conditional Independence Graphs: On Subadditivity of Probability Divergences [70.30467057209405]
Generative Adversarial Networks (GANs) are modern methods to learn the underlying distribution of a data set.
GANs are designed in a model-free fashion where no additional information about the underlying distribution is available.
We propose a principled design of a model-based GAN that uses a set of simple discriminators on the neighborhoods of the Bayes-net/MRF.
arXiv Detail & Related papers (2020-03-02T04:31:22Z)