Tree-structured Markov random fields with Poisson marginal distributions
- URL: http://arxiv.org/abs/2408.13649v1
- Date: Sat, 24 Aug 2024 18:30:15 GMT
- Title: Tree-structured Markov random fields with Poisson marginal distributions
- Authors: Benjamin Côté, Hélène Cossette, Etienne Marceau
- Abstract summary: A new family of tree-structured random fields for a vector of discrete counting random variables is introduced.
The marginal distributions of the Markov random field are all Poisson with the same mean, and are untied from the strength or structure of the built-in dependence.
- Score: 0.0
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: A new family of tree-structured Markov random fields for a vector of discrete counting random variables is introduced. According to the characteristics of the family, the marginal distributions of the Markov random fields are all Poisson with the same mean, and are untied from the strength or structure of their built-in dependence. This key feature is uncommon for Markov random fields and most convenient for applications. The specific properties of this new family confer a straightforward sampling procedure and analytic expressions for the joint probability mass function and the joint probability generating function of the vector of counting random variables, thus granting computational methods that scale well to vectors of high dimension. We study the distribution of the sum of random variables constituting a Markov random field from the proposed family, analyze a random variable's individual contribution to that sum through expected allocations, and establish stochastic orderings to provide a broad understanding of their behavior.
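The abstract does not spell out the construction, but a standard way to obtain Poisson marginals with a common mean on a tree is binomial thinning along edges. The sketch below illustrates only that generic idea, as an assumption about what a sampling procedure of this kind can look like; the function name, tree encoding and dependence parameter alpha are illustrative and not the paper's notation.

```python
import numpy as np

def sample_tree_poisson_mrf(tree, lam, alpha, rng=None):
    """Sample a vector of counts indexed by the nodes of a rooted tree.

    Illustrative thinning-based construction (not necessarily the paper's
    exact family): the root is Poisson(lam), and every child is a binomial
    thinning of its parent plus an independent Poisson((1 - alpha) * lam)
    innovation, so each node's marginal stays Poisson(lam) regardless of
    the tree structure.

    tree  : dict mapping each node to the list of its children; node 0 is the root.
    lam   : common Poisson mean.
    alpha : thinning probability in [0, 1] controlling the edge dependence
            (alpha = 0 gives independence along edges).
    """
    rng = np.random.default_rng(rng)
    x = {0: rng.poisson(lam)}
    stack = [0]
    while stack:                      # depth-first traversal of the tree
        parent = stack.pop()
        for child in tree.get(parent, []):
            thinned = rng.binomial(x[parent], alpha)       # alpha ∘ X_parent
            innovation = rng.poisson((1.0 - alpha) * lam)  # independent noise
            x[child] = thinned + innovation
            stack.append(child)
    return x

# Usage: a small tree 0 -> {1, 2}, 1 -> {3}
sample = sample_tree_poisson_mrf({0: [1, 2], 1: [3]}, lam=3.0, alpha=0.6, rng=42)
print(sample)
```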
Related papers
- Characteristic Function of the Tsallis $q$-Gaussian and Its Applications in Measurement and Metrology [0.0]
The Tsallis $q$-Gaussian distribution is a powerful generalization of the standard Gaussian distribution.
This paper presents the characteristic function of a linear combination of independent $q$-Gaussian random variables.
It provides an alternative computational procedure to the Monte Carlo method for uncertainty analysis.
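For context, and as a standard fact about independent random variables rather than a result of this paper: the characteristic function of a linear combination factorizes, which is why a closed-form $q$-Gaussian characteristic function permits uncertainty propagation without Monte Carlo simulation.

```latex
\[
  Y = \sum_{j=1}^{n} c_j X_j, \qquad
  \varphi_Y(t) = \mathbb{E}\!\left[e^{\mathrm{i} t Y}\right]
               = \prod_{j=1}^{n} \varphi_{X_j}(c_j t),
  \quad X_1,\dots,X_n \text{ independent.}
\]
```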
arXiv Detail & Related papers (2023-03-15T13:42:35Z)
- Simplex Random Features [53.97976744884616]
We present Simplex Random Features (SimRFs), a new random feature (RF) mechanism for unbiased approximation of the softmax and Gaussian kernels.
We prove that SimRFs provide the smallest possible mean square error (MSE) on unbiased estimates of these kernels.
We show consistent gains provided by SimRFs in settings including pointwise kernel estimation, nonparametric classification and scalable Transformers.
arXiv Detail & Related papers (2023-01-31T18:53:39Z)
- Wrapped Distributions on homogeneous Riemannian manifolds [58.720142291102135]
Control over the distributions' properties, such as parameters, symmetry and modality, yields a family of flexible distributions.
We empirically validate our approach by utilizing our proposed distributions within a variational autoencoder and a latent space network model.
arXiv Detail & Related papers (2022-04-20T21:25:21Z)
- Robust Estimation for Nonparametric Families via Generative Adversarial Networks [92.64483100338724]
We provide a framework for designing Generative Adversarial Networks (GANs) to solve high dimensional robust statistics problems.
Our work extends this framework to robust mean estimation, second-moment estimation, and robust linear regression.
In terms of techniques, our proposed GAN losses can be viewed as a smoothed and generalized Kolmogorov-Smirnov distance.
arXiv Detail & Related papers (2022-02-02T20:11:33Z)
- A Stochastic Newton Algorithm for Distributed Convex Optimization [62.20732134991661]
We analyze a Newton algorithm for homogeneous distributed convex optimization, where each machine can calculate gradients of the same population objective.
We show that our method can reduce the number, and frequency, of required communication rounds compared to existing methods without hurting performance.
arXiv Detail & Related papers (2021-10-07T17:51:10Z)
- Degenerate Gaussian factors for probabilistic inference [0.0]
We propose a parametrised factor that enables inference on Gaussian networks where linear dependencies exist among the random variables.
By using this principled factor definition, degeneracies can be accommodated accurately and automatically at little additional computational cost.
arXiv Detail & Related papers (2021-04-30T13:58:29Z)
- Goal-oriented adaptive sampling under random field modelling of response probability distributions [0.6445605125467573]
We consider cases where the spatial variation of response distributions concerns not only their mean and/or variance but also other features, for instance shape or uni-modality versus multi-modality.
Our contributions build upon a non-parametric Bayesian approach to modelling the resulting fields of probability distributions.
arXiv Detail & Related papers (2021-02-15T15:55:23Z)
- An Embedded Model Estimator for Non-Stationary Random Functions using Multiple Secondary Variables [0.0]
This paper introduces the method and shows that it has consistency results that are similar in nature to those applying to geostatistical modelling and to Quantile Random Forests.
The algorithm works by estimating a conditional distribution for the target variable at each target location.
arXiv Detail & Related papers (2020-11-09T00:14:24Z)
- Pathwise Conditioning of Gaussian Processes [72.61885354624604]
Conventional approaches for simulating Gaussian process posteriors view samples as draws from marginal distributions of process values at finite sets of input locations.
This distribution-centric characterization leads to generative strategies that scale cubically in the size of the desired random vector.
We show how this pathwise interpretation of conditioning gives rise to a general family of approximations that lend themselves to efficiently sampling Gaussian process posteriors.
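As a rough illustration of the pathwise idea (a Matheron-style update): the sketch below still draws the joint prior sample exactly, so it does not reproduce the paper's efficient prior approximations, and the kernel choice and helper names are assumptions made for the example.

```python
import numpy as np

def rbf(a, b, lengthscale=1.0, variance=1.0):
    """Squared-exponential kernel matrix between 1-D point sets a and b."""
    d2 = (a[:, None] - b[None, :]) ** 2
    return variance * np.exp(-0.5 * d2 / lengthscale**2)

def pathwise_posterior_sample(x_train, y_train, x_test, noise=1e-2, rng=None):
    """Draw one GP posterior sample via the pathwise (Matheron) update:
    f*|y = f* + K(x*, X) (K(X, X) + noise I)^{-1} (y - f_X - eps),
    where (f*, f_X) is a joint draw from the prior and eps is observation noise.
    """
    rng = np.random.default_rng(rng)
    x_all = np.concatenate([x_test, x_train])
    k_all = rbf(x_all, x_all) + 1e-10 * np.eye(len(x_all))   # jitter for stability
    f_all = rng.multivariate_normal(np.zeros(len(x_all)), k_all)  # joint prior draw
    f_test, f_train = f_all[: len(x_test)], f_all[len(x_test):]
    eps = rng.normal(0.0, np.sqrt(noise), size=len(x_train))
    gain = rbf(x_test, x_train) @ np.linalg.solve(
        rbf(x_train, x_train) + noise * np.eye(len(x_train)),
        y_train - f_train - eps,
    )
    return f_test + gain

# Usage: condition a prior sample on three noisy observations
x_tr = np.array([-1.0, 0.0, 1.5])
y_tr = np.sin(x_tr)
x_te = np.linspace(-2.0, 2.0, 5)
print(pathwise_posterior_sample(x_tr, y_tr, x_te, rng=0))
```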
arXiv Detail & Related papers (2020-11-08T17:09:37Z)
- Stochastic Saddle-Point Optimization for Wasserstein Barycenters [69.68068088508505]
We consider the population Wasserstein barycenter problem for random probability measures supported on a finite set of points and generated by an online stream of data.
We exploit the structure of the problem to obtain a convex-concave saddle-point reformulation.
In the setting when the distribution of random probability measures is discrete, we propose an optimization algorithm and estimate its complexity.
arXiv Detail & Related papers (2020-06-11T19:40:38Z)
- Probabilistic Contraction Analysis of Iterated Random Operators [10.442391859219807]
The Banach contraction mapping theorem is employed to establish the convergence of certain deterministic algorithms.
In a class of randomized algorithms, in each iteration, the contraction map is approximated with an operator that uses independent and identically distributed samples of certain random variables.
This leads to iterated random operators acting on an initial point in a complete metric space, and it generates a Markov chain.
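A minimal toy sketch, assuming a simple contraction T(x) = E[0.5 cos(x + W)] with W ~ N(0, 1) chosen purely for illustration (not the paper's setting): each iteration applies a sample-average approximation of T, so the iterates form a Markov chain that settles near the fixed point of T.

```python
import numpy as np

def iterated_random_operator(x0, num_iters=2000, batch=16, rng=None):
    """Iterate a sampled approximation of a deterministic contraction.

    The exact map is T(x) = E[0.5 * cos(x + W)] with W ~ N(0, 1), a
    contraction with modulus at most 0.5.  Each step replaces the
    expectation with an i.i.d. sample average, so the iterates form a
    Markov chain concentrating near the fixed point of T.
    """
    rng = np.random.default_rng(rng)
    x = x0
    for _ in range(num_iters):
        w = rng.normal(size=batch)          # fresh i.i.d. samples each step
        x = np.mean(0.5 * np.cos(x + w))    # sampled operator applied to x
    return x

print(iterated_random_operator(x0=2.0, rng=1))
```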
arXiv Detail & Related papers (2018-04-04T00:10:58Z)