Differentiability and Approximation of Probability Functions under Gaussian Mixture Models: A Bayesian Approach
- URL: http://arxiv.org/abs/2411.02721v1
- Date: Tue, 05 Nov 2024 01:36:27 GMT
- Title: Differentiability and Approximation of Probability Functions under Gaussian Mixture Models: A Bayesian Approach
- Authors: Gonzalo Contador, Pedro Pérez-Aros, Emilio Vilches
- Abstract summary: We study probability functions associated with Gaussian mixture models.
We use conditional probability distribution to represent the probability function as an integral over the Euclidean sphere.
We approximate the probability function using random sampling over the parameter space and the Euclidean sphere.
- Abstract: In this work, we study probability functions associated with Gaussian mixture models. Our primary focus is on extending the use of spherical radial decomposition for multivariate Gaussian random vectors to the context of Gaussian mixture models, which are not inherently spherical but only conditionally so. Specifically, the conditional probability distribution, given a random parameter of the random vector, follows a Gaussian distribution, allowing us to apply Bayesian analysis tools to the probability function. This assumption, together with spherical radial decomposition for Gaussian random vectors, enables us to represent the probability function as an integral over the Euclidean sphere. Using this representation, we establish sufficient conditions to ensure the differentiability of the probability function and provide an integral representation of its gradient. Furthermore, leveraging the Bayesian decomposition, we approximate the probability function using random sampling over the parameter space and the Euclidean sphere. Finally, we present numerical examples that illustrate the advantages of this approach over classical approximations based on random vector sampling.
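The conditional spherical-radial estimator described in the abstract can be sketched numerically. The sketch below is an illustrative assumption, not the paper's implementation: it fixes a simple star-shaped set (a centered Euclidean ball), treats each mixture component exactly via its weight, and Monte-Carlo-averages only over directions on the sphere. The function name `prob_gmm_in_ball` and its arguments are hypothetical.

```python
import numpy as np
from scipy.stats import chi

def prob_gmm_in_ball(weights, means, chols, radius, n_dirs=4000, rng=None):
    # Hedged sketch: estimate P(||xi|| <= radius) for a Gaussian mixture
    # xi ~ sum_k w_k N(mu_k, L_k L_k^T) via spherical-radial decomposition.
    # Conditionally on component k, xi = mu_k + R * L_k v with R ~ chi(n)
    # and v uniform on the sphere, so the inner integrand is the exact
    # chi CDF evaluated at the radial extent of the set along direction v.
    rng = np.random.default_rng(rng)
    n = means[0].shape[0]
    # Uniform directions on the Euclidean sphere S^{n-1}.
    v = rng.standard_normal((n_dirs, n))
    v /= np.linalg.norm(v, axis=1, keepdims=True)
    total = 0.0
    for w, mu, L in zip(weights, means, chols):
        Lv = v @ L.T                             # directions through the Cholesky factor
        a = np.einsum("ij,ij->i", Lv, Lv)
        b = Lv @ mu
        c = mu @ mu - radius**2                  # assumes mu_k lies inside the ball (c < 0)
        # Positive root of ||mu + r*L v||^2 = radius^2: the radial extent r(v).
        r = (-b + np.sqrt(b**2 - a * c)) / a
        total += w * chi.cdf(r, df=n).mean()     # conditional chi-radial CDF, averaged over v
    return total
```

Because the radial CDF is evaluated exactly, the only Monte Carlo error is over directions, which is the variance-reduction benefit the abstract contrasts with plain random-vector sampling.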
Related papers
- von Mises Quasi-Processes for Bayesian Circular Regression [57.88921637944379]
We explore a family of expressive and interpretable distributions over circle-valued random functions.
The resulting probability model has connections with continuous spin models in statistical physics.
For posterior inference, we introduce a new Stratonovich-like augmentation that lends itself to fast Markov Chain Monte Carlo sampling.
arXiv Detail & Related papers (2024-06-19T01:57:21Z) - Characteristic Function of the Tsallis $q$-Gaussian and Its Applications in Measurement and Metrology [0.0]
The Tsallis $q$-Gaussian distribution is a powerful generalization of the standard Gaussian distribution.
This paper presents the characteristic function of a linear combination of independent $q$-Gaussian random variables.
It provides an alternative computational procedure to the Monte Carlo method for uncertainty analysis.
arXiv Detail & Related papers (2023-03-15T13:42:35Z) - Simplex Random Features [53.97976744884616]
We present Simplex Random Features (SimRFs), a new random feature (RF) mechanism for unbiased approximation of the softmax and Gaussian kernels.
We prove that SimRFs provide the smallest possible mean square error (MSE) on unbiased estimates of these kernels.
We show consistent gains provided by SimRFs in settings including pointwise kernel estimation, nonparametric classification and scalable Transformers.
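For context on the random-feature mechanism SimRFs refine, here is the classic random Fourier features baseline for the Gaussian kernel, shown only to illustrate unbiased kernel estimation with random features; it is not SimRFs themselves, whose coupled simplex-structured projections are the paper's contribution. The function name is hypothetical.

```python
import numpy as np

def gaussian_kernel_rff(X, Y, n_features=1024, rng=None):
    # Classic random Fourier features (Rahimi & Recht) for the Gaussian
    # kernel k(x, y) = exp(-||x - y||^2 / 2): i.i.d. Gaussian projections W
    # and random phases b give an unbiased estimate of the kernel matrix.
    rng = np.random.default_rng(rng)
    d = X.shape[1]
    W = rng.standard_normal((d, n_features))
    b = rng.uniform(0.0, 2.0 * np.pi, n_features)
    phi = lambda Z: np.sqrt(2.0 / n_features) * np.cos(Z @ W + b)
    return phi(X) @ phi(Y).T          # approximates k(x_i, y_j) entrywise
```

SimRFs replace the i.i.d. rows of `W` with geometrically coupled directions to provably minimize the MSE of such estimates.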
arXiv Detail & Related papers (2023-01-31T18:53:39Z) - Bézier Curve Gaussian Processes [8.11969931278838]
This paper proposes a new probabilistic sequence model building on probabilistic Bézier curves.
Combined with a Mixture Density network, Bayesian conditional inference can be performed without the need for mean field variational approximation.
The model is used for pedestrian trajectory prediction, where a generated prediction also serves as a GP prior.
arXiv Detail & Related papers (2022-05-03T19:49:57Z) - Wrapped Distributions on homogeneous Riemannian manifolds [58.720142291102135]
Control over the distributions' properties, such as parameters, symmetry, and modality, yields a family of flexible distributions.
We empirically validate our approach by utilizing our proposed distributions within a variational autoencoder and a latent space network model.
arXiv Detail & Related papers (2022-04-20T21:25:21Z) - A Stochastic Newton Algorithm for Distributed Convex Optimization [62.20732134991661]
We analyze a Newton algorithm for homogeneous distributed convex optimization, where each machine can calculate gradients of the same population objective.
We show that our method can reduce the number and frequency of required communication rounds compared to existing methods without hurting performance.
arXiv Detail & Related papers (2021-10-07T17:51:10Z) - Riemannian Gaussian distributions, random matrix ensembles and diffusion kernels [0.0]
We show how to compute marginals of the probability density functions on a random matrix type of symmetric spaces.
We also show how the probability density functions are a particular case of diffusion kernels of the Karlin-McGregor type, describing non-intersecting processes in the Weyl chamber of Lie groups.
arXiv Detail & Related papers (2020-11-27T11:41:29Z) - Pathwise Conditioning of Gaussian Processes [72.61885354624604]
Conventional approaches for simulating Gaussian process posteriors view samples as draws from marginal distributions of process values at finite sets of input locations.
This distribution-centric characterization leads to generative strategies that scale cubically in the size of the desired random vector.
We show how this pathwise interpretation of conditioning gives rise to a general family of approximations that lend themselves to efficiently sampling Gaussian process posteriors.
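The pathwise alternative the summary refers to is commonly written as Matheron's update rule: draw a joint prior sample over training and test inputs, then correct it by the residual at the training points. The sketch below assumes a squared-exponential kernel and hypothetical function names; it illustrates the rule rather than the paper's scalable approximations.

```python
import numpy as np

def rbf(A, B, ls=1.0):
    # Squared-exponential kernel; the lengthscale `ls` is an illustrative choice.
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-0.5 * d2 / ls**2)

def matheron_posterior_samples(X, y, Xs, n_samples=5, noise=1e-2, rng=None):
    # Pathwise (Matheron) conditioning: a posterior sample at x* is
    #   f_prior(x*) + K(x*, X) (K(X, X) + noise*I)^{-1} (y - f_prior(X) - eps),
    # with eps ~ N(0, noise*I), so no posterior covariance factorization is needed.
    rng = np.random.default_rng(rng)
    n, m = len(X), len(Xs)
    Z = np.vstack([X, Xs])
    K = rbf(Z, Z) + 1e-10 * np.eye(n + m)                 # jitter for stability
    L = np.linalg.cholesky(K)
    prior = L @ rng.standard_normal((n + m, n_samples))   # joint prior draws
    eps = np.sqrt(noise) * rng.standard_normal((n, n_samples))
    Knn = rbf(X, X) + noise * np.eye(n)
    Ksn = rbf(Xs, X)
    resid = y[:, None] - prior[:n] - eps                  # residual at training inputs
    return prior[n:] + Ksn @ np.linalg.solve(Knn, resid)  # (m, n_samples) posterior draws
```

The cubic cost criticized in the summary is paid once for the joint prior draw here; the paper's contribution is replacing that exact prior sample with cheap approximations while keeping the same update.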
arXiv Detail & Related papers (2020-11-08T17:09:37Z) - Stochastic Saddle-Point Optimization for Wasserstein Barycenters [69.68068088508505]
We consider the population Wasserstein barycenter problem for random probability measures supported on a finite set of points and generated by an online stream of data.
We employ the structure of the problem and obtain a convex-concave saddle-point reformulation of this problem.
In the setting when the distribution of random probability measures is discrete, we propose an optimization algorithm and estimate its complexity.
arXiv Detail & Related papers (2020-06-11T19:40:38Z) - Multiplicative Gaussian Particle Filter [18.615555573235987]
We propose a new sampling-based approach for approximate inference in filtering problems.
Instead of approximating conditional distributions with a finite set of states, as done in particle filters, our approach approximates the distribution with a weighted sum of functions from a set of continuous functions.
arXiv Detail & Related papers (2020-02-29T09:19:38Z)
This list is automatically generated from the titles and abstracts of the papers in this site.