Geometric Gaussian Approximations of Probability Distributions
- URL: http://arxiv.org/abs/2507.00616v1
- Date: Tue, 01 Jul 2025 09:54:43 GMT
- Title: Geometric Gaussian Approximations of Probability Distributions
- Authors: Nathaël Da Costa, Bálint Mucsányi, Philipp Hennig
- Abstract summary: We study the expressivity of geometric Gaussian approximations. We provide a constructive proof that such approximations are universal. We discuss whether a common diffeomorphism can be found to obtain uniformly high-quality geometric Gaussian approximations.
- Score: 25.021782278452005
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Approximating complex probability distributions, such as Bayesian posterior distributions, is of central interest in many applications. We study the expressivity of geometric Gaussian approximations. These consist of approximations by Gaussian pushforwards through diffeomorphisms or Riemannian exponential maps. We first review these two different kinds of geometric Gaussian approximations. Then we explore their relationship to one another. We further provide a constructive proof that such geometric Gaussian approximations are universal, in that they can capture any probability distribution. Finally, we discuss whether, given a family of probability distributions, a common diffeomorphism can be found to obtain uniformly high-quality geometric Gaussian approximations for that family.
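The first kind of geometric Gaussian approximation described in the abstract, a Gaussian pushforward through a diffeomorphism, can be sketched numerically. This is a minimal illustration, not the paper's construction: the map `T(z) = z + tanh(z)` is an illustrative diffeomorphism of R^d (strictly increasing in each coordinate), and the density follows from the change-of-variables formula.

```python
import numpy as np

rng = np.random.default_rng(0)

def T(z):
    # Hypothetical diffeomorphism on R^d: strictly increasing per coordinate,
    # since its derivative 1 + sech^2(z_i) is always positive.
    return z + np.tanh(z)

def log_det_jac(z):
    # Jacobian of T is diagonal with entries 1 + sech^2(z_i) > 0.
    return np.sum(np.log(1.0 + 1.0 / np.cosh(z) ** 2), axis=-1)

d = 2
z = rng.standard_normal((10000, d))   # samples from the base Gaussian N(0, I)
y = T(z)                              # samples from the pushforward T_# N(0, I)

# Log-density of the pushforward, evaluated at y = T(z), by change of variables:
# log p_Y(T(z)) = log p_Z(z) - log|det J_T(z)|
log_pz = -0.5 * np.sum(z ** 2, axis=-1) - 0.5 * d * np.log(2 * np.pi)
log_py = log_pz - log_det_jac(z)
```

The same recipe applies to any diffeomorphism with a tractable Jacobian; normalizing flows are the parametric special case of this construction.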
Related papers
- Variational Inference with Mixtures of Isotropic Gaussians [8.365869192421865]
Variational inference (VI) is a popular approach in Bayesian inference that looks for the best approximation of the posterior distribution within a parametric family. We develop a variational framework and provide efficient algorithms suited for this family.
arXiv Detail & Related papers (2025-06-16T15:42:15Z) - Differentiability and Approximation of Probability Functions under Gaussian Mixture Models: A Bayesian Approach [0.0]
We study probability functions associated with Gaussian mixture models.
We use conditional probability distribution to represent the probability function as an integral over the Euclidean sphere.
We approximate the probability function using random sampling over the parameter space and the Euclidean sphere.
arXiv Detail & Related papers (2024-11-05T01:36:27Z) - Bayesian Circular Regression with von Mises Quasi-Processes [57.88921637944379]
In this work we explore a family of expressive and interpretable distributions over circle-valued random functions. For posterior inference, we introduce a new Stratonovich-like augmentation that lends itself to fast Gibbs sampling. We present experiments applying this model to the prediction of wind directions and the percentage of the running gait cycle as a function of joint angles.
arXiv Detail & Related papers (2024-06-19T01:57:21Z) - Which exceptional low-dimensional projections of a Gaussian point cloud can be found in polynomial time? [8.74634652691576]
We study the subset $\mathscr{F}_{m,\alpha}$ of distributions that can be realized by a class of iterative algorithms. Non-rigorous methods from statistical physics yield an indirect characterization of $\mathscr{F}_{m,\alpha}$ in terms of a generalized Parisi formula.
arXiv Detail & Related papers (2024-06-05T05:54:56Z) - Conditioning of Banach Space Valued Gaussian Random Variables: An Approximation Approach Based on Martingales [8.81121308982678]
We investigate the conditional distributions of two Banach space valued, jointly Gaussian random variables. We show that their means and covariances can be determined by a general finite-dimensional approximation scheme. We discuss how our approximation scheme can be implemented in several classes of important Banach spaces.
arXiv Detail & Related papers (2024-04-04T13:57:44Z) - Score-based generative models break the curse of dimensionality in learning a family of sub-Gaussian probability distributions [5.801621787540268]
We introduce a notion of complexity for probability distributions in terms of their relative density with respect to the standard Gaussian measure.
We prove that if the log-relative density can be locally approximated by a neural network whose parameters can be suitably bounded, then the distribution generated by empirical score matching approximates the target distribution.
An essential ingredient of our proof is to derive a dimension-free deep neural network approximation rate for the true score function associated with the forward process.
arXiv Detail & Related papers (2024-02-12T22:02:23Z) - Scaling Riemannian Diffusion Models [68.52820280448991]
We show that our method enables us to scale to high-dimensional tasks on nontrivial manifolds.
We model QCD densities on $SU(n)$ lattices and contrastively learned embeddings on high dimensional hyperspheres.
arXiv Detail & Related papers (2023-10-30T21:27:53Z) - Non-asymptotic approximations of Gaussian neural networks via second-order Poincaré inequalities [6.499759302108927]
We investigate the use of second-order Poincaré inequalities as an alternative approach to establish QCLTs for the NN's output. We show how our approach is effective in establishing QCLTs for the NN's output, though it leads to suboptimal rates of convergence.
arXiv Detail & Related papers (2023-04-08T13:52:10Z) - Simplex Random Features [53.97976744884616]
We present Simplex Random Features (SimRFs), a new random feature (RF) mechanism for unbiased approximation of the softmax and Gaussian kernels.
We prove that SimRFs provide the smallest possible mean square error (MSE) on unbiased estimates of these kernels.
We show consistent gains provided by SimRFs in settings including pointwise kernel estimation, nonparametric classification and scalable Transformers.
arXiv Detail & Related papers (2023-01-31T18:53:39Z) - A Nearly Tight Bound for Fitting an Ellipsoid to Gaussian Random Points [50.90125395570797]
This nearly establishes a conjecture of Saunderson et al. (2012), within logarithmic factors.
The latter conjecture has attracted significant attention over the past decade, due to its connections to machine learning and sum-of-squares lower bounds for certain statistical problems.
arXiv Detail & Related papers (2022-12-21T17:48:01Z) - Theoretical Error Analysis of Entropy Approximation for Gaussian Mixtures [0.6990493129893112]
In this paper, we study the approximate entropy represented as the sum of the entropies of unimodal Gaussian distributions with mixing coefficients. We theoretically analyze the approximation error between the true and the approximate entropy to reveal when this approximation works effectively. Our results provide a guarantee that this approximation works well for high-dimensional problems, such as neural networks.
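The approximation described above can be illustrated on a toy mixture. The summary does not give the paper's exact formula, so the sketch below uses the standard weighted sum of component entropies, which is a well-known lower bound on the true mixture entropy (and, together with the entropy of the mixing weights, an upper bound); all weights, means, and scales are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy 1-D Gaussian mixture: weights w, means mu, stddevs s (all hypothetical).
w = np.array([0.3, 0.7])
mu = np.array([-3.0, 3.0])
s = np.array([1.0, 0.5])

def gauss_entropy(sigma):
    # Differential entropy of N(mu, sigma^2): 0.5 * log(2*pi*e*sigma^2).
    return 0.5 * np.log(2 * np.pi * np.e * sigma ** 2)

# Approximate entropy in the spirit of the abstract: weighted sum of the
# unimodal component entropies.
approx = np.sum(w * gauss_entropy(s))

# Monte Carlo estimate of the true mixture entropy, for comparison.
k = rng.choice(2, size=200_000, p=w)
x = rng.normal(mu[k], s[k])
dens = np.sum(w * np.exp(-0.5 * ((x[:, None] - mu) / s) ** 2)
              / (s * np.sqrt(2 * np.pi)), axis=1)
mc_entropy = -np.mean(np.log(dens))
```

For well-separated components, the true entropy sits close to `approx` plus the entropy of the weights, which is where the approximation error analyzed in the paper lives.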
arXiv Detail & Related papers (2022-02-26T04:49:01Z) - q-Paths: Generalizing the Geometric Annealing Path using Power Means [51.73925445218366]
We introduce $q$-paths, a family of paths which includes the geometric and arithmetic mixtures as special cases.
We show that small deviations away from the geometric path yield empirical gains for Bayesian inference.
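The $q$-path family mentioned above interpolates between the arithmetic mixture ($q=0$) and the geometric mixture ($q\to 1$) via a power mean of the two endpoint densities. The sketch below assumes the standard power-mean form of the $q$-path on unnormalized densities; the endpoint Gaussians are illustrative.

```python
import numpy as np

def q_path(p0, p1, beta, q):
    # Unnormalized q-path between densities p0 and p1: the power mean
    # [(1-beta)*p0^(1-q) + beta*p1^(1-q)]^(1/(1-q)).
    # As q -> 1 this recovers the geometric mixture p0^(1-beta) * p1^beta;
    # q = 0 gives the arithmetic mixture.
    if np.isclose(q, 1.0):
        return p0 ** (1 - beta) * p1 ** beta
    a = 1.0 - q
    return ((1 - beta) * p0 ** a + beta * p1 ** a) ** (1 / a)

x = np.linspace(-5, 5, 201)
p0 = np.exp(-0.5 * x ** 2)            # unnormalized N(0, 1)
p1 = np.exp(-0.5 * (x - 2) ** 2)      # unnormalized N(2, 1)

geo = q_path(p0, p1, beta=0.5, q=1.0)   # geometric path
ari = q_path(p0, p1, beta=0.5, q=0.0)   # arithmetic path
```

Values of $q$ slightly below 1 give the "small deviations away from the geometric path" that the abstract reports as empirically beneficial for Bayesian inference.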
arXiv Detail & Related papers (2021-07-01T21:09:06Z) - Pathwise Conditioning of Gaussian Processes [72.61885354624604]
Conventional approaches for simulating Gaussian process posteriors view samples as draws from marginal distributions of process values at finite sets of input locations.
This distribution-centric characterization leads to generative strategies that scale cubically in the size of the desired random vector.
We show how this pathwise interpretation of conditioning gives rise to a general family of approximations that lend themselves to efficiently sampling Gaussian process posteriors.
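The pathwise interpretation of conditioning can be sketched with Matheron's update rule, which corrects a joint prior sample rather than sampling from the posterior marginals; the kernel, inputs, and data below are illustrative assumptions (noiseless observations for simplicity).

```python
import numpy as np

rng = np.random.default_rng(2)

def k(a, b, ls=1.0):
    # Squared-exponential kernel on 1-D inputs.
    return np.exp(-0.5 * (a[:, None] - b[None, :]) ** 2 / ls ** 2)

X = np.array([-2.0, 0.0, 2.0])        # observed inputs (hypothetical)
y = np.array([0.5, -1.0, 1.5])        # observed values (hypothetical)
Xs = np.linspace(-3, 3, 61)           # test inputs

# 1) Draw one joint prior sample at [Xs, X].
Z = np.concatenate([Xs, X])
L = np.linalg.cholesky(k(Z, Z) + 1e-6 * np.eye(len(Z)))
f = L @ rng.standard_normal(len(Z))
f_star, f_X = f[:len(Xs)], f[len(Xs):]

# 2) Matheron-style pathwise update: shift the prior sample so that it
#    interpolates the data (noiseless case):
#    (f | y)(x) = f(x) + k(x, X) k(X, X)^{-1} (y - f(X))
update = k(Xs, X) @ np.linalg.solve(k(X, X) + 1e-6 * np.eye(len(X)), y - f_X)
posterior_sample = f_star + update
```

The cubic cost now sits only in the solve against `k(X, X)`, which is what makes this view amenable to the efficient approximations the abstract refers to.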
arXiv Detail & Related papers (2020-11-08T17:09:37Z) - Cramér-Rao Lower Bounds Arising from Generalized Csiszár Divergences [17.746238062801293]
We study the geometry of probability distributions with respect to a generalized family of Csiszár $f$-divergences.
We show that these formulations lead us to find unbiased and efficient estimators for the escort model.
arXiv Detail & Related papers (2020-01-14T13:41:13Z)
This list is automatically generated from the titles and abstracts of the papers in this site.