A numerical approximation method for the Fisher-Rao distance between
multivariate normal distributions
- URL: http://arxiv.org/abs/2302.08175v6
- Date: Mon, 27 Mar 2023 10:59:42 GMT
- Title: A numerical approximation method for the Fisher-Rao distance between
multivariate normal distributions
- Authors: Frank Nielsen
- Abstract summary: We approximate Rao's distance by discretizing curves joining normal distributions and approximating the Rao distances between successive nearby normal distributions on the curves by the square root of the Jeffreys divergence.
We report on our experiments and assess the quality of our approximation technique by comparing the numerical approximations with both lower and upper bounds.
- Score: 12.729120803225065
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: We present a simple method to approximate Rao's distance between multivariate
normal distributions based on discretizing curves joining normal distributions
and approximating Rao's distances between successive nearby normal
distributions on the curves by the square root of Jeffreys divergence, the
symmetrized Kullback-Leibler divergence. We consider experimentally the linear
interpolation curves in the ordinary, natural and expectation parameterizations
of the normal distributions, and compare these curves with a curve derived from
Calvo and Oller's isometric embedding of the Fisher-Rao $d$-variate normal
manifold into the cone of $(d+1)\times (d+1)$ symmetric positive-definite
matrices [Journal of Multivariate Analysis 35.2 (1990): 223-242]. We report on
our experiments and assess the quality of our approximation technique by
comparing the numerical approximations with both lower and upper bounds.
Finally, we present several information-geometric properties of Calvo and
Oller's isometric embedding.
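The abstract's scheme can be sketched for univariate normals. The code below is an illustrative reconstruction, not the paper's implementation, and all function names are mine: it discretizes a curve between two normals, sums the square roots of Jeffreys divergences between successive points, and compares the result against the Calvo-Oller SPD-cone distance, which lower-bounds the Fisher-Rao distance. For same-mean normals the linear interpolation in $(\mu,\sigma)$ follows a geodesic, so both quantities should match the closed form $\sqrt{2}\,|\log(\sigma_2/\sigma_1)|$.

```python
import math

def kl_normal(m1, s1, m2, s2):
    """KL divergence KL(N(m1, s1^2) || N(m2, s2^2)) between univariate normals."""
    return math.log(s2 / s1) + (s1 ** 2 + (m1 - m2) ** 2) / (2 * s2 ** 2) - 0.5

def jeffreys(m1, s1, m2, s2):
    """Jeffreys divergence: the symmetrized Kullback-Leibler divergence."""
    return kl_normal(m1, s1, m2, s2) + kl_normal(m2, s2, m1, s1)

def approx_rao_distance(p, q, steps=1000):
    """Approximate the Fisher-Rao distance between univariate normals
    p = (mu, sigma) and q = (mu, sigma) by discretizing the linear
    interpolation curve in the ordinary parameterization and summing
    sqrt(Jeffreys) between successive discretization points."""
    (m1, s1), (m2, s2) = p, q
    total, prev = 0.0, (m1, s1)
    for i in range(1, steps + 1):
        t = i / steps
        cur = ((1 - t) * m1 + t * m2, (1 - t) * s1 + t * s2)
        total += math.sqrt(jeffreys(prev[0], prev[1], cur[0], cur[1]))
        prev = cur
    return total

def co_embed(m, s):
    """Calvo-Oller embedding of N(m, s^2) as a 2x2 SPD matrix
    [[s^2 + m^2, m], [m, 1]] (the d = 1 case of the (d+1)x(d+1) embedding)."""
    return ((s * s + m * m, m), (m, 1.0))

def co_distance(p, q):
    """Affine-invariant SPD distance between the embedded normals,
    a lower bound on the Fisher-Rao distance (Calvo and Oller, 1990)."""
    (a, b), (_, d) = co_embed(*p)
    (e, f), (_, h) = co_embed(*q)
    det_a = a * d - b * b
    # M = A^{-1} B for 2x2 symmetric A, B; eigenvalues via trace and determinant.
    m00, m01 = (d * e - b * f) / det_a, (d * f - b * h) / det_a
    m10, m11 = (a * f - b * e) / det_a, (a * h - b * f) / det_a
    tr, det = m00 + m11, m00 * m11 - m01 * m10
    disc = math.sqrt(max(tr * tr - 4 * det, 0.0))
    lam1, lam2 = (tr + disc) / 2, (tr - disc) / 2
    return math.sqrt(0.5 * (math.log(lam1) ** 2 + math.log(lam2) ** 2))

# Same-mean normals N(0, 1) and N(0, 4): closed form sqrt(2) * log(2).
d_approx = approx_rao_distance((0.0, 1.0), (0.0, 2.0))
d_lower = co_distance((0.0, 1.0), (0.0, 2.0))
```

For general pairs of normals the discretized sum approximates the length of the chosen interpolation curve, which upper-bounds the Fisher-Rao distance whenever the curve is not a geodesic; the Calvo-Oller distance then brackets the true value from below.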
Related papers
- A Bayesian Approach Toward Robust Multidimensional Ellipsoid-Specific Fitting [0.0]
This work presents a novel and effective method for fitting multidimensional ellipsoids to scattered data in the presence of noise and outliers.
We incorporate a uniform prior distribution to constrain the search for primitive parameters within an ellipsoidal domain.
We apply it to a wide range of practical applications such as microscopy cell counting, 3D reconstruction, geometric shape approximation, and magnetometer calibration tasks.
arXiv Detail & Related papers (2024-07-27T14:31:51Z) - Multivariate root-n-consistent smoothing parameter free matching estimators and estimators of inverse density weighted expectations [51.000851088730684]
We develop novel modifications of nearest-neighbor and matching estimators which converge at the parametric $\sqrt{n}$-rate.
We stress that our estimators do not involve nonparametric function estimators and in particular do not rely on sample-size-dependent smoothing parameters.
arXiv Detail & Related papers (2024-07-11T13:28:34Z) - von Mises Quasi-Processes for Bayesian Circular Regression [57.88921637944379]
We explore a family of expressive and interpretable distributions over circle-valued random functions.
The resulting probability model has connections with continuous spin models in statistical physics.
For posterior inference, we introduce a new Stratonovich-like augmentation that lends itself to fast Markov Chain Monte Carlo sampling.
arXiv Detail & Related papers (2024-06-19T01:57:21Z) - Fisher-Rao distance and pullback SPD cone distances between multivariate normal distributions [7.070726553564701]
We introduce a class of distances based on diffeomorphic embeddings of the normal manifold into a submanifold.
We show that the projective Hilbert distance on the cone yields a metric on the embedded normal submanifold.
We show how to use those distances in clustering tasks.
arXiv Detail & Related papers (2023-07-20T07:14:58Z) - Curvature-Independent Last-Iterate Convergence for Games on Riemannian
Manifolds [77.4346324549323]
We show that a step size agnostic to the curvature of the manifold achieves a curvature-independent and linear last-iterate convergence rate.
To the best of our knowledge, the possibility of curvature-independent rates and/or last-iterate convergence has not been considered before.
arXiv Detail & Related papers (2023-06-29T01:20:44Z) - Efficient CDF Approximations for Normalizing Flows [64.60846767084877]
We build upon the diffeomorphic properties of normalizing flows to estimate the cumulative distribution function (CDF) over a closed region.
Our experiments on popular flow architectures and UCI datasets show a marked improvement in sample efficiency as compared to traditional estimators.
arXiv Detail & Related papers (2022-02-23T06:11:49Z) - How Good are Low-Rank Approximations in Gaussian Process Regression? [28.392890577684657]
We provide guarantees for approximate Gaussian Process (GP) regression resulting from two common low-rank kernel approximations.
We provide experiments on both simulated data and standard benchmarks to evaluate the effectiveness of our theoretical bounds.
arXiv Detail & Related papers (2021-12-13T04:04:08Z) - Lower Bounds on the Total Variation Distance Between Mixtures of Two
Gaussians [45.392805695921666]
We exploit a connection between total variation distance and the characteristic function of the mixture.
We derive new lower bounds on the total variation distance between pairs of two-component Gaussian mixtures.
arXiv Detail & Related papers (2021-09-02T16:32:16Z) - Optimal oracle inequalities for solving projected fixed-point equations [53.31620399640334]
We study methods that use a collection of random observations to compute approximate solutions by searching over a known low-dimensional subspace of the Hilbert space.
We show how our results precisely characterize the error of a class of temporal difference learning methods for the policy evaluation problem with linear function approximation.
arXiv Detail & Related papers (2020-12-09T20:19:32Z) - On the Theoretical Equivalence of Several Trade-Off Curves Assessing
Statistical Proximity [4.626261940793027]
We propose a unification of four curves known respectively as: the precision-recall (PR) curve, the Lorenz curve, the receiver operating characteristic (ROC) curve and a special case of Rényi divergence frontiers.
In addition, we discuss possible links between PR / Lorenz curves with the derivation of domain adaptation bounds.
arXiv Detail & Related papers (2020-06-21T14:32:38Z) - Minimax Optimal Estimation of KL Divergence for Continuous Distributions [56.29748742084386]
Estimating the Kullback-Leibler divergence from independent and identically distributed samples is an important problem in various domains.
One simple and effective estimator is based on the k nearest neighbor between these samples.
arXiv Detail & Related papers (2020-02-26T16:37:37Z)
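The k-nearest-neighbor KL estimator mentioned in the last entry can be sketched in one dimension as follows. This is a simplified illustrative version in the style of Wang, Kulkarni, and Verdu's estimator, not code from the cited paper; the function name and parameters are mine.

```python
import math
import random

def knn_kl_estimate(xs, ys, k=5):
    """k-NN estimate of KL(P || Q) from 1-D samples xs ~ P and ys ~ Q.
    For each x in xs, compare the distance to its k-th nearest neighbor
    among the other P-samples (rho) with that among the Q-samples (nu)."""
    n, m = len(xs), len(ys)
    total = 0.0
    for x in xs:
        # k-th nearest-neighbor distance among the remaining P-samples
        rho = sorted(abs(x - other) for other in xs if other is not x)[k - 1]
        # k-th nearest-neighbor distance among the Q-samples
        nu = sorted(abs(x - y) for y in ys)[k - 1]
        total += math.log(nu / rho)
    # dimension d = 1; the additive term corrects for the sample sizes
    return total / n + math.log(m / (n - 1))

# Example: P = N(0, 1), Q = N(1, 1), so the true KL divergence is 0.5.
random.seed(0)
xs = [random.gauss(0.0, 1.0) for _ in range(400)]
ys = [random.gauss(1.0, 1.0) for _ in range(400)]
est = knn_kl_estimate(xs, ys)
```

The brute-force neighbor search keeps the sketch dependency-free; a practical implementation would use a k-d tree or similar index for the nearest-neighbor queries.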
This list is automatically generated from the titles and abstracts of the papers in this site.