Hybrid and Generalized Bayesian Cram\'{e}r-Rao Inequalities via
Information Geometry
- URL: http://arxiv.org/abs/2104.01061v1
- Date: Fri, 2 Apr 2021 14:21:49 GMT
- Title: Hybrid and Generalized Bayesian Cram\'{e}r-Rao Inequalities via
Information Geometry
- Authors: Kumar Vijay Mishra and M. Ashok Kumar
- Abstract summary: This chapter summarizes recent results which extend this framework to more general Cram\'er-Rao inequalities.
We apply Eguchi's theory to a generalized form of Csisz\'ar $f$-divergence.
- Score: 15.33401602207049
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Information geometry is the study of statistical models from a Riemannian
geometric point of view. The Fisher information matrix plays the role of a
Riemannian metric in this framework. This tool helps us obtain Cram\'{e}r-Rao
lower bound (CRLB). This chapter summarizes the recent results which extend
this framework to more general Cram\'{e}r-Rao inequalities. We apply Eguchi's
theory to a generalized form of Csisz\'ar $f$-divergence to obtain a
Riemannian metric that simultaneously yields the deterministic CRLB, the
Bayesian CRLB, and their generalizations.
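The deterministic CRLB mentioned in the abstract states that any unbiased estimator's variance is bounded below by the inverse Fisher information. A minimal numerical sketch (not from the paper; the Gaussian-mean setup and all parameter values are illustrative assumptions) checks this for estimating the mean of $N(\mu, \sigma^2)$ with known $\sigma$, where the Fisher information per sample is $1/\sigma^2$ and the sample mean attains the bound:

```python
import numpy as np

# Illustrative check of the classical CRLB for estimating the mean mu of
# N(mu, sigma^2) with sigma known. Per-sample Fisher information is
# I(mu) = 1 / sigma^2, so for n i.i.d. samples the CRLB is sigma^2 / n.
rng = np.random.default_rng(0)
mu, sigma, n, trials = 2.0, 1.5, 50, 20000

fisher_per_sample = 1.0 / sigma**2
crlb = 1.0 / (n * fisher_per_sample)  # equals sigma^2 / n

# The sample mean is unbiased and efficient here, so its Monte Carlo
# variance should match the CRLB up to sampling error.
estimates = rng.normal(mu, sigma, size=(trials, n)).mean(axis=1)
empirical_var = estimates.var()

print(f"CRLB = {crlb:.5f}, empirical variance = {empirical_var:.5f}")
```

Because the sample mean is an efficient estimator in this model, the two printed numbers agree closely; for a biased or inefficient estimator the empirical variance would exceed the bound.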
Related papers
- RMLR: Extending Multinomial Logistic Regression into General Geometries [64.16104856124029]
Our framework only requires minimal geometric properties, thus exhibiting broad applicability.
We develop five families of SPD MLRs under five types of power-deformed metrics.
On rotation matrices we propose Lie MLR based on the popular bi-invariant metric.
arXiv Detail & Related papers (2024-09-28T18:38:21Z)
- Understanding Matrix Function Normalizations in Covariance Pooling through the Lens of Riemannian Geometry [63.694184882697435]
Global Covariance Pooling (GCP) has been demonstrated to improve the performance of Deep Neural Networks (DNNs) by exploiting second-order statistics of high-level representations.
arXiv Detail & Related papers (2024-07-15T07:11:44Z)
- Intrinsic Bayesian Cramér-Rao Bound with an Application to Covariance Matrix Estimation [49.67011673289242]
This paper presents a new performance bound for estimation problems where the parameter to estimate lies in a smooth manifold.
It induces a geometry for the parameter manifold, as well as an intrinsic notion of the estimation error measure.
arXiv Detail & Related papers (2023-11-08T15:17:13Z)
- Conformal inference for regression on Riemannian Manifolds [49.7719149179179]
We investigate prediction sets for regression scenarios when the response variable, denoted by $Y$, resides in a manifold, and the covariate, denoted by $X$, lies in Euclidean space.
We prove the almost sure convergence of the empirical version of these regions on the manifold to their population counterparts.
arXiv Detail & Related papers (2023-10-12T10:56:25Z)
- The Fisher-Rao geometry of CES distributions [50.50897590847961]
The Fisher-Rao information geometry allows for leveraging tools from differential geometry.
We will present some practical uses of these geometric tools in the framework of elliptical distributions.
arXiv Detail & Related papers (2023-10-02T09:23:32Z)
- Principal subbundles for dimension reduction [0.07515511160657122]
We show how sub-Riemannian geometry can be used for manifold learning and surface reconstruction.
We show that the framework is robust when applied to noisy data.
arXiv Detail & Related papers (2023-07-06T16:55:21Z)
- Bayesian Quadrature on Riemannian Data Manifolds [79.71142807798284]
A principled way to model nonlinear geometric structure inherent in data is provided.
However, these operations are typically computationally demanding.
In particular, we focus on Bayesian quadrature (BQ) to numerically compute integrals over normal laws.
We show that by leveraging both prior knowledge and an active exploration scheme, BQ significantly reduces the number of required evaluations.
arXiv Detail & Related papers (2021-02-12T17:38:04Z)
- Cram\'er-Rao Lower Bounds Arising from Generalized Csisz\'ar Divergences [17.746238062801293]
We study the geometry of probability distributions with respect to a generalized family of Csisz\'ar $f$-divergences.
We show that these formulations lead us to find unbiased and efficient estimators for the escort model.
arXiv Detail & Related papers (2020-01-14T13:41:13Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.