Random density matrices: Analytical results for mean root fidelity and
mean square Bures distance
- URL: http://arxiv.org/abs/2105.02743v2
- Date: Fri, 18 Nov 2022 12:23:03 GMT
- Title: Random density matrices: Analytical results for mean root fidelity and
mean square Bures distance
- Authors: Aritra Laha, Agrim Aggarwal, Santosh Kumar
- Abstract summary: Bures distance holds a special place among various distance measures due to its several distinguished features.
It is related to fidelity and, among other things, it serves as a bona fide measure for quantifying the separability of quantum states.
- Score: 1.0348777118111825
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Bures distance holds a special place among various distance measures due to
its several distinguished features and finds applications in diverse problems
in quantum information theory. It is related to fidelity and, among other
things, it serves as a bona fide measure for quantifying the separability of
quantum states. In this work, we calculate exact analytical results for the
mean root fidelity and mean square Bures distance between a fixed density
matrix and a random density matrix, and also between two random density
matrices. In the course of the derivation, we also obtain the spectral density
for the product of the above pairs of density matrices. We corroborate our analytical
results using Monte Carlo simulations. Moreover, we compare these results with
the mean square Bures distance between reduced density matrices generated using
coupled kicked tops and find very good agreement.
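The quantities in the abstract can be estimated numerically. Below is a minimal Monte Carlo sketch (not the authors' code), assuming random states drawn from the Hilbert-Schmidt ensemble (a normalized Wishart matrix built from a complex Ginibre matrix) and the standard definitions: root fidelity sqrt(F)(rho, sigma) = Tr sqrt(sqrt(rho) sigma sqrt(rho)) and squared Bures distance D_B^2 = 2 (1 - sqrt(F)).

```python
import numpy as np
from scipy.linalg import sqrtm

def random_density_matrix(n, rng):
    # Hilbert-Schmidt ensemble: rho = G G† / Tr(G G†), G a complex Ginibre matrix
    g = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
    w = g @ g.conj().T
    return w / np.trace(w).real

def root_fidelity(rho, sigma):
    # sqrt(F) = Tr sqrt( sqrt(rho) sigma sqrt(rho) )
    s = sqrtm(rho)
    return np.real(np.trace(sqrtm(s @ sigma @ s)))

def squared_bures_distance(rho, sigma):
    # D_B^2 = 2 (1 - sqrt(F)); lies in [0, 2]
    return 2.0 * (1.0 - root_fidelity(rho, sigma))

rng = np.random.default_rng(0)
n, trials = 4, 2000
fixed = np.eye(n) / n  # maximally mixed state as the fixed density matrix
samples = [squared_bures_distance(fixed, random_density_matrix(n, rng))
           for _ in range(trials)]
print(np.mean(samples))  # Monte Carlo estimate of the mean square Bures distance
```

The dimension, sample size, and choice of the maximally mixed state as the fixed matrix are illustrative; the paper's analytical results cover general fixed density matrices.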
Related papers
- Exact mean and variance of the squared Hellinger distance for random density matrices [11.495104812547021]
The Hellinger distance between quantum states is a significant measure in quantum information theory.
We propose an approximation for the corresponding probability density function based on the gamma distribution.
arXiv Detail & Related papers (2024-09-22T18:56:49Z)
- Graph Laplacian-based Bayesian Multi-fidelity Modeling [1.383698759122035]
A graph Laplacian constructed from the low-fidelity data is used to define a multivariate Gaussian prior density.
A few high-fidelity data points are used to construct a conjugate likelihood term.
The results demonstrate that by utilizing a small fraction of high-fidelity data, the multi-fidelity approach can significantly improve the accuracy of a large collection of low-fidelity data points.
arXiv Detail & Related papers (2024-09-12T16:51:55Z)
- Random density matrices: Analytical results for mean fidelity and variance of squared Bures distance [1.2225709246035374]
We derive exact results for the average fidelity and variance of the squared Bures distance between a fixed density matrix and a random density matrix.
The analytical results are corroborated using Monte Carlo simulations.
arXiv Detail & Related papers (2022-11-10T13:58:27Z)
- Density-Based Clustering with Kernel Diffusion [59.4179549482505]
A naive density corresponding to the indicator function of a unit $d$-dimensional Euclidean ball is commonly used in density-based clustering algorithms.
We propose a new kernel diffusion density function, which is adaptive to data of varying local distributional characteristics and smoothness.
arXiv Detail & Related papers (2021-10-11T09:00:33Z)
- Kernel distance measures for time series, random fields and other structured data [71.61147615789537]
kdiff is a novel kernel-based measure for estimating distances between instances of structured data.
It accounts for both self and cross similarities across the instances and is defined using a lower quantile of the distance distribution.
Some theoretical results are provided for separability conditions using kdiff as a distance measure for clustering and classification problems.
arXiv Detail & Related papers (2021-09-29T22:54:17Z)
- Mean-Square Analysis with An Application to Optimal Dimension Dependence of Langevin Monte Carlo [60.785586069299356]
This work provides a general framework for the non-asymptotic analysis of sampling error in the 2-Wasserstein distance.
Our theoretical analysis is further validated by numerical experiments.
arXiv Detail & Related papers (2021-09-08T18:00:05Z)
- Meta-Learning for Relative Density-Ratio Estimation [59.75321498170363]
Existing methods for (relative) density-ratio estimation (DRE) require many instances from both densities.
We propose a meta-learning method for relative DRE, which estimates the relative density-ratio from a few instances by using knowledge in related datasets.
We empirically demonstrate the effectiveness of the proposed method by using three problems: relative DRE, dataset comparison, and outlier detection.
arXiv Detail & Related papers (2021-07-02T02:13:45Z)
- A Note on Optimizing Distributions using Kernel Mean Embeddings [94.96262888797257]
Kernel mean embeddings represent probability measures by their infinite-dimensional mean embeddings in a reproducing kernel Hilbert space.
We show that when the kernel is characteristic, distributions with a kernel sum-of-squares density are dense.
We provide algorithms to optimize such distributions in the finite-sample setting.
arXiv Detail & Related papers (2021-06-18T08:33:45Z)
- Spectral statistics for the difference of two Wishart matrices [1.2225709246035374]
We derive the joint probability density function of the corresponding eigenvalues in the finite-dimensional setting using two distinct approaches.
We relate these results to the corresponding results for the difference of two random density matrices and obtain explicit, closed-form expressions for the spectral density and absolute mean.
arXiv Detail & Related papers (2020-11-14T18:43:34Z) - Nonparametric Density Estimation from Markov Chains [68.8204255655161]
We introduce a new nonparametric density estimator inspired by Markov chains, generalizing the well-known Kernel Density Estimator.
Our estimator presents several benefits with respect to the usual ones and can be used straightforwardly as a foundation in all density-based algorithms.
arXiv Detail & Related papers (2020-09-08T18:33:42Z)
- Wishart and random density matrices: Analytical results for the mean-square Hilbert-Schmidt distance [1.2225709246035374]
We calculate exact and compact results for the mean square Hilbert-Schmidt distance between a random density matrix and a fixed density matrix.
We also obtain corresponding exact results for the distance between a Wishart matrix and a fixed Hermitian matrix, and two Wishart matrices.
arXiv Detail & Related papers (2020-08-12T07:49:12Z)
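For comparison with the last entry, the squared Hilbert-Schmidt distance Tr[(rho - sigma)^2] admits the same Monte Carlo treatment. The sketch below is hypothetical illustration (not code from any of the listed papers), again assuming random states from the Hilbert-Schmidt ensemble:

```python
import numpy as np

def random_density_matrix(n, rng):
    # Hilbert-Schmidt ensemble: rho = G G† / Tr(G G†), G a complex Ginibre matrix
    g = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
    w = g @ g.conj().T
    return w / np.trace(w).real

def squared_hs_distance(rho, sigma):
    # D_HS^2 = Tr[(rho - sigma)^2]
    d = rho - sigma
    return np.real(np.trace(d @ d))

rng = np.random.default_rng(1)
n, trials = 3, 5000
vals = [squared_hs_distance(random_density_matrix(n, rng),
                            random_density_matrix(n, rng))
        for _ in range(trials)]
print(np.mean(vals))  # Monte Carlo estimate of the mean-square Hilbert-Schmidt distance
```

Unlike the Bures case, no matrix square roots are needed, which is why Hilbert-Schmidt averages are typically the easiest of these distance moments to compute.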
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences arising from its use.