Random density matrices: Analytical results for mean fidelity and
variance of squared Bures distance
- URL: http://arxiv.org/abs/2211.05587v1
- Date: Thu, 10 Nov 2022 13:58:27 GMT
- Title: Random density matrices: Analytical results for mean fidelity and
variance of squared Bures distance
- Authors: Aritra Laha and Santosh Kumar
- Abstract summary: We derive exact results for the average fidelity and variance of the squared Bures distance between a fixed density matrix and a random density matrix.
The analytical results are corroborated using Monte Carlo simulations.
- Score: 1.2225709246035374
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: One of the key issues in quantum information theory concerns the
distinguishability of quantum states. In this context, Bures
distance serves as one of the foremost choices among various distance measures.
It also relates to fidelity, which is another quantity of immense importance in
quantum information theory. In this work, we derive exact results for the
average fidelity and variance of the squared Bures distance between a fixed
density matrix and a random density matrix, and also between two independent
random density matrices. These results supplement the recently obtained results
for the mean root fidelity and mean of squared Bures distance [Phys. Rev. A
104, 022438 (2021)]. The availability of both mean and variance also enables us
to provide a gamma-distribution-based approximation for the probability density
of the squared Bures distance. The analytical results are corroborated using
Monte Carlo simulations. Furthermore, we compare our analytical results with
the mean and variance of the squared Bures distance between reduced density
matrices generated using coupled kicked tops, and a correlated spin chain
system in a random magnetic field. In both cases, we find good agreement.
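The quantities discussed in the abstract can be checked numerically. Below is a minimal Monte Carlo sketch: it samples random density matrices from the Hilbert-Schmidt (trace-normalized complex Ginibre) ensemble, computes the squared Bures distance to a fixed density matrix via the standard fidelity formula, and moment-matches a gamma approximation to the resulting samples. The ensemble choice, the maximally mixed state as the fixed matrix, and all function names are illustrative assumptions, not the paper's exact setup.

```python
import numpy as np
from scipy.linalg import sqrtm

def random_density_matrix(n, rng):
    # Hilbert-Schmidt ensemble: rho = G G^dagger / tr(G G^dagger),
    # with G a complex Ginibre matrix (illustrative choice)
    g = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
    w = g @ g.conj().T
    return w / np.trace(w).real

def fidelity(rho, sigma):
    # F(rho, sigma) = (tr sqrt( sqrt(rho) sigma sqrt(rho) ))^2
    s = sqrtm(rho)
    m = sqrtm(s @ sigma @ s)
    return np.real(np.trace(m)) ** 2

def squared_bures_distance(rho, sigma):
    # D_B^2(rho, sigma) = 2 * (1 - sqrt(F(rho, sigma)))
    return 2.0 * (1.0 - np.sqrt(fidelity(rho, sigma)))

rng = np.random.default_rng(0)
n = 4
sigma = np.eye(n) / n  # fixed density matrix: maximally mixed state
samples = np.array([
    squared_bures_distance(random_density_matrix(n, rng), sigma)
    for _ in range(2000)
])
mean, var = samples.mean(), samples.var()

# Gamma approximation by moment matching:
# shape k = mean^2 / var, scale theta = var / mean
k, theta = mean ** 2 / var, var / mean
```

With both moments in hand, `scipy.stats.gamma(a=k, scale=theta)` gives the approximate probability density of the squared Bures distance in the spirit of the paper's gamma-based approximation.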
Related papers
- Exact mean and variance of the squared Hellinger distance for random density matrices [11.495104812547021]
The Hellinger distance between quantum states is a significant measure in quantum information theory.
We propose an approximation for the corresponding probability density function based on the gamma distribution.
arXiv Detail & Related papers (2024-09-22T18:56:49Z)
- Anomaly Detection with Variance Stabilized Density Estimation [49.46356430493534]
We present a variance-stabilized density estimation problem for maximizing the likelihood of the observed samples.
To obtain a reliable anomaly detector, we introduce a spectral ensemble of autoregressive models for learning the variance-stabilized distribution.
We have conducted an extensive benchmark with 52 datasets, demonstrating that our method leads to state-of-the-art results.
arXiv Detail & Related papers (2023-06-01T11:52:58Z)
- Machine-Learned Exclusion Limits without Binning [0.0]
We extend the Machine-Learned Likelihoods (MLL) method by including Kernel Density Estimators (KDE) to extract one-dimensional signal and background probability density functions.
We apply the method to two cases of interest at the LHC: a search for exotic Higgs bosons, and a $Z'$ boson decaying into lepton pairs.
arXiv Detail & Related papers (2022-11-09T11:04:50Z)
- Density Ratio Estimation via Infinitesimal Classification [85.08255198145304]
We propose DRE-infty, a divide-and-conquer approach to reduce Density ratio estimation (DRE) to a series of easier subproblems.
Inspired by Monte Carlo methods, we smoothly interpolate between the two distributions via an infinite continuum of intermediate bridge distributions.
We show that our approach performs well on downstream tasks such as mutual information estimation and energy-based modeling on complex, high-dimensional datasets.
arXiv Detail & Related papers (2021-11-22T06:26:29Z)
- Tangent Space and Dimension Estimation with the Wasserstein Distance [10.118241139691952]
Consider a set of points sampled independently near a smooth compact submanifold of Euclidean space.
We provide mathematically rigorous bounds on the number of sample points required to estimate both the dimension and the tangent spaces of that manifold.
arXiv Detail & Related papers (2021-10-12T21:02:06Z)
- Kernel distance measures for time series, random fields and other structured data [71.61147615789537]
kdiff is a novel kernel-based measure for estimating distances between instances of structured data.
It accounts for both self and cross similarities across the instances and is defined using a lower quantile of the distance distribution.
Some theoretical results are provided for separability conditions using kdiff as a distance measure for clustering and classification problems.
arXiv Detail & Related papers (2021-09-29T22:54:17Z)
- Meta-Learning for Relative Density-Ratio Estimation [59.75321498170363]
Existing methods for (relative) density-ratio estimation (DRE) require many instances from both densities.
We propose a meta-learning method for relative DRE, which estimates the relative density-ratio from a few instances by using knowledge in related datasets.
We empirically demonstrate the effectiveness of the proposed method by using three problems: relative DRE, dataset comparison, and outlier detection.
arXiv Detail & Related papers (2021-07-02T02:13:45Z)
- A Note on Optimizing Distributions using Kernel Mean Embeddings [94.96262888797257]
Kernel mean embeddings represent probability measures by their infinite-dimensional mean embeddings in a reproducing kernel Hilbert space.
We show that when the kernel is characteristic, distributions with a kernel sum-of-squares density are dense.
We provide algorithms to optimize such distributions in the finite-sample setting.
arXiv Detail & Related papers (2021-06-18T08:33:45Z)
- Random density matrices: Analytical results for mean root fidelity and mean square Bures distance [1.0348777118111825]
Bures distance holds a special place among various distance measures due to its several distinguished features.
It is related to fidelity and, among other things, it serves as a bona fide measure for quantifying the separability of quantum states.
arXiv Detail & Related papers (2021-05-06T15:13:14Z)
- Wishart and random density matrices: Analytical results for the mean-square Hilbert-Schmidt distance [1.2225709246035374]
We calculate exact and compact results for the mean square Hilbert-Schmidt distance between a random density matrix and a fixed density matrix.
We also obtain corresponding exact results for the distance between a Wishart matrix and a fixed Hermitian matrix, and two Wishart matrices.
arXiv Detail & Related papers (2020-08-12T07:49:12Z)
- Rethink Maximum Mean Discrepancy for Domain Adaptation [77.2560592127872]
This paper theoretically proves two essential facts: 1) minimizing the Maximum Mean Discrepancy is equivalent to maximizing the source and target intra-class distances while jointly minimizing their variance with some implicit weights, so that feature discriminability degrades.
Experiments on several benchmark datasets not only confirm the validity of the theoretical results but also demonstrate that our approach substantially outperforms comparable state-of-the-art methods.
arXiv Detail & Related papers (2020-07-01T18:25:10Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.