Uncertainty relations for unified ($α$,$β$)-relative entropy of coherence under mutually unbiased equiangular tight frames
- URL: http://arxiv.org/abs/2506.09779v1
- Date: Wed, 11 Jun 2025 14:18:46 GMT
- Title: Uncertainty relations for unified ($α$,$β$)-relative entropy of coherence under mutually unbiased equiangular tight frames
- Authors: Baolong Cheng, Zhaoqi Wu,
- Abstract summary: Uncertainty relations based on quantum coherence are an important problem in quantum information science. We discuss uncertainty relations for averaged unified ($\alpha$,$\beta$)-relative entropy of coherence under mutually unbiased equiangular tight frames.
- Score: 0.0
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Uncertainty relations based on quantum coherence are an important problem in quantum information science. We discuss uncertainty relations for the averaged unified ($\alpha$,$\beta$)-relative entropy of coherence under mutually unbiased equiangular tight frames, and derive an uncertainty relation valid for different parameters. As consequences, we obtain corresponding results under mutually unbiased bases, under equiangular tight frames, and for coherence quantifiers based on Tsallis $\alpha$-relative entropies and R\'enyi-$\alpha$ relative entropies. We illustrate the derived inequalities by explicit examples in two-dimensional spaces, showing that the lower bounds can be regarded as good approximations to the averaged coherence quantifiers under certain circumstances.
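For orientation, a commonly used form of the unified ($\alpha$,$\beta$)-relative entropy and the coherence quantifier it induces is sketched below; whether the paper adopts exactly this normalization and parameter range is an assumption here, not a statement of its definitions.
$$ D_{\alpha}^{\beta}(\rho\|\sigma) \;=\; \frac{1}{\beta(\alpha-1)}\left\{\left[\mathrm{Tr}\!\left(\rho^{\alpha}\sigma^{1-\alpha}\right)\right]^{\beta}-1\right\},\qquad \alpha>0,\ \alpha\neq 1,\ \beta\neq 0, $$
with the coherence measure $C_{\alpha}^{\beta}(\rho)=\min_{\delta\in\mathcal{I}} D_{\alpha}^{\beta}(\rho\|\delta)$, where $\mathcal{I}$ is the set of incoherent states. Setting $\beta=1$ recovers the Tsallis $\alpha$-relative entropy, and letting $\beta\to 0$ gives the R\'enyi-$\alpha$ relative entropy $\frac{1}{\alpha-1}\ln\mathrm{Tr}(\rho^{\alpha}\sigma^{1-\alpha})$, which is why the corresponding coherence results follow as special cases.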
Related papers
- An entropic uncertainty principle for mixed states [0.0]
We provide a family of generalizations of the entropic uncertainty principle.
Results can be used to certify entanglement between trusted parties, or to bound the entanglement of a system with an untrusted environment.
arXiv Detail & Related papers (2023-03-20T18:31:53Z) - The Shrinkage-Delinkage Trade-off: An Analysis of Factorized Gaussian Approximations for Variational Inference [3.167685495996986]
We consider two popular ways to measure the uncertainty deficit of variational inference (VI).
We prove that $q$ always underestimates both the componentwise variance and the entropy of $p$.
We study various manifestations of this trade-off, notably one where, as the dimension of the problem grows, the per-component entropy gap becomes vanishingly small.
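As an illustration of this claim (a standard Gaussian calculation, not taken from the paper): if the target is $p=\mathcal{N}(\mu,\Sigma)$ with precision $\Lambda=\Sigma^{-1}$, the factorized Gaussian $q$ minimizing $\mathrm{KL}(q\|p)$ has componentwise variances $1/\Lambda_{ii}\le\Sigma_{ii}$, and the entropy gap is
$$ H(p)-H(q) \;=\; \tfrac{1}{2}\log\frac{\prod_i \Lambda_{ii}}{\det\Lambda} \;\ge\; 0, $$
which is non-negative by Hadamard's inequality, so $q$ underestimates both the marginal variances and the entropy.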
arXiv Detail & Related papers (2023-02-17T22:21:47Z) - Better Heisenberg limits, coherence bounds, and energy-time tradeoffs via quantum R\'enyi information [0.0]
An uncertainty relation for the R\'enyi entropies of conjugate quantum observables is used.
$f(\alpha)$ is maximised for non-Shannon entropies.
arXiv Detail & Related papers (2022-10-26T10:40:54Z) - Tight Exponential Analysis for Smoothing the Max-Relative Entropy and for Quantum Privacy Amplification [56.61325554836984]
The max-relative entropy together with its smoothed version is a basic tool in quantum information theory.
We derive the exact exponent for the decay of the small modification of the quantum state in smoothing the max-relative entropy based on purified distance.
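For reference, the quantities involved are standardly defined as
$$ D_{\max}(\rho\|\sigma) \;=\; \log\min\{\lambda:\rho\le\lambda\sigma\},\qquad D_{\max}^{\epsilon}(\rho\|\sigma) \;=\; \min_{\tilde{\rho}:\,P(\tilde{\rho},\rho)\le\epsilon} D_{\max}(\tilde{\rho}\|\sigma), $$
where $P(\rho,\sigma)=\sqrt{1-F(\rho,\sigma)^{2}}$ is the purified distance and $F$ is the fidelity; these textbook definitions are quoted here only as background.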
arXiv Detail & Related papers (2021-11-01T16:35:41Z) - Entropic uncertainty relations for mutually unbiased periodic coarse-grained observables resemble their discrete counterparts [0.0]
For two mutually unbiased measurements in a $d$-dimensional system, the sum of the two information entropies is lower bounded by $\ln d$.
It has recently been shown that projective measurements subject to operational mutual unbiasedness can also be constructed in a continuous domain.
Here we consider the whole family of R\'enyi entropies applied to these discretized observables and prove that such a scheme also admits the uncertainty relation mentioned above.
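For context, the discrete relation being generalized is the Maassen-Uffink bound specialized to mutually unbiased bases, where the overlap is $c=\max_{i,j}|\langle a_i|b_j\rangle|=1/\sqrt{d}$:
$$ H_{\alpha}(A)+H_{\beta}(B) \;\ge\; -2\ln c \;=\; \ln d, \qquad \frac{1}{\alpha}+\frac{1}{\beta}=2, $$
with $\alpha=\beta=1$ giving the Shannon-entropy case stated above; this standard form is quoted as background, while the paper's contribution is its continuous, coarse-grained analogue.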
arXiv Detail & Related papers (2021-07-26T18:48:26Z) - R\'enyi divergence inequalities via interpolation, with applications to generalised entropic uncertainty relations [91.3755431537592]
We investigate quantum R\'enyi entropic quantities, specifically those derived from the 'sandwiched' divergence.
We present R\'enyi mutual information decomposition rules, a new approach to the R\'enyi conditional entropy tripartite chain rules, and a more general bipartite comparison.
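For reference, the sandwiched R\'enyi divergence referred to above has the standard form
$$ \widetilde{D}_{\alpha}(\rho\|\sigma) \;=\; \frac{1}{\alpha-1}\log\mathrm{Tr}\!\left[\left(\sigma^{\frac{1-\alpha}{2\alpha}}\,\rho\,\sigma^{\frac{1-\alpha}{2\alpha}}\right)^{\alpha}\right], $$
quoted here only to fix notation.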
arXiv Detail & Related papers (2021-06-19T04:06:23Z) - Spatially relaxed inference on high-dimensional linear models [48.989769153211995]
We study the properties of ensembled clustered inference algorithms which combine spatially constrained clustering, statistical inference, and ensembling to aggregate several clustered inference solutions.
We show that ensembled clustered inference algorithms control the $\delta$-FWER under standard assumptions for $\delta$ equal to the largest cluster diameter.
arXiv Detail & Related papers (2021-06-04T16:37:19Z) - Attainability and lower semi-continuity of the relative entropy of entanglement, and variations on the theme [8.37609145576126]
The relative entropy of entanglement $E_R$ is defined as the distance of a multi-partite quantum state from the set of separable states, as measured by the quantum relative entropy.
We show that this distance is always achieved, i.e. any state admits a closest separable state, even in infinite dimensions; also, $E_R$ is everywhere lower semi-continuous.
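For reference, the quantity in question is standardly defined as
$$ E_{R}(\rho) \;=\; \inf_{\sigma\in\mathcal{SEP}} D(\rho\|\sigma),\qquad D(\rho\|\sigma)=\mathrm{Tr}\left[\rho\left(\log\rho-\log\sigma\right)\right], $$
and the attainability statement is that this infimum is in fact a minimum, i.e. a closest separable state exists, while lower semi-continuity refers to the map $\rho\mapsto E_{R}(\rho)$.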
arXiv Detail & Related papers (2021-05-17T18:03:02Z) - Sequential Estimation of Convex Divergences using Reverse Submartingales
and Exchangeable Filtrations [31.088836418378534]
We present a unified technique for sequential estimation of convex divergences between distributions.
The technical underpinnings of our approach lie in the observation that empirical convex divergences are (partially ordered) reverse submartingales.
These techniques appear to be powerful additions to the existing literature on both confidence sequences and convex divergences.
arXiv Detail & Related papers (2021-03-16T18:22:14Z) - Relative entropic uncertainty relation [0.0]
We find that a sum of relative entropies is bounded from above in a nontrivial way.
This type of entropic uncertainty relation can be applied directly to observables with either discrete or continuous spectra.
arXiv Detail & Related papers (2020-12-18T07:19:25Z) - A Weaker Faithfulness Assumption based on Triple Interactions [89.59955143854556]
We propose a weaker assumption that we call $2$-adjacency faithfulness.
We propose a sound orientation rule for causal discovery that applies under weaker assumptions.
arXiv Detail & Related papers (2020-10-27T13:04:08Z) - Entropy and relative entropy from information-theoretic principles [24.74754293747645]
We find that every relative entropy must lie between the R\'enyi divergences of order $0$ and $\infty$.
Our main result is a one-to-one correspondence between entropies and relative entropies.
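For reference, the two extremal orders mentioned above are standardly given by
$$ D_{0}(\rho\|\sigma) \;=\; -\log\mathrm{Tr}\left[\Pi_{\rho}\,\sigma\right],\qquad D_{\infty}(\rho\|\sigma) \;=\; \log\min\{\lambda:\rho\le\lambda\sigma\}, $$
where $\Pi_{\rho}$ is the projector onto the support of $\rho$ and $D_{\infty}$ coincides with the max-relative entropy; the stated result is that every admissible relative entropy $D$ satisfies $D_{0}\le D\le D_{\infty}$.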
arXiv Detail & Related papers (2020-06-19T14:50:44Z) - Generalized Bayesian Cram\'{e}r-Rao Inequality via Information Geometry
of Relative $\alpha$-Entropy [17.746238062801293]
Relative $\alpha$-entropy is the R\'enyi analog of relative entropy.
Recent information geometric investigations on this quantity have enabled the generalization of the Cram\'er-Rao inequality.
We show that in the limiting case when the entropy order approaches unity, this framework reduces to the conventional Bayesian Cram\'er-Rao inequality.
arXiv Detail & Related papers (2020-02-11T23:38:01Z)
This list is automatically generated from the titles and abstracts of the papers on this site.