Maximal intrinsic randomness of a quantum state
- URL: http://arxiv.org/abs/2307.15708v2
- Date: Thu, 11 Jul 2024 17:16:35 GMT
- Title: Maximal intrinsic randomness of a quantum state
- Authors: Shuyang Meng, Fionnuala Curran, Gabriel Senno, Victoria J. Wright, Máté Farkas, Valerio Scarani, Antonio Acín
- Abstract summary: Quantum information science has greatly progressed in the study of intrinsic, or secret, quantum randomness in the past decade.
We answer this question for three different randomness quantifiers: the conditional min-entropy, the conditional von Neumann entropy and the conditional max-entropy.
For the conditional von Neumann entropy, the maximal value is $H^{*}=\log_{2}d-S(\rho)$, with $S(\rho)$ the von Neumann entropy of $\rho$, while for the conditional max-entropy, the maximal value is $H^{*}_{\textrm{max}}=\log_{2}d+\log_{2}\lambda_{\textrm{max}}(\rho)$, with $\lambda_{\textrm{max}}(\rho)$ the largest eigenvalue of $\rho$.
- Score: 1.0470286407954037
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: One of the most counterintuitive aspects of quantum theory is its claim that there is 'intrinsic' randomness in the physical world. Quantum information science has greatly progressed in the study of intrinsic, or secret, quantum randomness in the past decade. With much emphasis on device-independent and semi-device-independent bounds, one of the most basic questions has escaped attention: how much intrinsic randomness can be extracted from a given state $\rho$, and what measurements achieve this bound? We answer this question for three different randomness quantifiers: the conditional min-entropy, the conditional von Neumann entropy and the conditional max-entropy. For the first, we solve the min-max problem of finding the projective measurement that minimises the maximal guessing probability of an eavesdropper. The result is that one can guarantee an amount of conditional min-entropy $H^*_{\textrm{min}}=-\log_2 P_{\textrm{guess}}^{*}(\rho)$ with $P_{\textrm{guess}}^{*}(\rho)=\frac{1}{d}(\textrm{tr} \sqrt{\rho})^2$ by performing suitable projective measurements. For the conditional von Neumann entropy, we find that the maximal value is $H^{*}= \log_{2}d-S(\rho)$, with $S(\rho)$ the von Neumann entropy of $\rho$, while for the conditional max-entropy, we find the maximal value $H^{*}_\textrm{max}=\log_{2}d + \log_{2}\lambda_{\textrm{max}}(\rho)$, where $\lambda_{\textrm{max}}(\rho)$ is the largest eigenvalue of $\rho$. Optimal values for $H^{*}_{\textrm{min}}$, $H^{*}$ and $H^{*}_\textrm{max}$ are achieved by measuring in any basis that is unbiased to the eigenbasis of $\rho$, as well as by other, less intuitive, measurements.
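All three optimal values quoted in the abstract depend only on the spectrum of $\rho$, so they are straightforward to evaluate numerically. A minimal sketch of that computation (assuming NumPy; the function name `entropy_bounds` is illustrative, not from the paper):

```python
import numpy as np

def entropy_bounds(rho):
    """Evaluate the three maximal-randomness values from the abstract
    for a d-dimensional density matrix rho."""
    d = rho.shape[0]
    evals = np.linalg.eigvalsh(rho)
    # Conditional min-entropy: H*_min = -log2 P*_guess,
    # with P*_guess = (tr sqrt(rho))^2 / d
    p_guess = np.sum(np.sqrt(np.clip(evals, 0.0, None))) ** 2 / d
    h_min = -np.log2(p_guess)
    # Conditional von Neumann entropy: H* = log2 d - S(rho),
    # where S(rho) is the Shannon entropy of the eigenvalues
    nz = evals[evals > 1e-12]          # 0*log(0) = 0 convention
    h_vn = np.log2(d) + np.sum(nz * np.log2(nz))
    # Conditional max-entropy: H*_max = log2 d + log2 lambda_max(rho)
    h_max = np.log2(d) + np.log2(np.max(evals))
    return h_min, h_vn, h_max

# Qubit example: rho = diag(3/4, 1/4)
h_min, h_vn, h_max = entropy_bounds(np.diag([0.75, 0.25]))
```

For this example the three values come out ordered as $H^{*}_{\textrm{min}}\leq H^{*}\leq H^{*}_{\textrm{max}}$, as expected from the general relation between the three entropies; a pure state yields all three equal to $\log_2 d$, and the maximally mixed state yields zero for all three (an eavesdropper may hold its purification).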
Related papers
- Von Neumann Entropy and Quantum Algorithmic Randomness [0.0]
A state $\rho=(\rho_n)_{n=1}^{\infty}$ is a sequence such that $\rho_n$ is a density matrix on $n$ qubits.
The von Neumann entropy $H(d)$ of a density matrix $d$ is the Shannon entropy of its eigenvalue distribution.
arXiv Detail & Related papers (2024-12-24T18:09:45Z)
- On estimating the trace of quantum state powers [2.637436382971936]
We investigate the computational complexity of estimating the trace of quantum state powers $\text{tr}(\rho^q)$ for an $n$-qubit mixed quantum state $\rho$.
Our speedup is achieved by introducing efficiently computable uniform approximations of positive power functions into quantum singular value transformation.
arXiv Detail & Related papers (2024-10-17T13:57:13Z)
- Fast Rates for Maximum Entropy Exploration [52.946307632704645]
We address the challenge of exploration in reinforcement learning (RL) when the agent operates in an unknown environment with sparse or no rewards.
We study two different types of the maximum entropy exploration problem.
For visitation entropy, we propose a game-theoretic algorithm that has $\widetilde{\mathcal{O}}(H^3S^2A/\varepsilon^2)$ sample complexity.
For the trajectory entropy, we propose a simple algorithm that has a sample complexity of order $\widetilde{\mathcal{O}}(\mathrm{poly}(S,\ldots))$.
arXiv Detail & Related papers (2023-03-14T16:51:14Z)
- A Quantum Algorithm Framework for Discrete Probability Distributions with Applications to Rényi Entropy Estimation [13.810917492304565]
We propose a unified quantum algorithm framework for estimating properties of discrete probability distributions.
Our framework estimates the $\alpha$-Rényi entropy $H_{\alpha}(p)$ to within additive error $\epsilon$ with probability at least $2/3$.
arXiv Detail & Related papers (2022-12-03T08:01:55Z)
- Near-optimal fitting of ellipsoids to random points [68.12685213894112]
A basic problem of fitting an ellipsoid to random points has connections to low-rank matrix decompositions, independent component analysis, and principal component analysis.
We resolve this conjecture up to logarithmic factors by constructing a fitting ellipsoid for some $n = \Omega(d^2/\mathrm{polylog}(d))$.
Our proof demonstrates feasibility of the least squares construction of Saunderson et al. using a convenient decomposition of a certain non-standard random matrix.
arXiv Detail & Related papers (2022-08-19T18:00:34Z) - Sublinear quantum algorithms for estimating von Neumann entropy [18.30551855632791]
We study the problem of obtaining estimates to within a multiplicative factor $gamma>1$ of the Shannon entropy of probability distributions and the von Neumann entropy of mixed quantum states.
We work with the quantum purified query access model, which can handle both classical probability distributions and mixed quantum states, and is the most general input model considered in the literature.
arXiv Detail & Related papers (2021-11-22T12:00:45Z)
- Unique Games hardness of Quantum Max-Cut, and a conjectured vector-valued Borell's inequality [6.621324975749854]
We show that the noise stability of a function $f:\mathbb{R}^n \to \{-1, 1\}$ is the expected value of $f(\boldsymbol{x}) \cdot f(\boldsymbol{y})$.
We conjecture that the expected value of $\langle f(\boldsymbol{x}), f(\boldsymbol{y})\rangle$ is minimized by the function $f(x) = x_{\leq k}/\Vert x_{\leq k}\Vert$.
arXiv Detail & Related papers (2021-11-01T20:45:42Z)
- Random matrices in service of ML footprint: ternary random features with no performance loss [55.30329197651178]
We show that the eigenspectrum of $\mathbf{K}$ is independent of the distribution of the i.i.d. entries of $\mathbf{w}$.
We propose a novel random features technique, called Ternary Random Feature (TRF).
The computation of the proposed random features requires no multiplication and a factor of $b$ less bits for storage compared to classical random features.
arXiv Detail & Related papers (2021-10-05T09:33:49Z)
- Optimal Mean Estimation without a Variance [103.26777953032537]
We study the problem of heavy-tailed mean estimation in settings where the variance of the data-generating distribution does not exist.
We design an estimator which attains the smallest possible confidence interval as a function of $n, d, \delta$.
arXiv Detail & Related papers (2020-11-24T22:39:21Z)
- An Optimal Separation of Randomized and Quantum Query Complexity [67.19751155411075]
We prove that for every decision tree, the absolute values of the Fourier coefficients of a given order $\ell$ sum to at most $c^{\ell}\sqrt{\binom{d}{\ell}}\,(1+\log n)^{\ell-1}$, where $n$ is the number of variables, $d$ is the tree depth, and $c>0$ is an absolute constant.
arXiv Detail & Related papers (2020-08-24T06:50:57Z)
- Maximizing Determinants under Matroid Constraints [69.25768526213689]
We study the problem of finding a basis $S$ of $M$ such that $\det(\sum_{i \in S} v_i v_i^{\top})$ is maximized.
This problem appears in a diverse set of areas such as experimental design, fair allocation of goods, network design, and machine learning.
arXiv Detail & Related papers (2020-04-16T19:16:38Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences of its use.