Quantum simulations of macrorealism violation via the QNDM protocol
- URL: http://arxiv.org/abs/2502.17040v1
- Date: Mon, 24 Feb 2025 10:47:52 GMT
- Title: Quantum simulations of macrorealism violation via the QNDM protocol
- Authors: D. Melegari, M. Cardi, P. Solinas
- Abstract summary: We compare the non-demolition approach with the Leggett-Garg inequalities. We find that the non-demolition approach is incredibly robust and its efficiency remains unchanged by the noise. These results make the non-demolition approach a viable alternative to identify the violation of macrorealism.
- Score: 0.0
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: The Leggett-Garg inequalities have been proposed to identify the quantum behaviour of a system; specifically, the violation of macrorealism. They are usually implemented by performing two sequential measurements on a quantum system, calculating the correlators of these measurements, and then combining them to arrive at the Leggett-Garg inequalities. However, this approach provides only sufficient conditions for the violation of macrorealism. Recently, an alternative approach was proposed that uses non-demolition measurements and gives both a necessary and sufficient condition for the violation of macrorealism. By storing the information in a quantum detector, it is possible to construct a quasi-probability distribution whose negative regions unequivocally identify the quantum behaviour of the system. Here, we perform a detailed comparison between these two approaches. The use of the IBM quantum simulators allows us to evaluate the performance in real-case situations and to include both statistical and environmental noise. We find that the non-demolition approach not only always identifies the quantum features but also requires fewer resources than the standard Leggett-Garg inequalities. In addition, while the efficiency of the latter is strongly affected by the presence of noise, the non-demolition approach proves remarkably robust and its efficiency remains unchanged by the noise. These results make the non-demolition approach a viable alternative to the Leggett-Garg inequalities for identifying the violation of macrorealism.
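For context, the sketch below illustrates the standard Leggett-Garg protocol the abstract refers to: two sequential sigma_z measurements on a single qubit are used to estimate two-time correlators C_ij, which are combined into the three-time Leggett-Garg parameter K3 = C21 + C32 - C31, bounded by 1 under macrorealism. This is a minimal illustrative sketch, not the authors' implementation and not the QNDM protocol; the single-qubit Rx evolution, the time grid, and the Qiskit Aer backend are assumptions chosen for simplicity.

```python
# Hedged sketch (not the authors' code): estimating the two-time correlators
# C_ij = <Q(t_i) Q(t_j)> of the standard Leggett-Garg test on a Qiskit Aer
# simulator. The Rx evolution and the chosen times are illustrative assumptions;
# the QNDM quasi-probability protocol is not implemented here.
import numpy as np
from qiskit import QuantumCircuit, transpile
from qiskit_aer import AerSimulator

def correlator(theta_i, theta_j, shots=8192):
    """Correlator of two sequential sigma_z measurements, with Rx evolution
    by theta_i before the first and by (theta_j - theta_i) between them."""
    qc = QuantumCircuit(1, 2)
    qc.rx(theta_i, 0)            # evolve from t_0 to t_i
    qc.measure(0, 0)             # first (projective, invasive) measurement of Q
    qc.rx(theta_j - theta_i, 0)  # evolve from t_i to t_j
    qc.measure(0, 1)             # second measurement of Q
    backend = AerSimulator()
    counts = backend.run(transpile(qc, backend), shots=shots).result().get_counts()
    c = 0.0
    for bits, n in counts.items():   # leftmost bit = second measurement outcome
        q_j = 1 - 2 * int(bits[0])
        q_i = 1 - 2 * int(bits[1])
        c += q_i * q_j * n / shots
    return c

# Macrorealism implies K3 = C21 + C32 - C31 <= 1. Equally spaced times with
# rotation angle pi/4 give K3 ~ sqrt(2) > 1 in the ideal, noiseless case.
t = np.pi / 4
K3 = correlator(0.0, t) + correlator(t, 2 * t) - correlator(0.0, 2 * t)
print("K3 =", K3)
```

In the noiseless limit this construction already violates the macrorealist bound; the comparison in the paper concerns how statistical and environmental noise degrade such a violation relative to the negativity criterion of the non-demolition (QNDM) quasi-probability distribution.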
Related papers
- Bayesian Quantum Amplitude Estimation [49.1574468325115]
We introduce BAE, a noise-aware Bayesian algorithm for quantum amplitude estimation. We show that BAE achieves Heisenberg-limited estimation and benchmark it against other approaches.
arXiv Detail & Related papers (2024-12-05T18:09:41Z)
- Quantum Non-Demolition Measurements and Leggett-Garg inequality [0.0]
Quantum non-demolition measurements define a non-invasive protocol to extract information from a quantum system.
This protocol leads to a quasi-probability distribution for the measured observable outcomes, which can be negative.
We show that there are situations in which Leggett-Garg inequalities are satisfied even if the macrorealism condition is violated.
arXiv Detail & Related papers (2024-07-31T18:04:51Z)
- Semi-device independent nonlocality certification for near-term quantum networks [46.37108901286964]
Bell tests are the most rigorous method for verifying entanglement in quantum networks.
If there is any signaling between the parties, then the violation of Bell inequalities can no longer be used.
We propose a semi-device independent protocol that allows us to numerically correct for effects of correlations in experimental probability distributions.
arXiv Detail & Related papers (2023-05-23T14:39:08Z)
- Enhancing Quantum Computation via Superposition of Quantum Gates [1.732837834702512]
We present different protocols, which we denote as "superposed quantum error mitigation".
We show that significant noise suppression can be achieved for most kinds of decoherence and standard experimental parameter regimes.
We analyze our approach for gate-based, measurement-based and interferometric-based models.
arXiv Detail & Related papers (2023-04-17T18:01:05Z)
- Potential and limitations of quantum extreme learning machines [55.41644538483948]
We present a framework to model quantum reservoir computers (QRCs) and quantum extreme learning machines (QELMs), showing that they can be concisely described via single effective measurements.
Our analysis paves the way to a more thorough understanding of the capabilities and limitations of both QELMs and QRCs.
arXiv Detail & Related papers (2022-10-03T09:32:28Z)
- Suppressing Amplitude Damping in Trapped Ions: Discrete Weak Measurements for a Non-unitary Probabilistic Noise Filter [62.997667081978825]
We introduce a low-overhead protocol to reverse this degradation.
We present two trapped-ion schemes for the implementation of a non-unitary probabilistic filter against amplitude damping noise.
This filter can be understood as a protocol for single-copy quasi-distillation.
arXiv Detail & Related papers (2022-09-06T18:18:41Z)
- Improved Quantum Algorithms for Fidelity Estimation [77.34726150561087]
We develop new and efficient quantum algorithms for fidelity estimation with provable performance guarantees.
Our algorithms use advanced quantum linear algebra techniques, such as the quantum singular value transformation.
We prove that fidelity estimation to any non-trivial constant additive accuracy is hard in general.
arXiv Detail & Related papers (2022-03-30T02:02:16Z)
- Experimental violations of Leggett-Garg's inequalities on a quantum computer [77.34726150561087]
We experimentally observe the violations of Leggett-Garg-Bell's inequalities on single and multi-qubit systems.
Our analysis highlights the limits of present-day quantum platforms, showing that the above-mentioned correlation functions deviate from the theoretical predictions as the number of qubits and the depth of the circuit grow.
arXiv Detail & Related papers (2021-09-06T14:35:15Z)
- Estimating distinguishability measures on quantum computers [4.779196219827506]
We propose and review several algorithms for estimating distinguishability measures based on trace distance and fidelity.
The fidelity-based algorithms offer novel physical interpretations of these distinguishability measures.
We find that the simulations converge well in both the noiseless and noisy scenarios.
arXiv Detail & Related papers (2021-08-18T22:32:31Z)
- Optimal Provable Robustness of Quantum Classification via Quantum Hypothesis Testing [14.684867444153625]
Quantum machine learning models have the potential to offer speedups and better predictive accuracy compared to their classical counterparts.
These quantum algorithms, like their classical counterparts, have been shown to be vulnerable to input perturbations.
These can arise either from noisy implementations or, as a worst-case type of noise, adversarial attacks.
arXiv Detail & Related papers (2020-09-21T17:55:28Z)