Hardware requirements for realizing a quantum advantage with
deterministic single-photon sources
- URL: http://arxiv.org/abs/2310.10185v1
- Date: Mon, 16 Oct 2023 08:48:22 GMT
- Title: Hardware requirements for realizing a quantum advantage with
deterministic single-photon sources
- Authors: Patrik I. Sund, Ravitej Uppu, Stefano Paesani, Peter Lodahl
- Abstract summary: We analyse and detail hardware requirements needed to reach quantum advantage with deterministic quantum emitters.
We find that quantum advantage is within reach using quantum emitters with an efficiency of 60%-70% and interferometers constructed according to a hybrid-mode-encoding architecture.
- Score: 0.0
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Boson sampling is a specialised algorithm native to the quantum photonic
platform developed for near-term demonstrations of quantum advantage over
classical computers. While clear useful applications for such near-term
pre-fault-tolerance devices are not currently known, reaching a quantum
advantage regime serves as a useful benchmark for the hardware. Here, we
analyse and detail hardware requirements needed to reach quantum advantage with
deterministic quantum emitters, a promising platform for photonic quantum
computing. We elucidate key steps that can be taken in experiments to overcome
practical constraints and establish quantitative hardware-level requirements.
We find that quantum advantage is within reach using quantum emitters with an
efficiency of 60%-70% and interferometers constructed according to a
hybrid-mode-encoding architecture, composed of Mach-Zehnder interferometers
with an insertion loss of 0.0035 dB (a transmittance of 99.92%) per component.
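The headline numbers can be checked with a back-of-the-envelope loss budget. The sketch below is not taken from the paper: the depth of 100 components is an assumed illustrative value, and the model simply compounds per-component transmittance with the source efficiency.

```python
import math

def component_transmittance(insertion_loss_db: float) -> float:
    """Convert a per-component insertion loss in dB to a transmittance."""
    return 10 ** (-insertion_loss_db / 10)

def photon_survival(source_efficiency: float, insertion_loss_db: float,
                    n_components: int) -> float:
    """Probability that one photon survives the source plus a path
    through n_components lossy Mach-Zehnder interferometers."""
    eta_c = component_transmittance(insertion_loss_db)
    return source_efficiency * eta_c ** n_components

eta_c = component_transmittance(0.0035)
print(f"per-component transmittance: {eta_c:.4%}")  # ~99.92%, matching the abstract
print(f"survival (65% source, 100 components): "
      f"{photon_survival(0.65, 0.0035, 100):.3f}")
```

Under these assumptions a photon from a 65%-efficient emitter still exits a 100-component circuit with probability near 0.6, which is why such small per-component losses are needed.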
Related papers
- A Quantum-Classical Collaborative Training Architecture Based on Quantum
State Fidelity [50.387179833629254]
We introduce a collaborative classical-quantum architecture called co-TenQu.
Co-TenQu enhances a classical deep neural network by up to 41.72% in a fair setting.
It outperforms other quantum-based methods by up to 1.9 times and achieves similar accuracy while utilizing 70.59% fewer qubits.
arXiv Detail & Related papers (2024-02-23T14:09:41Z) - Near-Term Distributed Quantum Computation using Mean-Field Corrections
and Auxiliary Qubits [77.04894470683776]
We propose near-term distributed quantum computing schemes that involve limited information transfer and conservative entanglement production.
We build upon these concepts to produce an approximate circuit-cutting technique for the fragmented pre-training of variational quantum algorithms.
arXiv Detail & Related papers (2023-09-11T18:00:00Z) - Assessing requirements to scale to practical quantum advantage [56.22441723982983]
We develop a framework for quantum resource estimation, abstracting the layers of the stack, to estimate resources required for large-scale quantum applications.
We assess three scaled quantum applications and find that hundreds of thousands to millions of physical qubits are needed to achieve practical quantum advantage.
A goal of our work is to accelerate progress towards practical quantum advantage by enabling the broader community to explore design choices across the stack.
arXiv Detail & Related papers (2022-11-14T18:50:27Z) - Anticipative measurements in hybrid quantum-classical computation [68.8204255655161]
We present an approach where the quantum computation is supplemented by a classical result.
Taking advantage of its anticipation also leads to a new type of quantum measurements, which we call anticipative.
In an anticipative quantum measurement the combination of the results from classical and quantum computations happens only in the end.
arXiv Detail & Related papers (2022-09-12T15:47:44Z) - Quantum reservoir neural network implementation on coherently coupled
quantum oscillators [1.7086737326992172]
We propose an implementation for quantum reservoir that obtains a large number of densely connected neurons.
We analyse a specific hardware implementation based on superconducting circuits.
We obtain state-of-the-art accuracy of 99% on benchmark tasks.
arXiv Detail & Related papers (2022-09-07T15:24:51Z) - Experimental optimal verification of three-dimensional entanglement on a
silicon chip [3.9805421324529133]
We experimentally implement an optimal quantum verification strategy on a three-dimensional maximally entangled state.
A 95% confidence is achieved from 1190 copies to verify the target quantum state.
Our results indicate that quantum state verification could serve as an efficient tool for complex quantum measurement tasks.
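As a rough illustration of the copy count quoted above (not the paper's optimal strategy): in standard quantum state verification, a bad state is rejected per copy with some probability ν, so the copies needed for confidence 1 − δ scale as ln(1/δ)/ν. The value of ν below is an assumed illustrative figure.

```python
import math

def copies_needed(confidence: float, nu: float) -> int:
    """Smallest N with (1 - nu)^N <= 1 - confidence, i.e. enough copies
    that a bad state escapes detection with probability at most delta."""
    delta = 1 - confidence
    return math.ceil(math.log(delta) / math.log(1 - nu))

# With an assumed per-copy rejection probability of 0.25%,
# 95% confidence needs on the order of a thousand copies:
print(copies_needed(0.95, 0.0025))
```

This generic scaling is why the reported ~1190 copies for 95% confidence is a plausible figure for a near-optimal strategy on a three-dimensional entangled state.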
arXiv Detail & Related papers (2022-08-27T07:45:21Z) - Quantum Volume for Photonic Quantum Processors [15.3862808585761]
Defining metrics for near-term quantum computing processors has been an integral part of the quantum hardware research and development efforts.
Most metrics such as randomized benchmarking and quantum volume were originally introduced for circuit-based quantum computers.
We present a framework to map physical noises and imperfections in MBQC processes to logical errors in equivalent quantum circuits.
arXiv Detail & Related papers (2022-08-24T18:05:16Z) - Circuit Symmetry Verification Mitigates Quantum-Domain Impairments [69.33243249411113]
We propose circuit-oriented symmetry verification schemes that are capable of verifying the commutativity of quantum circuits without knowledge of the quantum state.
In particular, we propose the spatio-temporal stabilizer (STS) technique, which generalizes the conventional quantum-domain stabilizer formalism to circuit-oriented stabilizers.
arXiv Detail & Related papers (2021-12-27T21:15:35Z) - Quantum circuit architecture search for variational quantum algorithms [88.71725630554758]
We propose a resource- and runtime-efficient scheme termed quantum architecture search (QAS).
QAS automatically seeks a near-optimal ansatz to balance benefits and side-effects brought by adding more noisy quantum gates.
We implement QAS on both the numerical simulator and real quantum hardware, via the IBM cloud, to accomplish data classification and quantum chemistry tasks.
arXiv Detail & Related papers (2020-10-20T12:06:27Z) - Minimizing estimation runtime on noisy quantum computers [0.0]
An "engineered likelihood function" (ELF) is used for carrying out Bayesian inference.
We show how the ELF formalism enhances the rate of information gain in sampling as the physical hardware transitions from the regime of noisy intermediate-scale quantum computers toward fault tolerance.
This technique speeds up a central component of many quantum algorithms, with applications including chemistry, materials, finance, and beyond.
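The Bayesian-inference component can be pictured with a generic grid-based posterior update (a minimal sketch, not the paper's ELF construction; the cosine likelihood is a standard phase-estimation form assumed here purely for illustration).

```python
import numpy as np

thetas = np.linspace(0.0, np.pi, 1000)             # parameter grid
posterior = np.full_like(thetas, 1 / len(thetas))  # uniform prior

def bayes_update(posterior, outcome):
    """One Bayesian update with likelihood p(1 | theta) = (1 + cos theta) / 2."""
    lik = (1 + np.cos(thetas)) / 2
    if outcome == 0:
        lik = 1 - lik
    posterior = posterior * lik
    return posterior / posterior.sum()

for outcome in (1, 1, 0, 1):
    posterior = bayes_update(posterior, outcome)

estimate = thetas[np.argmax(posterior)]  # MAP estimate of theta
print(round(float(posterior.sum()), 6))  # normalised posterior sums to 1
```

Engineering the likelihood (the paper's contribution) amounts to shaping `lik` so that each measurement shot shrinks this posterior as fast as the hardware's noise allows.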
arXiv Detail & Related papers (2020-06-16T17:46:18Z)
This list is automatically generated from the titles and abstracts of the papers in this site.