Linear cross-entropy certification of quantum computational advantage in Gaussian Boson Sampling
- URL: http://arxiv.org/abs/2403.15339v2
- Date: Wed, 02 Oct 2024 15:15:28 GMT
- Title: Linear cross-entropy certification of quantum computational advantage in Gaussian Boson Sampling
- Authors: Javier Martínez-Cifuentes, Hubert de Guise, Nicolás Quesada
- Abstract summary: We argue that one can avoid this issue by validating GBS implementations using their corresponding ideal distributions directly.
We explain how to use a modified version of the linear cross-entropy, a measure that we call the LXE score, to find reference values that help us assess how close a given GBS implementation is to its corresponding ideal model.
- Abstract: Validation of quantum advantage claims in the context of Gaussian Boson Sampling (GBS) currently relies on providing evidence that the experimental samples genuinely follow their corresponding ground truth, i.e., the theoretical model of the experiment that includes all the possible losses that the experimenters can account for. This approach to verification has an important drawback: it is necessary to assume that the ground truth distributions are computationally hard to sample, that is, that they are sufficiently close to the distribution of the ideal, lossless experiment, for which there is evidence that sampling, either exactly or approximately, is a computationally hard task. This assumption, which cannot be easily confirmed, opens the door to classical algorithms that exploit the noise in the ground truth to efficiently simulate the experiments, thus undermining any quantum advantage claim. In this work, we argue that one can avoid this issue by validating GBS implementations using their corresponding ideal distributions directly. We explain how to use a modified version of the linear cross-entropy, a measure that we call the LXE score, to find reference values that help us assess how close a given GBS implementation is to its corresponding ideal model. Finally, we analytically compute the score that would be obtained by a lossless GBS implementation.
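As a point of reference for what the (unmodified) linear cross-entropy measures, here is a minimal sketch on a toy discrete distribution. The distribution `p_ideal`, the sample sizes, and both samplers are illustrative assumptions (a real GBS ideal distribution involves hafnians and is hard to compute); the paper's LXE score is a modified version of this basic quantity.

```python
import numpy as np

rng = np.random.default_rng(42)

# Toy stand-in for an ideal output distribution over D outcomes.
D = 16
p_ideal = rng.random(D)
p_ideal /= p_ideal.sum()

def linear_xe(samples, p):
    """Plain linear cross-entropy score: D * E[p(x)] - 1.
    Samplers correlated with p score above 0; a uniform sampler scores ~0."""
    return len(p) * p[samples].mean() - 1.0

ideal_samples = rng.choice(D, size=20_000, p=p_ideal)
uniform_samples = rng.integers(0, D, size=20_000)

print(linear_xe(ideal_samples, p_ideal))    # positive: close to D*sum(p^2)-1
print(linear_xe(uniform_samples, p_ideal))  # near zero
```

A sampler close to the ideal model scores near the self-overlap value, while an uncorrelated sampler scores near zero, which is what makes reference values of this kind usable for validation.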
Related papers
- Theory on Score-Mismatched Diffusion Models and Zero-Shot Conditional Samplers [49.97755400231656]
We present the first performance guarantee with explicit dimensional dependence for general score-mismatched diffusion samplers.
We show that score mismatches result in a distributional bias between the target and sampling distributions, proportional to the accumulated mismatch between the target and training distributions.
This result can be directly applied to zero-shot conditional samplers for any conditional model, irrespective of measurement noise.
arXiv Detail & Related papers (2024-10-17T16:42:12Z)
- NETS: A Non-Equilibrium Transport Sampler [15.58993313831079]
We propose an algorithm, termed the Non-Equilibrium Transport Sampler (NETS).
NETS can be viewed as a variant of annealed importance sampling (AIS) based on Jarzynski's equality.
We show that this drift is the minimizer of a variety of objective functions, which can all be estimated in an unbiased fashion.
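A rough sketch of the AIS/Jarzynski mechanism the summary refers to, on a 1D toy problem. This is plain annealed importance sampling with hand-picked densities, schedule, and kernel, not the NETS algorithm itself.

```python
import numpy as np

rng = np.random.default_rng(0)

def log_prior(x):   # normalized N(0, 1) base density
    return -0.5 * x**2 - 0.5 * np.log(2 * np.pi)

def log_target(x):  # unnormalized target; exp of this integrates to 0.5*sqrt(2*pi)
    return -0.5 * ((x - 2.0) / 0.5) ** 2

betas = np.linspace(0.0, 1.0, 51)   # annealing schedule from prior to target
n = 5000
x = rng.standard_normal(n)          # exact samples from the prior
log_w = np.zeros(n)                 # Jarzynski-style importance weights

for b_prev, b in zip(betas[:-1], betas[1:]):
    # accumulate "work" increments: Jarzynski's equality gives E[e^{W}] = Z_1/Z_0
    log_w += (b - b_prev) * (log_target(x) - log_prior(x))
    # a few Metropolis moves targeting the intermediate density pi_b
    for _ in range(2):
        def log_pi(y):
            return (1 - b) * log_prior(y) + b * log_target(y)
        prop = x + 0.5 * rng.standard_normal(n)
        accept = np.log(rng.random(n)) < log_pi(prop) - log_pi(x)
        x = np.where(accept, prop, x)

Z_est = np.exp(log_w).mean()  # estimates the target's normalizer, ~1.2533
```

The weight average is an unbiased estimator of the normalization-constant ratio regardless of how well the chains mix; mixing only controls its variance.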
arXiv Detail & Related papers (2024-10-03T17:35:38Z)
- Source-Free Unsupervised Domain Adaptation with Hypothesis Consolidation of Prediction Rationale [53.152460508207184]
Source-Free Unsupervised Domain Adaptation (SFUDA) is a challenging task where a model needs to be adapted to a new domain without access to target domain labels or source domain data.
This paper proposes a novel approach that considers multiple prediction hypotheses for each sample and investigates the rationale behind each hypothesis.
To achieve optimal performance, we propose a three-step adaptation process: model pre-adaptation, hypothesis consolidation, and semi-supervised learning.
arXiv Detail & Related papers (2024-02-02T05:53:22Z)
- Distributed Markov Chain Monte Carlo Sampling based on the Alternating Direction Method of Multipliers [143.6249073384419]
In this paper, we propose a distributed sampling scheme based on the alternating direction method of multipliers.
We provide both theoretical guarantees of our algorithm's convergence and experimental evidence of its superiority to the state-of-the-art.
In simulation, we deploy our algorithm on linear and logistic regression tasks and illustrate its fast convergence compared to existing gradient-based methods.
arXiv Detail & Related papers (2024-01-29T02:08:40Z)
- Gaussian boson sampling validation via detector binning [0.0]
We propose binned-detector probability distributions as a suitable quantity to statistically validate GBS experiments.
We show how to compute such distributions by leveraging their connection with the corresponding characteristic functions.
We also illustrate how binned-detector probability distributions behave when Haar-averaged over all possible interferometric networks.
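The binning step itself is easy to illustrate. The sketch below histograms per-bin total photon numbers from stand-in samples; the i.i.d. geometric count model, mode count, and bin layout are assumptions for demonstration, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(1)

# Stand-in "GBS" samples: photon counts in M modes. Real samples would come
# from an experiment or a simulator.
M, n_samples = 8, 10_000
samples = rng.geometric(p=0.7, size=(n_samples, M)) - 1  # counts in {0, 1, ...}

# Group the modes into bins and histogram the total photon number per bin:
# a low-dimensional marginal that is far easier to estimate and test
# statistically than the full output distribution.
bins = [list(range(0, 4)), list(range(4, 8))]
totals = np.stack([samples[:, b].sum(axis=1) for b in bins], axis=1)

size = totals.max() + 1
binned_dist = np.zeros((size, size))
for t0, t1 in totals:
    binned_dist[t0, t1] += 1.0 / n_samples   # empirical binned distribution
```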
arXiv Detail & Related papers (2023-10-27T12:55:52Z)
- Optimizing the Noise in Self-Supervised Learning: from Importance Sampling to Noise-Contrastive Estimation [80.07065346699005]
It is widely assumed that the optimal noise distribution should be made equal to the data distribution, as in Generative Adversarial Networks (GANs).
We turn to Noise-Contrastive Estimation which grounds this self-supervised task as an estimation problem of an energy-based model of the data.
We soberly conclude that the optimal noise may be hard to sample from, and the gain in efficiency can be modest compared to choosing the noise distribution equal to the data's.
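For concreteness, a minimal NCE sketch on a 1D Gaussian, fitting a mean and a learned log-normalizer by classifying data against noise samples. The distributions, optimizer, and sample sizes here are illustrative assumptions.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

rng = np.random.default_rng(7)
data = rng.normal(1.0, 1.0, size=5000)     # samples from the "unknown" N(1, 1)
noise = rng.normal(0.0, 2.0, size=5000)    # known, easy-to-sample noise

def log_model(x, mu, c):
    # unnormalized Gaussian energy plus a *learned* log-normalizer c
    return -0.5 * (x - mu) ** 2 + c

def nce_loss(params):
    mu, c = params
    # log-odds that a point came from the model rather than the noise
    g_data = log_model(data, mu, c) - norm.logpdf(data, 0.0, 2.0)
    g_noise = log_model(noise, mu, c) - norm.logpdf(noise, 0.0, 2.0)
    # logistic (binary cross-entropy) loss, written stably with logaddexp
    return np.logaddexp(0.0, -g_data).mean() + np.logaddexp(0.0, g_noise).mean()

mu_hat, c_hat = minimize(nce_loss, x0=[0.0, 0.0], method="Nelder-Mead").x
# mu_hat should approach 1, and c_hat the true log-normalizer -0.5*log(2*pi)
```

Because the normalizer is learned rather than computed, the same recipe applies to energy-based models whose partition function is intractable.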
arXiv Detail & Related papers (2023-01-23T19:57:58Z)
- Importance sampling for stochastic quantum simulations [68.8204255655161]
We introduce the qDrift protocol, which builds random product formulas by sampling from the Hamiltonian according to the coefficients.
We show that the simulation cost can be reduced while achieving the same accuracy, by considering the individual simulation cost during the sampling stage.
Results are confirmed by numerical simulations performed on a lattice nuclear effective field theory.
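A minimal qDrift-style sketch for a single-qubit Hamiltonian. The Hamiltonian, evolution time, and gate counts are illustrative assumptions, not the lattice nuclear EFT of the paper.

```python
import numpy as np

rng = np.random.default_rng(3)

# Illustrative Hamiltonian H = 0.8*X + 0.3*Z as a sum of Pauli terms.
I2 = np.eye(2, dtype=complex)
X = np.array([[0, 1], [1, 0]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)
terms, coeffs = [X, Z], np.array([0.8, 0.3])
lam = coeffs.sum()                       # qDrift's lambda = sum_j h_j
H = sum(h * P for h, P in zip(coeffs, terms))

def pauli_exp(theta, P):
    """exp(-i*theta*P) for a Pauli P, using P @ P = I."""
    return np.cos(theta) * I2 - 1j * np.sin(theta) * P

def qdrift_unitary(t, N):
    """Random product formula: N gates, each term drawn with prob h_j/lam."""
    U = I2.copy()
    for j in rng.choice(len(terms), size=N, p=coeffs / lam):
        U = pauli_exp(lam * t / N, terms[j]) @ U
    return U

# Compare the averaged qDrift channel against exact evolution on |0>.
t, N, runs = 0.5, 200, 50
evals, evecs = np.linalg.eigh(H)
exact = evecs @ np.diag(np.exp(-1j * evals * t)) @ evecs.conj().T
psi = np.array([1.0, 0.0], dtype=complex)

rho_avg = np.zeros((2, 2), dtype=complex)
for _ in range(runs):
    out = qdrift_unitary(t, N) @ psi
    rho_avg += np.outer(out, out.conj()) / runs
rho_exact = np.outer(exact @ psi, (exact @ psi).conj())
err = np.abs(rho_avg - rho_exact).max()   # small for large N and many runs
```

Note that individual qDrift runs differ; it is the averaged channel that converges to the exact evolution, at cost scaling with lambda rather than the number of Hamiltonian terms.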
arXiv Detail & Related papers (2022-12-12T15:06:32Z)
- Validation tests of GBS quantum computers give evidence for quantum advantage with a decoherent target [62.997667081978825]
We use positive-P phase-space simulations of grouped count probabilities as a fingerprint for verifying multi-mode data.
We show how one can disprove faked data, and apply this to a classical count algorithm.
arXiv Detail & Related papers (2022-11-07T12:00:45Z)
- Simpler Certified Radius Maximization by Propagating Covariances [39.851641822878996]
We show an algorithm for maximizing the certified radius on datasets including CIFAR-10, ImageNet, and Places365.
Satisfying the required criteria yields an algorithm that works with networks of moderate depth, with a small compromise in overall accuracy.
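For background, the quantity such methods maximize is the standard randomized-smoothing certified radius (in the style of Cohen et al.); the sketch below is that textbook formula, not this paper's covariance-propagation method.

```python
from statistics import NormalDist

def certified_radius(p_a: float, sigma: float) -> float:
    """Randomized-smoothing radius R = sigma * Phi^{-1}(p_a), where p_a
    lower-bounds the smoothed classifier's top-class probability under
    N(0, sigma^2) input noise."""
    if p_a <= 0.5:
        return 0.0                     # no certificate without a majority class
    return sigma * NormalDist().inv_cdf(p_a)

print(certified_radius(0.9, 0.25))     # ~0.320: radius grows with confidence
print(certified_radius(0.5, 0.25))     # 0.0
```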
arXiv Detail & Related papers (2021-04-13T01:38:36Z)
- Quantum Conformance Test [0.0]
We introduce a protocol addressing the conformance test problem, which consists in determining whether a process under test conforms to a reference one.
We formulate the problem in the context of hypothesis testing and consider the specific case in which the objects can be modeled as pure loss channels.
We experimentally implement this protocol, exploiting optical twin beams, validating our theoretical results.
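A classical toy analogue of the conformance decision can be sketched as follows: Poisson counts from a pure loss channel acting on a coherent probe, with a simple mean-based acceptance test. The numbers and the test itself are assumptions for illustration, not the paper's quantum protocol.

```python
import numpy as np

rng = np.random.default_rng(5)

# For a coherent probe of mean photon number n_in, a pure loss channel of
# transmissivity eta yields Poisson counts with mean eta * n_in.
n_in = 50.0

def channel_counts(eta, shots):
    return rng.poisson(eta * n_in, size=shots)

def conforms(k, eta_ref, n_sigmas=4.0):
    """Accept conformance when the mean count is within n_sigmas standard
    errors of the reference prediction (a simple frequentist test)."""
    mu = eta_ref * n_in
    return abs(k.mean() - mu) < n_sigmas * np.sqrt(mu / len(k))

k_good = channel_counts(0.80, shots=1000)   # process matches the reference
k_bad = channel_counts(0.70, shots=1000)    # process deviates (extra loss)

print(conforms(k_good, 0.80))  # True
print(conforms(k_bad, 0.80))   # False
```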
arXiv Detail & Related papers (2020-12-30T18:53:23Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.