The Second Moment of Hafnians in Gaussian Boson Sampling
- URL: http://arxiv.org/abs/2403.13878v1
- Date: Wed, 20 Mar 2024 18:00:00 GMT
- Title: The Second Moment of Hafnians in Gaussian Boson Sampling
- Authors: Adam Ehrenberg, Joseph T. Iosue, Abhinav Deshpande, Dominik Hangleiter, Alexey V. Gorshkov
- Abstract summary: Anticoncentration is a second-moment property of the output probabilities.
We develop a graph-theoretic method to study these moments and use it to identify a transition in anticoncentration.
These results allow us to pinpoint the transition in anticoncentration and furthermore yield the expected linear cross-entropy benchmarking score for an ideal (error-free) device.
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Gaussian Boson Sampling is a popular method for experimental demonstrations of quantum advantage, but many subtleties remain in fully understanding its theoretical underpinnings. An important component in the theoretical arguments for approximate average-case hardness of sampling is anticoncentration, which is a second-moment property of the output probabilities. In Gaussian Boson Sampling these are given by hafnians of generalized circular orthogonal ensemble matrices. In a companion work [arXiv:2312.08433], we develop a graph-theoretic method to study these moments and use it to identify a transition in anticoncentration. In this work, we find a recursive expression for the second moment using these graph-theoretic techniques. While we have not been able to solve this recursion by hand, we are able to solve it numerically exactly, which we do up to Fock sector $2n = 80$. We further derive new analytical results about the second moment. These results allow us to pinpoint the transition in anticoncentration and furthermore yield the expected linear cross-entropy benchmarking score for an ideal (error-free) device.
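For concreteness, the hafnian of a symmetric $2n \times 2n$ matrix $A$ sums the product of entries over all perfect matchings of the indices; the second moments studied in the paper are moments of these quantities. A minimal pure-Python sketch (illustrative only, not code from the paper) of the standard matching recursion $\mathrm{haf}(A) = \sum_{j>1} A_{1j}\,\mathrm{haf}(A_{\setminus\{1,j\}})$:

```python
def hafnian(A):
    """Hafnian of a symmetric matrix (list of lists) via the matching
    recursion: pair index 0 with each j, then recurse on the minor with
    rows/columns 0 and j removed. Exponential time; fine for small inputs."""
    m = len(A)
    if m == 0:
        return 1          # empty matching
    if m % 2:
        return 0          # odd dimension has no perfect matching
    total = 0
    for j in range(1, m):
        rest = [i for i in range(1, m) if i != j]
        minor = [[A[r][c] for c in rest] for r in rest]
        total += A[0][j] * hafnian(minor)
    return total
```

For a $2n \times 2n$ all-ones matrix this returns $(2n-1)!!$, the number of perfect matchings on $2n$ vertices (e.g. 3 for $2n = 4$), which makes explicit why exact evaluation of hafnians is expensive at the Fock sectors considered in the paper.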
Related papers
- Straightness of Rectified Flow: A Theoretical Insight into Wasserstein Convergence
Diffusion models have emerged as a powerful tool for image generation and denoising.
Recently, Liu et al. designed a novel alternative generative model Rectified Flow (RF)
RF aims to learn straight flow trajectories from noise to data using a sequence of convex optimization problems.
arXiv Detail & Related papers (2024-10-19T02:36:11Z)
- von Mises Quasi-Processes for Bayesian Circular Regression
We explore a family of expressive and interpretable distributions over circle-valued random functions.
The resulting probability model has connections with continuous spin models in statistical physics.
For posterior inference, we introduce a new Stratonovich-like augmentation that lends itself to fast Markov Chain Monte Carlo sampling.
arXiv Detail & Related papers (2024-06-19T01:57:21Z)
- On randomized estimators of the Hafnian of a nonnegative matrix
Gaussian Boson samplers aim to demonstrate quantum advantage by performing a sampling task believed to be classically hard.
For nonnegative matrices, there is a family of randomized estimators of the Hafnian based on generating a particular random matrix and calculating its determinant.
Here we investigate the performance of two such estimators, which we call the Barvinok and Godsil-Gutman estimators.
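Both estimators follow the same template: replace each entry $A_{ij}$ ($i<j$) of the nonnegative matrix by a random variable times $\sqrt{A_{ij}}$, antisymmetrize, and take a determinant, whose expectation recovers $\mathrm{haf}(A)$ (Gaussian weights for Barvinok, random signs for Godsil-Gutman). A hedged sketch of one Godsil-Gutman-style sample, assuming that construction; the paper's exact conventions may differ, and `det` here is naive Laplace expansion, suitable only for small matrices:

```python
import math
import random


def det(M):
    """Determinant by Laplace expansion along the first row (small matrices)."""
    n = len(M)
    if n == 0:
        return 1
    total = 0
    for j in range(n):
        minor = [row[:j] + row[j + 1:] for row in M[1:]]
        total += (-1) ** j * M[0][j] * det(minor)
    return total


def godsil_gutman_haf(A, rng=random):
    """One sample of a Godsil-Gutman-style hafnian estimator for a
    symmetric nonnegative matrix A: W[i][j] = s_ij * sqrt(A[i][j]) with
    s_ij uniform in {-1, +1} for i < j, and W[j][i] = -W[i][j]."""
    m = len(A)
    W = [[0.0] * m for _ in range(m)]
    for i in range(m):
        for j in range(i + 1, m):
            s = rng.choice([-1.0, 1.0])
            W[i][j] = s * math.sqrt(A[i][j])
            W[j][i] = -W[i][j]
    return det(W)
```

Averaging many samples estimates $\mathrm{haf}(A)$; in the special case where $A$ supports a single perfect matching, the sign variables cancel in pairs and a single sample is already exact.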
arXiv Detail & Related papers (2023-12-15T19:00:07Z)
- Transition of Anticoncentration in Gaussian Boson Sampling
We develop a graph-theoretic framework for analyzing the moments of the Gaussian Boson Sampling distribution.
We show that when the number of initially squeezed modes scales sufficiently slowly with the number of photons, there is a lack of anticoncentration.
arXiv Detail & Related papers (2023-12-13T19:00:00Z)
- Restoration-Degradation Beyond Linear Diffusions: A Non-Asymptotic Analysis For DDIM-Type Samplers
We develop a framework for non-asymptotic analysis of deterministic samplers used for diffusion generative modeling.
We show that one step along the probability flow ODE can be expressed as two steps: 1) a restoration step that runs ascent on the conditional log-likelihood at some infinitesimally previous time, and 2) a degradation step that runs the forward process using noise pointing back towards the current gradient.
arXiv Detail & Related papers (2023-03-06T18:59:19Z)
- On the Benefits of Large Learning Rates for Kernel Methods
We show that this phenomenon can be precisely characterized in the context of kernel methods.
We consider the minimization of a quadratic objective in a separable Hilbert space, and show that with early stopping, the choice of learning rate influences the spectral decomposition of the obtained solution.
arXiv Detail & Related papers (2022-02-28T13:01:04Z)
- Certification of Gaussian Boson Sampling via graph theory
We exploit a connection between photon counting of a genuine Gaussian Boson Sampling device and the number of perfect matchings in a graph.
Within this framework, two approaches that exploit the distributions of graph feature vectors and graph kernels are presented.
arXiv Detail & Related papers (2022-02-15T20:22:28Z)
- Pathwise Conditioning of Gaussian Processes
Conventional approaches for simulating Gaussian process posteriors view samples as draws from marginal distributions of process values at finite sets of input locations.
This distribution-centric characterization leads to generative strategies that scale cubically in the size of the desired random vector.
We show how this pathwise interpretation of conditioning gives rise to a general family of approximations that lend themselves to efficiently sampling Gaussian process posteriors.
arXiv Detail & Related papers (2020-11-08T17:09:37Z)
- Optical estimation of unitary Gaussian processes without phase reference using Fock states
We consider two single-mode Gaussian processes, displacement and squeezing.
We show that these two can be efficiently estimated using photon number states and photon number resolving detectors.
arXiv Detail & Related papers (2020-06-17T16:40:21Z)
- Efficiently Sampling Functions from Gaussian Process Posteriors
We propose an easy-to-use and general-purpose approach for fast posterior sampling.
We demonstrate how decoupled sample paths accurately represent Gaussian process posteriors at a fraction of the usual cost.
arXiv Detail & Related papers (2020-02-21T14:03:16Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
The site does not guarantee the quality of this information and is not responsible for any consequences of its use.