Unveiling the nonclassicality within quasi-distribution representations through deep learning
- URL: http://arxiv.org/abs/2312.16055v2
- Date: Fri, 01 Nov 2024 15:05:28 GMT
- Title: Unveiling the nonclassicality within quasi-distribution representations through deep learning
- Authors: Hong-Bin Chen, Cheng-Hua Liu, Kuan-Lun Lai, Bor-Yann Tseng, Ping-Yuan Lo, Yueh-Nan Chen, Chi-Hua Yu,
- Abstract summary: A widely adopted approach focuses on the negative values of a quasi-distribution representation as compelling evidence of nonclassicality.
Here we propose a computational approach utilizing a deep generative model, processing three marginals, to construct bivariate joint quasi-distribution functions.
Our approach also significantly reduces the experimental effort of constructing the Wigner functions of quantum states.
- Score: 1.130790932059036
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: To unequivocally distinguish genuine quantumness from classicality, a widely adopted approach focuses on the negative values of a quasi-distribution representation as compelling evidence of nonclassicality. Prominent examples include the dynamical process nonclassicality characterized by the canonical Hamiltonian ensemble representation (CHER) and the nonclassicality of quantum states characterized by the Wigner function. However, constructing a multivariate joint quasi-distribution function with negative values from experimental data is typically highly cumbersome. Here we propose a computational approach utilizing a deep generative model, processing three marginals, to construct the bivariate joint quasi-distribution functions. We first apply our model to tackle the challenging problem of the CHERs, which lacks universal solutions, rendering the problem ground-truth (GT) deficient. To overcome the GT deficiency of the CHER problem, we design optimal synthetic datasets to train our model. Although trained with synthetic data, the physics-informed optimization enables our model to capture the detrimental effect of thermal fluctuations on nonclassicality, which cannot be obtained from any analytical solutions. This underscores the reliability of our approach. This approach also allows us to predict the Wigner functions subject to thermal noise. Our model predicts the Wigner functions with prominent accuracy by processing three marginals of probability distributions. Our approach also significantly reduces the experimental effort of constructing the Wigner functions of quantum states, giving rise to an efficient alternative way to realize quantum state tomography.
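As a concrete illustration of the input format such a model would consume, the sketch below computes three quadrature marginals of a single-mode thermal state's Wigner function by direct numerical integration. This is a minimal, self-contained example: the Gaussian thermal Wigner function and the angles 0, π/3, 2π/3 are illustrative choices, and the deep generative model itself is not reproduced here.

```python
import numpy as np

def thermal_wigner(x, p, nbar=0.5):
    """Wigner function of a single-mode thermal state (hbar = 1 convention).

    W(x, p) = exp(-(x^2 + p^2) / (2*nbar + 1)) / (pi * (2*nbar + 1)),
    a positive Gaussian; nonclassical states would show negative regions.
    """
    s = 2 * nbar + 1
    return np.exp(-(x**2 + p**2) / s) / (np.pi * s)

def quadrature_marginal(wigner, grid, theta):
    """Marginal of W along the rotated quadrature x_theta.

    Evaluates W on coordinates rotated by theta and integrates out the
    orthogonal direction, mimicking a homodyne-measured distribution.
    """
    X, P = np.meshgrid(grid, grid, indexing="ij")
    xr = X * np.cos(theta) - P * np.sin(theta)
    pr = X * np.sin(theta) + P * np.cos(theta)
    W = wigner(xr, pr)
    dx = grid[1] - grid[0]
    return W.sum(axis=1) * dx  # integrate over the orthogonal axis

grid = np.linspace(-6, 6, 241)
# Three marginals at theta = 0, pi/3, 2*pi/3 -- the kind of input triplet
# a generative model would process to reconstruct the joint function.
marginals = [quadrature_marginal(thermal_wigner, grid, th)
             for th in (0.0, np.pi / 3, 2 * np.pi / 3)]
```

For the rotationally symmetric thermal state all three marginals coincide; a genuinely nonclassical state would yield distinct, structured marginals while its joint Wigner function takes negative values.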
Related papers
- Overcoming Dimensional Factorization Limits in Discrete Diffusion Models through Quantum Joint Distribution Learning [79.65014491424151]
We propose a quantum Discrete Denoising Diffusion Probabilistic Model (QD3PM). It enables joint probability learning through diffusion and denoising in exponentially large Hilbert spaces. This paper establishes a new theoretical paradigm in generative models by leveraging the quantum advantage in joint distribution learning.
arXiv Detail & Related papers (2025-05-08T11:48:21Z) - Scaling and renormalization in high-dimensional regression [72.59731158970894]
We present a unifying perspective on recent results on ridge regression. We use the basic tools of random matrix theory and free probability, aimed at readers with backgrounds in physics and deep learning. Our results extend and provide a unifying perspective on earlier models of scaling laws.
arXiv Detail & Related papers (2024-05-01T15:59:00Z) - Conditional Pseudo-Reversible Normalizing Flow for Surrogate Modeling in Quantifying Uncertainty Propagation [11.874729463016227]
We introduce a conditional pseudo-reversible normalizing flow for constructing surrogate models of a physical model polluted by additive noise.
The training process utilizes a dataset consisting of input-output pairs, without requiring prior knowledge about the noise or the function.
Our model, once trained, can generate samples from any conditional probability density functions whose high probability regions are covered by the training set.
arXiv Detail & Related papers (2024-03-31T00:09:58Z) - A PAC-Bayesian Perspective on the Interpolating Information Criterion [54.548058449535155]
We show how a PAC-Bayes bound is obtained for a general class of models, characterizing factors which influence performance in the interpolating regime.
We quantify how the test error for overparameterized models achieving effectively zero training error depends on the quality of the implicit regularization imposed by, e.g., the combination of model and parameter-initialization scheme.
arXiv Detail & Related papers (2023-11-13T01:48:08Z) - An Optimization-based Deep Equilibrium Model for Hyperspectral Image Deconvolution with Convergence Guarantees [71.57324258813675]
We propose a novel methodology for addressing the hyperspectral image deconvolution problem.
A new optimization problem is formulated, leveraging a learnable regularizer in the form of a neural network.
The derived iterative solver is then expressed as a fixed-point calculation problem within the Deep Equilibrium framework.
arXiv Detail & Related papers (2023-06-10T08:25:16Z) - No-Regret Constrained Bayesian Optimization of Noisy and Expensive Hybrid Models using Differentiable Quantile Function Approximations [0.0]
Constrained Upper Quantile Bound (CUQB) is a conceptually simple, deterministic approach that avoids constraint approximations.
We show that CUQB significantly outperforms traditional Bayesian optimization in both constrained and unconstrained cases.
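The quantile-bound idea can be caricatured in a few lines: estimate an optimistic upper quantile of the noisy objective and constraint by Monte Carlo sampling, then select the candidate with the best optimistic, feasibility-screened score. Everything below (the toy model, noise levels, and grid search) is a hypothetical stand-in; the actual CUQB method uses differentiable quantile function approximations and comes with regret guarantees not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(0)

def hybrid_model(x, n_samples=4000):
    """Hypothetical noisy, expensive hybrid model: returns samples of
    (objective, constraint) at design point x."""
    obj = np.sin(3 * x) - 0.1 * x**2 + rng.normal(0.0, 0.3, n_samples)
    con = 1.0 - x**2 + rng.normal(0.0, 0.3, n_samples)  # feasible when >= 0
    return obj, con

def cuqb_score(x, q=0.95):
    """Upper quantile bound of the objective; the same optimistic quantile
    bound screens the constraint (a rough sketch of the CUQB idea)."""
    obj, con = hybrid_model(x)
    if np.quantile(con, q) < 0.0:      # even optimistically infeasible
        return -np.inf
    return np.quantile(obj, q)

candidates = np.linspace(-2.0, 2.0, 81)
best_x = max(candidates, key=cuqb_score)
```

The optimistic quantile plays the role of the upper confidence bound in classical Bayesian optimization, but is defined directly on the sampled output distribution instead of a Gaussian-process posterior, which is what lets the approach avoid constraint approximations.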
arXiv Detail & Related papers (2023-05-05T19:57:36Z) - On Quantum Circuits for Discrete Graphical Models [1.0965065178451106]
We provide the first method that allows one to provably generate unbiased and independent samples from general discrete factor models.
Our method is compatible with multi-body interactions and its success probability does not depend on the number of variables.
Experiments with quantum simulation as well as actual quantum hardware show that our method can carry out sampling and parameter learning on quantum computers.
arXiv Detail & Related papers (2022-06-01T11:03:51Z) - Generalization Metrics for Practical Quantum Advantage in Generative Models [68.8204255655161]
Generative modeling is a widely accepted natural use case for quantum computers.
We construct a simple and unambiguous approach to probe practical quantum advantage for generative modeling by measuring the algorithm's generalization performance.
Our simulation results show that our quantum-inspired models have up to a $68\times$ enhancement in generating unseen unique and valid samples.
arXiv Detail & Related papers (2022-01-21T16:35:35Z) - Generalization of Neural Combinatorial Solvers Through the Lens of Adversarial Robustness [68.97830259849086]
Most datasets only capture a simpler subproblem and likely suffer from spurious features.
We study adversarial robustness - a local generalization property - to reveal hard, model-specific instances and spurious features.
Unlike in other applications, where perturbation models are designed around subjective notions of imperceptibility, our perturbation models are efficient and sound.
Surprisingly, with such perturbations, a sufficiently expressive neural solver does not suffer from the limitations of the accuracy-robustness trade-off common in supervised learning.
arXiv Detail & Related papers (2021-10-21T07:28:11Z) - Leveraging Global Parameters for Flow-based Neural Posterior Estimation [90.21090932619695]
Inferring the parameters of a model based on experimental observations is central to the scientific method.
A particularly challenging setting is when the model is strongly indeterminate, i.e., when distinct sets of parameters yield identical observations.
We present a method for cracking such indeterminacy by exploiting additional information conveyed by an auxiliary set of observations sharing global parameters.
arXiv Detail & Related papers (2021-02-12T12:23:13Z) - Physics-constrained Bayesian inference of state functions in classical density-functional theory [0.6445605125467573]
We develop a novel data-driven approach to the inverse problem of classical statistical mechanics.
We develop an efficient learning algorithm which characterises the construction of approximate free energy functionals.
We consider excluded volume particle interactions, which are ubiquitous in nature, whilst being highly challenging for modelling in terms of free energy.
arXiv Detail & Related papers (2020-10-07T12:43:42Z) - Multiplicative noise and heavy tails in stochastic optimization [62.993432503309485]
Stochastic optimization is central to modern machine learning, but the precise role of the noise in its success is still unclear.
We show that heavy-tailed behaviour commonly arises in the optimization parameters as a consequence of multiplicative noise.
A detailed analysis describes the key contributing factors, including step size and data, and state-of-the-art neural network models exhibit similar behaviour.
arXiv Detail & Related papers (2020-06-11T09:58:01Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.