On the Hardness of PAC-Learning Stabilizer States with Noise
- URL: http://arxiv.org/abs/2102.05174v1
- Date: Tue, 9 Feb 2021 23:06:54 GMT
- Title: On the Hardness of PAC-Learning Stabilizer States with Noise
- Authors: Aravind Gollakota and Daniel Liang
- Abstract summary: We consider the problem of learning stabilizer states with noise in the Probably Approximately Correct (PAC) framework of Aaronson (2007) for learning quantum states.
Motivated by approaches to noise tolerance from classical learning theory, we introduce the Statistical Query (SQ) model for PAC-learning quantum states.
We prove that algorithms in this model are indeed resilient to common forms of noise, including classification and depolarizing noise.
- Score: 5.584060970507507
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: We consider the problem of learning stabilizer states with noise in the
Probably Approximately Correct (PAC) framework of Aaronson (2007) for learning
quantum states. In the noiseless setting, an algorithm for this problem was
recently given by Rocchetto (2018), but the noisy case was left open. Motivated
by approaches to noise tolerance from classical learning theory, we introduce
the Statistical Query (SQ) model for PAC-learning quantum states, and prove
that algorithms in this model are indeed resilient to common forms of noise,
including classification and depolarizing noise. We prove an exponential lower
bound on learning stabilizer states in the SQ model. Even outside the SQ model,
we prove that learning stabilizer states with noise is in general as hard as
Learning Parity with Noise (LPN) using classical examples. Our results position
the problem of learning stabilizer states as a natural quantum analogue of the
classical problem of learning parities: easy in the noiseless setting, but
seemingly intractable even with simple forms of noise.
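The classical analogy can be made concrete. Noiseless parities are learnable by Gaussian elimination over GF(2), but even a small rate of label flips defeats this approach, since elimination mixes and amplifies the errors. The following is a minimal illustrative sketch, not code from the paper; the secret, sample count, and noise rate are arbitrary demo choices.

```python
import numpy as np

def solve_gf2(A, b):
    """Solve A x = b over GF(2) by Gaussian elimination.

    Assumes A has full column rank; free variables, if any, are set to 0.
    """
    A, b = A.copy() % 2, b.copy() % 2
    m, n = A.shape
    row, pivots = 0, []
    for col in range(n):
        hits = [r for r in range(row, m) if A[r, col] == 1]
        if not hits:
            continue
        p = hits[0]
        A[[row, p]] = A[[p, row]]        # swap pivot row into place
        b[[row, p]] = b[[p, row]]
        for r in range(m):               # eliminate col everywhere else
            if r != row and A[r, col] == 1:
                A[r] ^= A[row]
                b[r] ^= b[row]
        pivots.append(col)
        row += 1
    x = np.zeros(n, dtype=int)
    for r, col in enumerate(pivots):
        x[col] = b[r]
    return x

rng = np.random.default_rng(0)
n, m = 8, 32
s = rng.integers(0, 2, n)                # hidden parity (the "secret")
X = rng.integers(0, 2, (m, n))           # random example vectors
y_clean = X @ s % 2                      # noiseless parity labels

s_hat = solve_gf2(X, y_clean)            # noiseless case: exact recovery
assert np.array_equal(s_hat, s)

y_noisy = y_clean.copy()                 # classification noise: flip ~10% of labels
y_noisy[rng.random(m) < 0.1] ^= 1
s_noisy = solve_gf2(X, y_noisy)          # elimination typically fails to recover s
```

The point of the sketch is that the same algebraic structure that makes the noiseless problem easy offers no protection against noise, which is the regime LPN (and, by the reduction above, noisy stabilizer-state learning) lives in.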
Related papers
- Fidelity-Enhanced Variational Quantum Optimal Control [0.0]
We propose a new method for creating robust pulses based on the Schrödinger equation.
By accounting for both environmental noise sources and noise sources inherent to the control system, significant increases in fidelity are observed for both single- and multi-qubit state preparations.
arXiv Detail & Related papers (2025-01-29T14:59:34Z) - The Learning Stabilizers with Noise problem [46.623273455512106]
We show that the Learning Parity with Noise (LPN) problem can be seen as the task of decoding a random linear code in the presence of noise.
We show that LSN includes LPN as a special case, which suggests that it is at least as hard as its classical counterpart.
We identify several applications of our LSN assumption, ranging from the construction of quantum bit commitment schemes to the computational limitations of learning from quantum data.
arXiv Detail & Related papers (2024-10-24T17:53:02Z) - Back to Basics: Fast Denoising Iterative Algorithm [0.0]
We introduce Back to Basics (BTB), a fast iterative algorithm for noise reduction.
We examine three study cases: natural image denoising in the presence of additive white Gaussian noise, Poisson-distributed image denoising, and speckle suppression in optical coherence tomography (OCT).
Experimental results demonstrate that the proposed approach can effectively improve image quality in challenging noise settings.
arXiv Detail & Related papers (2023-11-11T18:32:06Z) - Latent Class-Conditional Noise Model [54.56899309997246]
We introduce a Latent Class-Conditional Noise model (LCCN) to parameterize the noise transition under a Bayesian framework.
We then deduce a dynamic label regression method for LCCN, whose Gibbs sampler allows us to efficiently infer the latent true labels.
Our approach safeguards the stable update of the noise transition, avoiding the arbitrary tuning from a mini-batch of samples required by previous methods.
arXiv Detail & Related papers (2023-02-19T15:24:37Z) - Quantum state-preparation control in noisy environment via most-likely paths [1.9260081982051918]
We consider an alternative view of a noise-affected open quantum system, where the average dynamics can be unravelled into hypothetical noisy trajectories.
We adopt the most-likely path technique for quantum state-preparation, constructing a path for noise variables and finding control functions.
As a proof of concept, we apply the method to a qubit-state preparation under a dephasing noise and analytically solve for controlled Rabi drives for arbitrary target states.
arXiv Detail & Related papers (2022-09-27T05:42:09Z) - Identifying Hard Noise in Long-Tailed Sample Distribution [76.16113794808001]
We introduce Noisy Long-Tailed Classification (NLT).
Most de-noising methods fail to identify the hard noises.
We design an iterative noisy learning framework called Hard-to-Easy (H2E).
arXiv Detail & Related papers (2022-07-27T09:03:03Z) - The Optimal Noise in Noise-Contrastive Learning Is Not What You Think [80.07065346699005]
We show that deviating from the common assumption that the noise should follow the data distribution can actually lead to better statistical estimators.
In particular, the optimal noise distribution differs from the data's, and may even come from a different family.
arXiv Detail & Related papers (2022-03-02T13:59:20Z) - Analyzing and Improving the Optimization Landscape of Noise-Contrastive Estimation [50.85788484752612]
Noise-contrastive estimation (NCE) is a statistically consistent method for learning unnormalized probabilistic models.
It has been empirically observed that the choice of the noise distribution is crucial for NCE's performance.
In this work, we formally pinpoint reasons for NCE's poor performance when an inappropriate noise distribution is used.
arXiv Detail & Related papers (2021-10-21T16:57:45Z) - Robust Learning under Strong Noise via SQs [5.9256596453465225]
We show that every SQ learnable class admits an efficient learning algorithm with OPT + $\epsilon$ misclassification error for a broad class of noise models.
This setting substantially generalizes the widely-studied problem of classification under Random Classification Noise (RCN) with known noise probabilities.
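The noise tolerance of statistical queries under RCN has a simple mechanism: if labels are flipped independently with known rate η < 1/2, a query of the form E[y·φ(x)] is shrunk by exactly (1 − 2η), so it can be recovered from noisy examples by rescaling. A toy sketch, with an arbitrary threshold target and query statistic chosen purely for illustration:

```python
import numpy as np

rng = np.random.default_rng(1)
eta = 0.2                            # known RCN flip rate (must be < 1/2)
m = 200_000

x = rng.normal(size=m)
y = np.sign(x)                       # clean +/-1 labels from a threshold function
flips = rng.random(m) < eta
y_noisy = np.where(flips, -y, y)     # random classification noise

phi = np.tanh(x)                     # an arbitrary query statistic
clean = np.mean(y * phi)             # the query value on clean data
noisy = np.mean(y_noisy * phi)       # shrunk by (1 - 2*eta) in expectation
corrected = noisy / (1 - 2 * eta)    # unbiased estimate of the clean query
```

Since SQ algorithms only ever interact with data through such expectations (up to tolerance), this per-query correction is what makes the whole model resilient to RCN.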
arXiv Detail & Related papers (2020-10-18T21:02:26Z) - Noise-Induced Barren Plateaus in Variational Quantum Algorithms [0.3562485774739681]
Variational Quantum Algorithms (VQAs) may be a path to quantum advantage on Noisy Intermediate-Scale Quantum (NISQ) computers.
We rigorously prove a serious limitation for noisy VQAs, in that the noise causes the training landscape to have a barren plateau.
arXiv Detail & Related papers (2020-07-28T17:52:21Z) - Shape Matters: Understanding the Implicit Bias of the Noise Covariance [76.54300276636982]
Noise in gradient descent provides a crucial implicit regularization effect for training overparameterized models.
We show that parameter-dependent noise -- induced by mini-batches or label perturbation -- is far more effective than Gaussian noise.
Our analysis reveals that parameter-dependent noise introduces a bias towards local minima with smaller noise variance, whereas spherical Gaussian noise does not.
arXiv Detail & Related papers (2020-06-15T18:31:02Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information provided and is not responsible for any consequences of its use.