Stability of classical shadows under gate-dependent noise
- URL: http://arxiv.org/abs/2310.19947v2
- Date: Wed, 13 Dec 2023 09:02:05 GMT
- Title: Stability of classical shadows under gate-dependent noise
- Authors: Raphael Brieger, Markus Heinrich, Ingo Roth, Martin Kliesch
- Abstract summary: To trust shadow estimation in practice, it is crucial to understand the behavior of shadow estimators under realistic noise.
We identify average noise channels that affect shadow estimators and allow for a more fine-grained control of noise-induced biases.
- Score: 0.4997673761305335
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Expectation values of observables are routinely estimated using so-called
classical shadows, the outcomes of randomized basis measurements
on a repeatedly prepared quantum state. In order to trust the accuracy of
shadow estimation in practice, it is crucial to understand the behavior of the
estimators under realistic noise. In this work, we prove that any shadow
estimation protocol involving Clifford unitaries is stable under gate-dependent
noise for observables with bounded stabilizer norm, a quantity originally
introduced in the context of simulating Clifford circuits. For these
observables, we also show that the protocol's sample complexity is essentially
identical to the noiseless case. In contrast, we demonstrate that estimation of
`magic' observables can suffer from a bias that scales exponentially in the
system size. We further find that so-called robust shadows, aiming at
mitigating noise, can introduce a large bias in the presence of gate-dependent
noise compared to unmitigated classical shadows. Nevertheless, we guarantee the
functioning of robust shadows for a more general noise setting than in previous
works. On a technical level, we identify average noise channels that affect
shadow estimators and allow for a more fine-grained control of noise-induced
biases.
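To make the protocol concrete, here is a minimal single-qubit sketch of noiseless classical shadow estimation with random Pauli basis measurements. This is a sketch only: the test state, the observable, and the plain empirical mean (in place of the usual median-of-means estimator) are illustrative choices, and the paper's setting involves multi-qubit Clifford unitaries.

```python
import numpy as np

# Single-qubit Pauli matrices.
I = np.eye(2, dtype=complex)
X = np.array([[0, 1], [1, 0]], dtype=complex)
Y = np.array([[0, -1j], [1j, 0]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)
PAULIS = [X, Y, Z]

def projector(W, b):
    """Projector onto the eigenvector of Pauli W with eigenvalue b = +-1."""
    return (I + b * W) / 2

def shadow_estimate(rho, O, n_samples, rng):
    """Empirical-mean shadow estimate of Tr(O rho) from random Pauli measurements."""
    total = 0.0
    for _ in range(n_samples):
        W = PAULIS[rng.integers(3)]            # random measurement basis
        b = 1 if rng.random() < np.real(np.trace(rho @ projector(W, 1))) else -1
        snapshot = 3 * projector(W, b) - I     # inverse of the measurement channel
        total += np.real(np.trace(O @ snapshot))
    return total / n_samples

rng = np.random.default_rng(0)
theta = 0.7
psi = np.array([np.cos(theta / 2), np.sin(theta / 2)], dtype=complex)
rho = np.outer(psi, psi.conj())
print("shadow estimate of <Z>:", shadow_estimate(rho, Z, 20_000, rng))
print("exact <Z>:             ", np.real(np.trace(Z @ rho)))
```

The key identity is that averaging the snapshot over random bases and outcomes reproduces rho exactly; the paper's question is what happens to this inversion when the implemented unitaries are noisy.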
Related papers
- Noise-mitigated randomized measurements and self-calibrating shadow estimation [0.0]
We introduce an error-mitigated method of randomized measurements, giving rise to a robust shadow estimation procedure.
On the practical side, we show that error mitigation and shadow estimation can be carried out using the same session of quantum experiments.
arXiv Detail & Related papers (2024-03-07T18:53:56Z)
- Shadow tomography with noisy readouts [0.0]
Shadow tomography is a scalable technique to characterise the quantum state of a quantum computer or quantum simulator.
By construction, classical shadows are intrinsically sensitive to readout noise.
We show that classical shadows admit much more flexible constructions than the standard ones.
arXiv Detail & Related papers (2023-10-26T11:47:51Z)
- Latent Class-Conditional Noise Model [54.56899309997246]
We introduce a Latent Class-Conditional Noise model (LCCN) to parameterize the noise transition under a Bayesian framework.
We then deduce a dynamic label regression method for LCCN, whose Gibbs sampler allows us to efficiently infer the latent true labels.
Our approach safeguards the stable update of the noise transition, avoiding the arbitrary tuning from a mini-batch of samples that previous methods require.
arXiv Detail & Related papers (2023-02-19T15:24:37Z)
- Optimizing the Noise in Self-Supervised Learning: from Importance Sampling to Noise-Contrastive Estimation [80.07065346699005]
It is widely assumed that the optimal noise distribution should be made equal to the data distribution, as in Generative Adversarial Networks (GANs).
We turn to Noise-Contrastive Estimation which grounds this self-supervised task as an estimation problem of an energy-based model of the data.
We soberly conclude that the optimal noise may be hard to sample from, and the gain in efficiency can be modest compared to choosing the noise distribution equal to the data's.
arXiv Detail & Related papers (2023-01-23T19:57:58Z)
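As a concrete reference for the NCE objective the entry above discusses, here is a toy one-dimensional sketch in which an unnormalized Gaussian model (with a learned log-normalizer c) is fit by logistic classification of data against noise samples. The Gaussian data and noise choices, the parameterization, and the finite-difference optimizer are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy problem: data from N(2, 1), noise from N(0, 2) (illustrative choices).
x_data = rng.normal(2.0, 1.0, size=5_000)
x_noise = rng.normal(0.0, 2.0, size=5_000)

def log_noise(x):
    """Exact log-density of the noise distribution N(0, 2)."""
    return -0.5 * (x / 2.0) ** 2 - np.log(2.0 * np.sqrt(2.0 * np.pi))

def log_model(x, theta):
    """Unnormalized Gaussian energy-based model; c is a learned log-normalizer."""
    mu, log_sigma, c = theta
    return -0.5 * ((x - mu) / np.exp(log_sigma)) ** 2 + c

def nce_loss(theta):
    """Logistic loss for classifying data vs. noise via the log-density ratio."""
    g_data = log_model(x_data, theta) - log_noise(x_data)
    g_noise = log_model(x_noise, theta) - log_noise(x_noise)
    return (np.mean(np.logaddexp(0.0, -g_data))     # -log sigmoid(g)
            + np.mean(np.logaddexp(0.0, g_noise)))  # -log(1 - sigmoid(g))

def grad(theta, h=1e-4):
    """Central finite-difference gradient (crude but dependency-free)."""
    g = np.zeros_like(theta)
    for i in range(len(theta)):
        e = np.zeros_like(theta)
        e[i] = h
        g[i] = (nce_loss(theta + e) - nce_loss(theta - e)) / (2 * h)
    return g

theta = np.zeros(3)
for _ in range(2_000):
    theta -= 0.05 * grad(theta)
print("learned mu, sigma:", theta[0], np.exp(theta[1]))  # should approach 2 and 1
```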
- The Optimal Noise in Noise-Contrastive Learning Is Not What You Think [80.07065346699005]
We show that deviating from the common assumption that the noise distribution should equal the data's can actually lead to better statistical estimators.
In particular, the optimal noise distribution is different from the data's and even from a different family.
arXiv Detail & Related papers (2022-03-02T13:59:20Z)
- Analyzing the impact of time-correlated noise on zero-noise extrapolation [0.879504058268139]
We investigate the feasibility and performance of zero-noise extrapolation in the presence of time-correlated noise.
Gate Trotterization is a new noise scaling technique that may be of independent interest.
arXiv Detail & Related papers (2022-01-27T20:17:22Z)
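For readers unfamiliar with the technique, here is a toy sketch of plain zero-noise extrapolation (not the paper's time-correlated analysis): expectation values are simulated at amplified noise levels under an assumed exponential-decay noise model and polynomially extrapolated to the zero-noise limit. The decay model and its parameters are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)

# Assumed toy noise model: the measured expectation value decays exponentially
# with the noise-scaling factor lam, E(lam) = E0 * exp(-gamma * lam).
E0, gamma = 0.85, 0.12

def noisy_expectation(lam, shots=10_000):
    """Shot-noise-limited estimate of a +-1 observable at noise scale lam."""
    p_plus = (1.0 + E0 * np.exp(-gamma * lam)) / 2.0
    outcomes = rng.random(shots) < p_plus
    return 2.0 * outcomes.mean() - 1.0

# Measure at amplified noise levels (in practice via unitary folding or, as in
# the entry above, gate Trotterization), then extrapolate to lam = 0.
lambdas = np.array([1.0, 2.0, 3.0])
estimates = np.array([noisy_expectation(lam) for lam in lambdas])
coeffs = np.polyfit(lambdas, estimates, deg=2)  # exact quadratic through 3 points
print("zero-noise extrapolated value:", np.polyval(coeffs, 0.0))
print("ideal (noiseless) value:      ", E0)
```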
- Analyzing and Improving the Optimization Landscape of Noise-Contrastive Estimation [50.85788484752612]
Noise-contrastive estimation (NCE) is a statistically consistent method for learning unnormalized probabilistic models.
It has been empirically observed that the choice of the noise distribution is crucial for NCE's performance.
In this work, we formally pinpoint reasons for NCE's poor performance when an inappropriate noise distribution is used.
arXiv Detail & Related papers (2021-10-21T16:57:45Z)
- Classical Shadows With Noise [0.0]
We study the effects of noise on the classical shadows protocol.
We derive an analytical upper bound for the sample complexity in terms of a shadow seminorm for both local and global noise.
Our results can be used to prove rigorous sample complexity upper bounds in the cases of depolarizing noise and amplitude damping.
arXiv Detail & Related papers (2020-11-23T17:43:42Z)
- Robust shadow estimation [1.7205106391379026]
We show how to mitigate errors in the shadow estimation protocol recently proposed by Huang, Kueng, and Preskill.
By adding an experimentally friendly calibration stage to the standard shadow estimation scheme, our robust shadow estimation algorithm can obtain an unbiased estimate of the classical shadow of a quantum system.
arXiv Detail & Related papers (2020-11-19T03:46:49Z)
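To illustrate the calibration idea in the two shadow entries above in a minimal setting, the sketch below runs single-qubit random-Pauli shadows under a bit-flip readout error. A calibration run on the known state |0><0| estimates the effective measurement-channel eigenvalue f (equal to 1/3 for ideal readout), and inverting with the calibrated f instead of 1/3 removes the bias. The bit-flip noise model and the single-qubit setting are simplifying assumptions; the actual protocol is considerably more general.

```python
import numpy as np

rng = np.random.default_rng(2)
p_flip = 0.08  # readout bit-flip probability, unknown to the estimator (assumption)

I = np.eye(2, dtype=complex)
X = np.array([[0, 1], [1, 0]], dtype=complex)
Y = np.array([[0, -1j], [1j, 0]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)
PAULIS = [X, Y, Z]

def noisy_measure(rho, W):
    """Sample a +-1 outcome of Pauli W on rho, then flip it with prob p_flip."""
    p_plus = np.real(np.trace(rho @ (I + W) / 2))
    b = 1 if rng.random() < p_plus else -1
    return -b if rng.random() < p_flip else b

def run_shadows(rho, n):
    """Collect (basis, outcome) pairs from random-Pauli measurements."""
    return [(W, noisy_measure(rho, W))
            for W in (PAULIS[i] for i in rng.integers(3, size=n))]

def estimate_Z(pairs, f):
    """Estimate <Z> by inverting the measurement channel with eigenvalue f."""
    return np.mean([b / f if W is Z else 0.0 for W, b in pairs])

# Calibration stage on the known state |0><0|: the same correlator that equals
# 1/3 for ideal readout evaluates to (1 - 2 * p_flip) / 3 under this noise.
rho0 = np.diag([1.0, 0.0]).astype(complex)
f_hat = np.mean([b * np.real(np.trace(rho0 @ W))
                 for W, b in run_shadows(rho0, 50_000)])

# Estimation stage on a test state.
theta = 1.1
psi = np.array([np.cos(theta / 2), np.sin(theta / 2)], dtype=complex)
rho = np.outer(psi, psi.conj())
data = run_shadows(rho, 50_000)
print("naive  (f = 1/3):  ", estimate_Z(data, 1.0 / 3.0))  # biased by (1 - 2p)
print("robust (f = f_hat):", estimate_Z(data, f_hat))      # approximately unbiased
print("exact <Z>:         ", np.real(np.trace(Z @ rho)))
```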
- Shape Matters: Understanding the Implicit Bias of the Noise Covariance [76.54300276636982]
Noise in gradient descent provides a crucial implicit regularization effect for training overparameterized models.
We show that parameter-dependent noise -- induced by mini-batches or label perturbation -- is far more effective than Gaussian noise.
Our analysis reveals that parameter-dependent noise introduces a bias towards local minima with smaller noise variance, whereas spherical Gaussian noise does not.
arXiv Detail & Related papers (2020-06-15T18:31:02Z)
- Contextual Linear Bandits under Noisy Features: Towards Bayesian Oracles [65.9694455739978]
We study contextual linear bandit problems under feature uncertainty, where the features are noisy and have missing entries.
Our analysis reveals that the optimal hypothesis can significantly deviate from the underlying realizability function, depending on the noise characteristics.
This implies that classical approaches cannot guarantee a non-trivial regret bound.
arXiv Detail & Related papers (2017-03-03T21:39:56Z)