Randomised benchmarking for characterizing and forecasting correlated
processes
- URL: http://arxiv.org/abs/2312.06062v1
- Date: Mon, 11 Dec 2023 01:55:44 GMT
- Authors: Xinfang Zhang, Zhihao Wu, Gregory A. L. White, Zhongcheng Xiang, Shun
Hu, Zhihui Peng, Yong Liu, Dongning Zheng, Xiang Fu, Anqi Huang, Dario
Poletti, Kavan Modi, Junjie Wu, Mingtang Deng, Chu Guo
- Abstract summary: We develop a method to learn the details of temporally correlated noise.
In particular, we can learn the time-independent evolution operator of system plus bath.
We exemplify this by implementing our method on a superconducting quantum processor.
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: The development of fault-tolerant quantum processors relies on the ability to
control noise. A particularly insidious form of noise is temporally correlated
or non-Markovian noise. By combining randomized benchmarking with supervised
machine learning algorithms, we develop a method to learn the details of
temporally correlated noise. In particular, we can learn the time-independent
evolution operator of system plus bath and this leads to (i) the ability to
characterize the degree of non-Markovianity of the dynamics and (ii) the
ability to predict the dynamics of the system even beyond the times we have
used to train our model. We exemplify this by implementing our method on a
superconducting quantum processor. Our experimental results show a drastic
change between the Markovian and non-Markovian regimes for the learning
accuracies.
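As a rough illustration of the characterize-then-forecast idea (a hypothetical sketch, not the authors' pipeline; the decay model, parameters, and noise level below are all invented for the toy), one can fit the standard RB decay F(m) = A·p^m + B to simulated sequence fidelities and extrapolate beyond the sequence lengths used for fitting:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy "measured" data: survival probability vs. sequence length m,
# following the standard RB decay model F(m) = A * p**m + B.
A, B, p = 0.5, 0.5, 0.98
m_all = np.arange(1, 301)
f_all = A * p**m_all + B + rng.normal(0.0, 1e-3, m_all.size)

# Estimate the plateau B from long sequences, then fit the decay rate p
# by linear regression on log(F - B_est): the supervised-learning step.
B_est = f_all[-20:].mean()
y_fit = np.log(np.clip(f_all[:100] - B_est, 1e-6, None))
slope, intercept = np.polyfit(m_all[:100], y_fit, 1)
p_est = np.exp(slope)

# Forecast: predict fidelities at sequence lengths beyond the fit range.
m_test = np.arange(101, 201)
f_pred = np.exp(intercept) * p_est**m_test + B_est
```

The actual method learns the full system-plus-bath evolution operator rather than a single exponential, which is what enables forecasting in the non-Markovian regime; the sketch only conveys the fit-then-extrapolate structure.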
Related papers
- Stochastic action for the entanglement of a noisy monitored two-qubit
system [55.2480439325792]
We study the effect of local unitary noise on the entanglement evolution of a two-qubit system subject to local monitoring and inter-qubit coupling.
We construct a Hamiltonian by incorporating the noise into the Chantasri-Dressel-Jordan path integral and use it to identify the optimal entanglement dynamics.
Numerical investigation of long-time steady-state entanglement reveals a non-monotonic relationship between concurrence and noise strength.
arXiv Detail & Related papers (2024-03-13T11:14:10Z)
- Learning noise-induced transitions by multi-scaling reservoir computing [2.9170682727903863]
We develop a machine learning model, reservoir computing as a type of recurrent neural network, to learn noise-induced transitions.
The trained model generates accurate statistics of transition time and the number of transitions.
It is also aware of the asymmetry of the double-well potential, the rotational dynamics caused by non-detailed balance, and transitions in multi-stable systems.
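A minimal echo-state-network sketch of the general approach (hypothetical; the paper's multi-scaling architecture and hyperparameters are not reproduced here): a fixed random reservoir with a trained linear readout performs one-step-ahead prediction of a noisy double-well trajectory.

```python
import numpy as np

rng = np.random.default_rng(1)

# Noisy double-well trajectory via Euler-Maruyama: dx = (x - x^3) dt + s dW.
dt, s, n = 0.01, 0.3, 3000
x = np.empty(n)
x[0] = 1.0
for t in range(n - 1):
    x[t + 1] = x[t] + (x[t] - x[t]**3) * dt + s * np.sqrt(dt) * rng.normal()

# Fixed random reservoir; only the linear readout W_out is trained.
N = 200
W_in = rng.uniform(-0.5, 0.5, N)
W = rng.normal(0.0, 1.0, (N, N))
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))   # set spectral radius to 0.9

states = np.zeros((n, N))
for t in range(1, n):
    states[t] = np.tanh(W @ states[t - 1] + W_in * x[t - 1])

# Ridge-regression readout: predict x[t] from the reservoir state at t.
warm = 100                                        # discard initial transient
X_feat, y = states[warm:], x[warm:]
W_out = np.linalg.solve(X_feat.T @ X_feat + 1e-6 * np.eye(N), X_feat.T @ y)
pred = X_feat @ W_out
```

The reservoir weights stay fixed; training reduces to a single linear solve, which is what makes this approach cheap enough for long noisy time series.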
arXiv Detail & Related papers (2023-09-11T12:26:36Z)
- Provable Guarantees for Generative Behavior Cloning: Bridging Low-Level Stability and High-Level Behavior [51.60683890503293]
We propose a theoretical framework for studying behavior cloning of complex expert demonstrations using generative modeling.
We show that pure supervised cloning can generate trajectories matching the per-time-step distribution of arbitrary expert trajectories.
arXiv Detail & Related papers (2023-07-27T04:27:26Z)
- Robustness of quantum reinforcement learning under hardware errors [0.0]
Variational quantum machine learning algorithms have become the focus of recent research on how to utilize near-term quantum devices for machine learning tasks.
They are considered suitable for this because the circuits can be tailored to the device, and a large part of the computation is delegated to a classical computer.
However, the effect of training quantum machine learning models under the influence of hardware-induced noise has not yet been extensively studied.
arXiv Detail & Related papers (2022-12-19T13:14:22Z)
- Stabilizing Machine Learning Prediction of Dynamics: Noise and Noise-inspired Regularization [58.720142291102135]
Recent work has shown that machine learning (ML) models can be trained to accurately forecast the dynamics of chaotic dynamical systems.
In the absence of mitigating techniques, this approach can result in artificially rapid error growth, leading to inaccurate predictions and/or climate instability.
We introduce Linearized Multi-Noise Training (LMNT), a regularization technique that deterministically approximates the effect of many small, independent noise realizations added to the model input during training.
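The core idea, replacing many small input-noise realizations by a deterministic regularizer, can be illustrated for a linear model, where noisy-input least squares is classically equivalent in expectation to ridge regression. This sketch is not LMNT itself; the problem, dimensions, and noise level are invented.

```python
import numpy as np

rng = np.random.default_rng(4)

# Toy supervised problem: y = X @ w_true + small output noise.
n, d, sigma = 500, 5, 0.1
X = rng.normal(size=(n, d))
w_true = np.array([1.0, -2.0, 0.5, 0.0, 3.0])
y = X @ w_true + 0.01 * rng.normal(size=n)

# Stochastic version: least squares over many noisy-input copies of the data.
n_copies = 200
X_aug = np.vstack([X + sigma * rng.normal(size=X.shape)
                   for _ in range(n_copies)])
y_aug = np.tile(y, n_copies)
w_noisy = np.linalg.lstsq(X_aug, y_aug, rcond=None)[0]

# Deterministic equivalent: ridge regression with penalty n * sigma**2,
# which is what the expected noisy-input objective reduces to.
w_ridge = np.linalg.solve(X.T @ X + n * sigma**2 * np.eye(d), X.T @ y)
```

The two solutions agree up to Monte Carlo error; LMNT generalizes this kind of deterministic replacement to the nonlinear ML forecast models used for chaotic dynamics.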
arXiv Detail & Related papers (2022-11-09T23:40:52Z)
- Characterizing low-frequency qubit noise [55.41644538483948]
Fluctuations of the qubit frequencies are one of the major problems to overcome on the way to scalable quantum computers.
The statistics of the fluctuations can be characterized by measuring the correlators of the outcomes of periodically repeated Ramsey measurements.
This work suggests a method that allows describing qubit dynamics during repeated measurements in the presence of evolving noise.
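A hedged toy version of the correlator measurement (not the paper's protocol; the drift model and all parameters are invented): simulate repeated Ramsey shots under a slowly drifting detuning and compute two-point correlators of the binary outcomes at several lags.

```python
import numpy as np

rng = np.random.default_rng(2)

# Slowly drifting detuning: AR(1) process with a ~1000-shot memory time.
n_shots, t_ramsey = 20000, 1.0
delta = np.empty(n_shots)
delta[0] = 0.0
for t in range(n_shots - 1):
    delta[t + 1] = 0.999 * delta[t] + 0.05 * rng.normal()

# Each Ramsey shot returns 1 with probability (1 + cos(delta * t_R)) / 2.
p1 = 0.5 * (1.0 + np.cos(delta * t_ramsey))
shots = (rng.random(n_shots) < p1).astype(float)

# Two-point correlator of outcomes at lag k probes the noise correlations.
def correlator(x, k):
    x0 = x - x.mean()
    return np.mean(x0[:-k] * x0[k:])

lags = [1, 10, 100, 1000]
corrs = [correlator(shots, k) for k in lags]
```

The decay of the correlator with lag reflects the correlation time of the underlying frequency fluctuations, which is the quantity such measurements aim to extract.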
arXiv Detail & Related papers (2022-07-04T22:48:43Z)
- DriPP: Driven Point Processes to Model Stimuli Induced Patterns in M/EEG Signals [62.997667081978825]
We develop a novel statistical point process model called driven temporal point processes (DriPP).
We derive a fast and principled expectation-maximization (EM) algorithm to estimate the parameters of this model.
Results on standard MEG datasets demonstrate that our methodology reveals event-related neural responses.
arXiv Detail & Related papers (2021-12-08T13:07:21Z)
- Experimental characterisation of a non-Markovian quantum process [0.0]
We employ machine learning models to estimate the amount of non-Markovianity.
We are able to predict the non-Markovianity measure with 90% accuracy.
Our experiment paves the way for efficient detection of non-Markovian noise appearing in large scale quantum computers.
arXiv Detail & Related papers (2021-02-02T06:00:04Z)
- Automatic Differentiation to Simultaneously Identify Nonlinear Dynamics and Extract Noise Probability Distributions from Data [4.996878640124385]
SINDy is a framework for the discovery of parsimonious dynamic models and equations from time-series data.
We develop a variant of the SINDy algorithm that integrates automatic differentiation and the recent time-stepping constraints of Rudy et al.
We show the method can identify a diversity of probability distributions including Gaussian, uniform, Gamma, and Rayleigh.
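A minimal SINDy-style sketch (not the authors' implementation; the system, library, and threshold are invented for illustration): sequential thresholded least squares over a polynomial library recovers dx/dt = x - x^3 from simulated trajectories.

```python
import numpy as np

# Simulate dx/dt = x - x^3 from several initial conditions (forward Euler).
dt = 0.001
trajs = []
for x0 in (-2.0, -0.5, 0.5, 2.0):
    x = np.empty(2000)
    x[0] = x0
    for t in range(len(x) - 1):
        x[t + 1] = x[t] + (x[t] - x[t]**3) * dt
    trajs.append(x)

x_all = np.concatenate(trajs)
dx_all = np.concatenate([np.gradient(x, dt) for x in trajs])

# Candidate function library: [1, x, x^2, x^3].
theta = np.column_stack([np.ones_like(x_all), x_all, x_all**2, x_all**3])

# Sequential thresholded least squares (STLSQ): fit, zero small
# coefficients, refit on the surviving columns, repeat.
xi = np.linalg.lstsq(theta, dx_all, rcond=None)[0]
for _ in range(10):
    small = np.abs(xi) < 0.1
    xi[small] = 0.0
    xi[~small] = np.linalg.lstsq(theta[:, ~small], dx_all, rcond=None)[0]
```

The fitted coefficient vector xi should be sparse, with the x and x^3 entries near 1 and -1; the paper's contribution is to learn the derivative and noise distribution jointly via automatic differentiation rather than via a fixed numerical gradient.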
arXiv Detail & Related papers (2020-09-12T23:52:25Z)
- Multiplicative noise and heavy tails in stochastic optimization [62.993432503309485]
Stochastic optimization is central to modern machine learning, but the precise role of the stochasticity in its success is still unclear.
We show that heavy tails commonly arise in the optimized parameters due to multiplicative noise.
A detailed analysis is conducted describing how key factors, including step size and data, affect this behaviour, with state-of-the-art neural network models exhibiting similar results.
arXiv Detail & Related papers (2020-06-11T09:58:01Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences of its use.