Gate Set Tomography
- URL: http://arxiv.org/abs/2009.07301v2
- Date: Tue, 28 Sep 2021 18:31:54 GMT
- Title: Gate Set Tomography
- Authors: Erik Nielsen, John King Gamble, Kenneth Rudinger, Travis Scholten,
Kevin Young, Robin Blume-Kohout
- Abstract summary: Gate set tomography (GST) is a protocol for detailed, predictive characterization of logic operations (gates) on quantum computing processors.
This paper presents the foundations of GST in comprehensive detail.
- Score: 0.0
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Gate set tomography (GST) is a protocol for detailed, predictive
characterization of logic operations (gates) on quantum computing processors.
Early versions of GST emerged around 2012-13, and since then it has been
refined, demonstrated, and used in a large number of experiments. This paper
presents the foundations of GST in comprehensive detail. The most important
feature of GST, compared to older state and process tomography protocols, is
that it is calibration-free. GST does not rely on pre-calibrated state
preparations and measurements. Instead, it characterizes all the operations in
a gate set simultaneously and self-consistently, relative to each other.
Long-sequence GST can estimate gates with very high precision and efficiency,
achieving Heisenberg scaling in regimes of practical interest. In this paper,
we cover GST's intellectual history, the techniques and experiments used to
achieve its intended purpose, data analysis, gauge freedom and fixing, error
bars, and the interpretation of gauge-fixed estimates of gate sets. Our focus
is fundamental mathematical aspects of GST, rather than implementation details,
but we touch on some of the foundational algorithmic tricks used in the pyGSTi
implementation.
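As a brief illustration of the "self-consistent, relative to each other" and "gauge freedom" points above, a common way to write the gate-set model (in a superoperator / transfer-matrix notation that may differ from the paper's exact symbols) is:

```latex
% Gate set: preparation superket |rho>>, measurement effect <<E|, gates G_k as superoperators.
% Predicted outcome probability of a circuit s = (s_1, ..., s_L):
\[
  p(s) = \langle\!\langle E \mid G_{s_L} \cdots G_{s_1} \mid \rho \rangle\!\rangle .
\]
% Gauge freedom: for any invertible superoperator M, the transformed gate set
\[
  |\rho\rangle\!\rangle \mapsto M|\rho\rangle\!\rangle, \qquad
  \langle\!\langle E| \mapsto \langle\!\langle E| M^{-1}, \qquad
  G_k \mapsto M G_k M^{-1}
\]
% predicts the same p(s) for every circuit, so data fix the gate set only up to this
% transformation -- which is why gauge fixing is needed before interpreting estimates.
```

Since the abstract points to the pyGSTi implementation, the sketch below shows what a minimal single-qubit long-sequence GST run looks like in that package, following its public tutorial workflow. The model pack, maximum depth, noise levels, and sample counts are illustrative assumptions, and function names/signatures may differ between pyGSTi versions; this is not code taken from the paper.

```python
# Hedged sketch of a 1-qubit GST run with pyGSTi (tutorial-style API; treat exact
# names/signatures as assumptions that may vary across pyGSTi versions).
import pygsti
from pygsti.modelpacks import smq1Q_XYI  # built-in X(pi/2), Y(pi/2), idle gate set

# 1. Experiment design: fiducial + germ circuits up to a maximum depth (Heisenberg-like
#    precision scaling comes from repeating germs to long depths).
exp_design = smq1Q_XYI.create_gst_experiment_design(max_max_length=32)

# 2. Stand-in for real hardware data: simulate counts from a noisy copy of the target model.
mdl_datagen = smq1Q_XYI.target_model().depolarize(op_noise=0.01, spam_noise=0.001)
dataset = pygsti.data.simulate_data(mdl_datagen,
                                    exp_design.all_circuits_needing_data,
                                    num_samples=1000, seed=2020)

# 3. Run GST: estimate state prep, gates, and measurement together, self-consistently,
#    then gauge-optimize the estimate toward the target gate set (handled internally).
data = pygsti.protocols.ProtocolData(exp_design, dataset)
results = pygsti.protocols.StandardGST().run(data)
print(results)
```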
Related papers
- Microscopic parametrizations for gate set tomography under coloured noise [0.0]
We show that a microscopic parametrization of quantum gates under time-correlated noise on the driving phase reduces the required resources.
We discuss the minimal parametrizations of the gate set that include the effect of finite correlation times and non-Markovian quantum evolutions.
arXiv Detail & Related papers (2024-07-16T09:39:52Z) - Sparse is Enough in Fine-tuning Pre-trained Large Language Models [98.46493578509039]
We propose a gradient-based sparse fine-tuning algorithm, named Sparse Increment Fine-Tuning (SIFT).
We validate its effectiveness on a range of tasks including the GLUE Benchmark and Instruction-tuning.
arXiv Detail & Related papers (2023-12-19T06:06:30Z) - Near-Minimal Gate Set Tomography Experiment Designs [0.0]
We show how to streamline GST experiment designs by removing almost all redundancy.
We do this by analyzing the "germ" subroutines at the heart of GST circuits.
New experiment designs can match the precision of previous GST experiments with significantly fewer circuits.
arXiv Detail & Related papers (2023-08-17T04:46:25Z) - Two-Qubit Gate Set Tomography with Fewer Circuits [0.0]
We show how to exploit the structure of GST circuits to determine which ones are superfluous.
We also explore the impact of these techniques on the prospects of three-qubit GST.
arXiv Detail & Related papers (2023-07-28T18:52:34Z) - Learning Large Graph Property Prediction via Graph Segment Training [61.344814074335304]
We propose a general framework that allows learning large graph property prediction with a constant memory footprint.
We refine the GST paradigm by introducing a historical embedding table to efficiently obtain embeddings for segments not sampled for backpropagation.
Our experiments show that GST-EFD is both memory-efficient and fast, while offering a slight boost on test accuracy over a typical full graph training regime.
arXiv Detail & Related papers (2023-05-21T02:53:25Z) - From Gradient Flow on Population Loss to Learning with Stochastic
Gradient Descent [50.4531316289086]
Stochastic Gradient Descent (SGD) has been the method of choice for learning large-scale non-convex models.
An overarching problem is to provide general conditions under which SGD converges, assuming that gradient flow (GF) on the population loss converges.
We provide a unified analysis for GD/SGD not only for classical settings like convex losses, but also for more complex problems including phase retrieval and matrix square root.
arXiv Detail & Related papers (2022-10-13T03:55:04Z) - Efficient characterization of qudit logical gates with gate set tomography using an error-free Virtual-Z-gate model [0.0]
We propose a more efficient GST approach for qudits, utilizing the qudit Hadamard and virtual Z gates to construct fiducials.
Our method reduces the computational costs of estimating characterization results, making GST more practical at scale.
arXiv Detail & Related papers (2022-10-10T17:20:25Z) - Tight Cram\'{e}r-Rao type bounds for multiparameter quantum metrology
through conic programming [61.98670278625053]
It is paramount to have practical measurement strategies that can estimate incompatible parameters with the best possible precision.
Here, we give a concrete way to find uncorrelated measurement strategies with optimal precisions.
We show numerically that there is a strict gap between the previous efficiently computable bounds and the ultimate precision bound.
arXiv Detail & Related papers (2022-09-12T13:06:48Z) - Learning Structures in Earth Observation Data with Gaussian Processes [67.27044745471207]
This paper reviews the main theoretical GP developments in the field.
New algorithms that respect the signal and noise characteristics, that provide feature rankings automatically, and that provide associated uncertainty intervals are discussed.
arXiv Detail & Related papers (2020-12-22T10:46:37Z) - Efficient and Stable Graph Scattering Transforms via Pruning [86.76336979318681]
Graph scattering transforms (GSTs) offer training-free deep GCN models that extract features from graph data.
The price paid by GSTs is exponential complexity in space and time that increases with the number of layers.
The present work addresses the complexity limitation of GSTs by introducing an efficient so-termed pruned (p) GST approach.
arXiv Detail & Related papers (2020-01-27T16:05:56Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of this content (including all information) and is not responsible for any consequences.