Error-correction and noise-decoherence thresholds for coherent errors in
planar-graph surface codes
- URL: http://arxiv.org/abs/2006.13055v1
- Date: Tue, 23 Jun 2020 14:29:25 GMT
- Title: Error-correction and noise-decoherence thresholds for coherent errors in
planar-graph surface codes
- Authors: F. Venn and B. Béri
- Abstract summary: We numerically study coherent errors in surface codes on planar graphs.
In particular, we show that a graph class exists where logical-level noise exhibits a decoherence threshold slightly above the error-correction threshold.
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: We numerically study coherent errors in surface codes on planar graphs,
focusing on noise of the form of $Z$- or $X$-rotations of individual qubits. We
find that, similarly to the case of incoherent bit- and phase-flips, a
trade-off between resilience against coherent $X$- and $Z$-rotations can be
made via the connectivity of the graph. However, our results indicate that,
unlike in the incoherent case, the error-correction thresholds for the various
graphs do not approach a universal bound. We also study the distribution of
final states after error correction. We show that graphs fall into three
distinct classes, each resulting in qualitatively distinct final-state
distributions. In particular, we show that a graph class exists where the
logical-level noise exhibits a decoherence threshold slightly above the
error-correction threshold. In these classes, therefore, the logical-level
noise above the error-correction threshold can retain a significant amount of
coherence even for large-distance codes. To perform our analysis, we develop a
Majorana-fermion representation of planar-graph surface codes and describe the
characterization of logical-state storage using fermion-linear-optics-based
simulations. We thereby generalize the approach introduced for the square
lattice by Bravyi et al. [npj Quantum Inf. 4, 55 (2018)] to surface
codes on general planar graphs.
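The coherent noise studied above consists of unitary $Z$-rotations on individual qubits, in contrast to incoherent phase flips. A minimal numerical sketch (assuming the standard convention $R_z(\theta) = e^{-i\theta Z/2}$; the angle and test state are illustrative, not from the paper) shows the standard relation between the two: Pauli-twirling a coherent $Z$-rotation yields an incoherent phase-flip channel with flip probability $\sin^2(\theta/2)$, which for a single qubit equals the infidelity of the rotated $|+\rangle$ state.

```python
import numpy as np

# Single-qubit coherent Z-rotation: R_z(theta) = exp(-i * theta * Z / 2)
Z = np.diag([1.0, -1.0])

def rz(theta):
    return np.cos(theta / 2) * np.eye(2) - 1j * np.sin(theta / 2) * Z

theta = 0.3
plus = np.array([1.0, 1.0]) / np.sqrt(2)  # |+> state, sensitive to Z errors

# Overlap of the rotated state with |+>: |<+|R_z|+>|^2 = cos^2(theta/2)
fidelity = np.abs(plus.conj() @ rz(theta) @ plus) ** 2

# Pauli-twirling the rotation gives an incoherent phase flip with
# probability sin^2(theta/2); the infidelity of |+> matches it exactly.
p_flip = np.sin(theta / 2) ** 2
print(fidelity, p_flip)  # fidelity + p_flip == 1 for a single qubit
```

The coherence the paper tracks is precisely what this twirled (incoherent) description discards, which is why the decoherence and error-correction thresholds can differ.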
Related papers
- What Improves the Generalization of Graph Transformers? A Theoretical Dive into the Self-attention and Positional Encoding [67.59552859593985]
Graph Transformers, which incorporate self-attention and positional encoding, have emerged as a powerful architecture for various graph learning tasks.
This paper introduces the first theoretical investigation of a shallow Graph Transformer for semi-supervised classification.
arXiv Detail & Related papers (2024-06-04T05:30:16Z)
- Analysis of Corrected Graph Convolutions [10.991475575578855]
State-of-the-art machine learning models often use multiple graph convolutions on the data.
We show that too many graph convolutions can degrade performance significantly, a phenomenon known as oversmoothing.
We show that each round of convolution can reduce the misclassification error exponentially up to a saturation level.
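The oversmoothing effect summarized above can be sketched with plain NumPy (a toy example; the graph, features, and iteration count are illustrative): repeated application of a row-normalized propagation matrix drives all node features toward a common constant, making nodes indistinguishable.

```python
import numpy as np

# Toy path graph on 4 nodes
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
A_hat = A + np.eye(4)                         # add self-loops
P = A_hat / A_hat.sum(axis=1, keepdims=True)  # row-normalized propagation

x = np.array([1.0, 0.0, 0.0, 1.0])  # initial scalar node features
for _ in range(200):                # many convolution rounds
    x = P @ x

# The feature spread across nodes collapses to ~0: node features become
# nearly identical, which is what degrades classification accuracy.
print(np.ptp(x))
```

Since `P` is row-stochastic and the graph is connected with self-loops, `P**k` converges to a rank-one projector, so the node features flatten exponentially fast in the spectral gap.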
arXiv Detail & Related papers (2024-05-22T20:50:17Z)
- Graph Out-of-Distribution Generalization with Controllable Data Augmentation [51.17476258673232]
Graph Neural Network (GNN) has demonstrated extraordinary performance in classifying graph properties.
Due to the selection bias of training and testing data, distribution deviation is widespread.
We propose OOD calibration to measure the distribution deviation of virtual samples.
arXiv Detail & Related papers (2023-08-16T13:10:27Z)
- Addressing Heterophily in Node Classification with Graph Echo State Networks [11.52174067809364]
We address the challenges of heterophilic graphs with Graph Echo State Network (GESN) for node classification.
GESN is a reservoir computing model for graphs, where node embeddings are computed by an untrained message-passing function.
Our experiments show that reservoir models are able to achieve better or comparable accuracy with respect to most fully trained deep models.
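The untrained message-passing idea behind GESN can be sketched as follows (a minimal toy reservoir under assumed sizes and scaling, not the authors' implementation): input and recurrent weights are drawn at random and left untrained, the recurrent weights are rescaled so the update is contractive, and the state update is iterated to a fixed point; only a linear readout on the resulting embeddings would be trained.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy ring graph on 5 nodes, row-normalized adjacency
A = np.roll(np.eye(5), 1, axis=1) + np.roll(np.eye(5), -1, axis=1)
A = A / A.sum(axis=1, keepdims=True)

n_feat, n_res = 3, 16
W_in = rng.uniform(-1, 1, size=(n_res, n_feat))  # fixed random input weights
W = rng.uniform(-1, 1, size=(n_res, n_res))      # fixed random recurrent weights
W *= 0.5 / np.linalg.norm(W, 2)                  # rescale: contractive update

U = rng.normal(size=(5, n_feat))   # node input features
X = np.zeros((5, n_res))           # reservoir states (node embeddings)
for _ in range(50):                # iterate the untrained message-passing map
    X = np.tanh(U @ W_in.T + A @ X @ W.T)
```

Because `tanh` is 1-Lipschitz and the spectral norms of `A` and `W` are at most 1 and 0.5, the update is a contraction, so the embeddings converge to a unique fixed point regardless of initialization.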
arXiv Detail & Related papers (2023-05-14T19:42:31Z)
- Graph Signal Sampling for Inductive One-Bit Matrix Completion: a Closed-form Solution [112.3443939502313]
We propose a unified graph signal sampling framework which enjoys the benefits of graph signal analysis and processing.
The key idea is to transform each user's ratings on the items to a function (signal) on the vertices of an item-item graph.
For the online setting, we develop a Bayesian extension, i.e., BGS-IMC which considers continuous random Gaussian noise in the graph Fourier domain.
arXiv Detail & Related papers (2023-02-08T08:17:43Z)
- Joint graph learning from Gaussian observations in the presence of hidden nodes [26.133725549667734]
We propose a joint graph learning method that takes into account the presence of hidden (latent) variables.
We exploit the structure resulting from the previous considerations to propose a convex optimization problem.
We compare the proposed algorithm with different baselines and evaluate its performance over synthetic and real-world graphs.
arXiv Detail & Related papers (2022-12-04T13:03:41Z)
- Semi-Supervised Clustering of Sparse Graphs: Crossing the Information-Theoretic Threshold [3.6052935394000234]
The block model is a canonical random graph model for clustering and community detection on network-structured data.
No estimator based on the network topology can perform substantially better than chance on sparse graphs if the model parameter is below a certain threshold.
We prove that, with an arbitrary fraction of the labels revealed, recovery is feasible throughout the parameter domain.
arXiv Detail & Related papers (2022-05-24T00:03:25Z)
- Optimal Propagation for Graph Neural Networks [51.08426265813481]
We propose a bi-level optimization approach for learning the optimal graph structure.
We also explore a low-rank approximation model for further reducing the time complexity.
arXiv Detail & Related papers (2022-05-06T03:37:00Z)
- Graphon-aided Joint Estimation of Multiple Graphs [24.077455621015552]
We consider the problem of estimating the topology of multiple networks from nodal observations.
We adopt a graphon as our random graph model, which is a nonparametric model from which graphs of potentially different sizes can be drawn.
arXiv Detail & Related papers (2022-02-11T15:20:44Z)
- Deep Graph-level Anomaly Detection by Glocal Knowledge Distillation [61.39364567221311]
Graph-level anomaly detection (GAD) describes the problem of detecting graphs that are abnormal in their structure and/or the features of their nodes.
One of the challenges in GAD is to devise graph representations that enable the detection of both locally- and globally-anomalous graphs.
We introduce a novel deep anomaly detection approach for GAD that learns rich global and local normal pattern information by joint random distillation of graph and node representations.
arXiv Detail & Related papers (2021-12-19T05:04:53Z)
- Spatial-spectral Hyperspectral Image Classification via Multiple Random Anchor Graphs Ensemble Learning [88.60285937702304]
This paper proposes a novel spatial-spectral HSI classification method via multiple random anchor graphs ensemble learning (RAGE).
Firstly, the local binary pattern is adopted to extract more descriptive features on each selected band, which preserves local structures and subtle changes of a region.
Secondly, the adaptive neighbors assignment is introduced in the construction of anchor graph, to reduce the computational complexity.
arXiv Detail & Related papers (2021-03-25T09:31:41Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of this content (including all information) and is not responsible for any consequences of its use.