qecGPT: decoding Quantum Error-correcting Codes with Generative
Pre-trained Transformers
- URL: http://arxiv.org/abs/2307.09025v1
- Date: Tue, 18 Jul 2023 07:34:02 GMT
- Title: qecGPT: decoding Quantum Error-correcting Codes with Generative
Pre-trained Transformers
- Authors: Hanyan Cao, Feng Pan, Yijia Wang, Pan Zhang
- Abstract summary: We propose a framework for decoding quantum error-correcting codes with generative modeling.
We use autoregressive neural networks, specifically Transformers, to learn the joint probability of logical operators and syndromes.
Our framework is general and can be applied to any error model and quantum codes with different topologies.
- Score: 5.392298820599664
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: We propose a general framework for decoding quantum error-correcting codes
with generative modeling. The model utilizes autoregressive neural networks,
specifically Transformers, to learn the joint probability of logical operators
and syndromes. The training is unsupervised, requiring no labeled training
data, and is thus referred to as pre-training. After the
pre-training, the model can efficiently compute the likelihood of logical
operators for any given syndrome, using maximum likelihood decoding. It can
directly generate the most-likely logical operators with computational
complexity $\mathcal O(2k)$ in the number of logical qubits $k$, which is
significantly better than the conventional maximum likelihood decoding
algorithms that require $\mathcal O(4^k)$ computation. Based on the pre-trained
model, we further propose a refinement that estimates the likelihood of
logical operators for a given syndrome more accurately by directly sampling
stabilizer operators. We perform numerical experiments on stabilizer codes with small code
distances, using both depolarizing error models and error models with
correlated noise. The results show that our approach provides significantly
better decoding accuracy than the minimum weight perfect matching and
belief-propagation-based algorithms. Our framework is general and can be
applied to any error model and quantum codes with different topologies such as
surface codes and quantum LDPC codes. Furthermore, it leverages the
parallelization capabilities of GPUs, enabling simultaneous decoding of a large
number of syndromes. Our approach sheds light on the efficient and accurate
decoding of quantum error-correcting codes using generative artificial
intelligence and modern computational power.
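The claimed $\mathcal O(2k)$ versus $\mathcal O(4^k)$ gap can be illustrated with a toy sketch (not the paper's code): an autoregressive decoder fixes the $2k$ binary components of a logical operator one at a time, each conditioned on the syndrome and the bits already chosen, whereas conventional maximum likelihood decoding scores all $4^k$ logical classes. All names below (`p1`, `greedy_decode`, `exhaustive_decode`, `joint_logp`) are hypothetical, and `p1` is a deterministic stand-in for the pre-trained Transformer's conditional distribution.

```python
import itertools
import math

def p1(prefix, syndrome):
    """Toy stand-in for the autoregressive conditional p(next bit = 1 |
    bits generated so far, syndrome). A real decoder would query the
    pre-trained Transformer here; this hash-based surrogate merely returns
    a deterministic probability in (0, 1)."""
    h = hash((tuple(prefix), tuple(syndrome))) % 1000
    return 0.05 + 0.9 * h / 999.0

def joint_logp(bits, syndrome):
    """Log-probability of a full 2k-bit logical operator via the chain rule."""
    lp = 0.0
    for i, b in enumerate(bits):
        p = p1(bits[:i], syndrome)
        lp += math.log(p if b == 1 else 1.0 - p)
    return lp

def greedy_decode(syndrome, k):
    """O(2k) model evaluations: fix one of the 2k components at a time,
    taking the locally most likely value of each conditional."""
    bits = []
    for _ in range(2 * k):  # k logical qubits -> 2k binary components
        bits.append(1 if p1(bits, syndrome) > 0.5 else 0)
    return bits

def exhaustive_decode(syndrome, k):
    """Conventional MLD baseline: score all 4**k = 2**(2k) logical classes
    and return the one with the highest joint probability."""
    candidates = itertools.product([0, 1], repeat=2 * k)
    return list(max(candidates, key=lambda c: joint_logp(list(c), syndrome)))
```

Note the caveat: greedy generation maximizes each conditional in turn, while the exhaustive baseline maximizes the joint, so the exhaustive score is always at least as high; the sketch only illustrates the cost difference, 2k conditional evaluations versus 2^(2k) candidate scores.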
Related papers
- Algorithmic Fault Tolerance for Fast Quantum Computing [37.448838730002905]
We show that fault-tolerant logical operations can be performed with constant time overhead for a broad class of quantum codes.
We prove that the deviation from the ideal measurement result distribution can be made exponentially small in the code distance.
Our work sheds new light on the theory of fault tolerance, potentially reducing the space-time cost of practical fault-tolerant quantum computation by orders of magnitude.
arXiv Detail & Related papers (2024-06-25T15:43:25Z) - Transformer-QEC: Quantum Error Correction Code Decoding with
Transferable Transformers [18.116657629047253]
We introduce a transformer-based Quantum Error Correction (QEC) decoder.
It employs self-attention to achieve a global receptive field across all input syndromes.
It incorporates a mixed loss training approach, combining both local physical error and global parity label losses.
arXiv Detail & Related papers (2023-11-27T18:52:25Z) - Testing the Accuracy of Surface Code Decoders [55.616364225463066]
Large-scale, fault-tolerant quantum computations will be enabled by quantum error-correcting codes (QECC).
This work presents the first systematic technique to test the accuracy and effectiveness of different QECC decoding schemes.
arXiv Detail & Related papers (2023-11-21T10:22:08Z) - CORE: Common Random Reconstruction for Distributed Optimization with
Provable Low Communication Complexity [110.50364486645852]
Communication complexity has become a major bottleneck for speeding up training and scaling up the number of machines.
We propose Common Random Reconstruction (CORE), which can be used to compress information transmitted between machines.
arXiv Detail & Related papers (2023-09-23T08:45:27Z) - Data-driven decoding of quantum error correcting codes using graph
neural networks [0.0]
We explore a model-free, data-driven approach to decoding, using a graph neural network (GNN).
We show that the GNN-based decoder can outperform a matching decoder for circuit level noise on the surface code given only simulated data.
The results show that a purely data-driven approach to decoding may be a viable future option for practical quantum error correction.
arXiv Detail & Related papers (2023-07-03T17:25:45Z) - The END: An Equivariant Neural Decoder for Quantum Error Correction [73.4384623973809]
We introduce a data efficient neural decoder that exploits the symmetries of the problem.
We propose a novel equivariant architecture that achieves state of the art accuracy compared to previous neural decoders.
arXiv Detail & Related papers (2023-04-14T19:46:39Z) - Deep Quantum Error Correction [73.54643419792453]
Quantum error correction codes (QECC) are a key component for realizing the potential of quantum computing.
In this work, we efficiently train novel end-to-end deep quantum error decoders.
The proposed method demonstrates the power of neural decoders for QECC by achieving state-of-the-art accuracy.
arXiv Detail & Related papers (2023-01-27T08:16:26Z) - A Stable, Fast, and Fully Automatic Learning Algorithm for Predictive
Coding Networks [65.34977803841007]
Predictive coding networks are neuroscience-inspired models with roots in both Bayesian statistics and neuroscience.
We show how simply changing the temporal scheduling of the update rule for the synaptic weights leads to an algorithm that is much more efficient and stable than the original one.
arXiv Detail & Related papers (2022-11-16T00:11:04Z) - Quantum Sparse Coding [5.130440339897477]
We develop a quantum-inspired algorithm for sparse coding.
The emergence of quantum computers and Ising machines can potentially lead to more accurate estimations.
We conduct numerical experiments with simulated data on LightSolver's quantum-inspired digital platform.
arXiv Detail & Related papers (2022-09-08T13:00:30Z) - Predictive Coding Approximates Backprop along Arbitrary Computation
Graphs [68.8204255655161]
We develop a strategy to translate core machine learning architectures into their predictive coding equivalents.
Our models perform equivalently to backprop on challenging machine learning benchmarks.
Our method raises the potential that standard machine learning algorithms could in principle be directly implemented in neural circuitry.
arXiv Detail & Related papers (2020-06-07T15:35:47Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the content (including all information) and is not responsible for any consequences.