General tensor network decoding of 2D Pauli codes
- URL: http://arxiv.org/abs/2101.04125v3
- Date: Wed, 13 Oct 2021 17:00:26 GMT
- Title: General tensor network decoding of 2D Pauli codes
- Authors: Christopher T. Chubb
- Abstract summary: We propose a decoder that approximates maximum likelihood decoding for 2D stabiliser and subsystem codes subject to Pauli noise.
We numerically demonstrate the power of this decoder by studying four classes of codes under three noise models.
We show that the thresholds yielded by our decoder are state-of-the-art, and numerically consistent with optimal thresholds where available.
- Score: 0.0
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: In this work we develop a general tensor network decoder for 2D codes.
Specifically, we propose a decoder that approximates maximum likelihood
decoding for 2D stabiliser and subsystem codes subject to Pauli noise. For a
code consisting of $n$ qubits our decoder has a runtime of $O(n\log
n+n\chi^3)$, where $\chi$ is an approximation parameter. We numerically
demonstrate the power of this decoder by studying four classes of codes under
three noise models, namely regular surface codes, irregular surface codes,
subsystem surface codes and colour codes, under bit-flip, phase-flip and
depolarising noise. We show that the thresholds yielded by our decoder are
state-of-the-art, and numerically consistent with optimal thresholds where
available, suggesting that the tensor network decoder well approximates optimal
decoding in all these cases. Novel to our decoder is an efficient and effective
approximate contraction scheme for arbitrary 2D tensor networks, which may be
of independent interest. We have also released an implementation of this
algorithm as a stand-alone Julia package: SweepContractor.jl.
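The $\chi^3$ term in the runtime comes from the cost of compressing bonds during the approximate contraction, where at most $\chi$ singular values are kept at each step. As a rough illustration (not the SweepContractor.jl implementation; the function name `truncate_bond` and its interface are invented here), the standard SVD truncation step behind such $\chi$-bounded schemes looks like:

```python
import numpy as np

def truncate_bond(theta, chi):
    """Split a two-site tensor and truncate the connecting bond to at
    most chi, keeping the chi largest singular values.  The discarded
    singular values give the Frobenius-norm truncation error."""
    u, s, vh = np.linalg.svd(theta, full_matrices=False)
    k = min(chi, len(s))
    err = np.sqrt(np.sum(s[k:] ** 2))
    return u[:, :k], s[:k], vh[:k, :], err

# Toy example: compress a random 8x8 block down to bond dimension 4.
rng = np.random.default_rng(0)
theta = rng.standard_normal((8, 8))
u, s, vh, err = truncate_bond(theta, chi=4)
```

Since the SVD of a $\chi \times \chi$ matrix costs $O(\chi^3)$, repeating this step across the network yields the $n\chi^3$ contribution to the stated runtime.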
Related papers
- Learning Linear Block Error Correction Codes [62.25533750469467]
We propose for the first time a unified encoder-decoder training of binary linear block codes.
We also propose a novel Transformer model in which the self-attention masking is performed in a differentiable fashion for the efficient backpropagation of the code gradient.
arXiv Detail & Related papers (2024-05-07T06:47:12Z) - Bit-flipping Decoder Failure Rate Estimation for (v,w)-regular Codes [84.0257274213152]
We propose a new technique to provide accurate estimates of the DFR of a two-iteration (parallel) bit-flipping decoder.
We validate our results, providing comparisons of the modeled and simulated weight of the syndrome, the incorrectly-guessed error bit distribution at the end of the first iteration, and the two-iteration Decoding Failure Rate (DFR).
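For context, a parallel bit-flipping decoder flips, in each iteration, every bit involved in more unsatisfied parity checks than a threshold. A minimal sketch (illustrative only; `parallel_bit_flip`, the threshold choice, and the Hamming-code example are assumptions, not the paper's (v,w)-regular construction):

```python
import numpy as np

def parallel_bit_flip(H, syndrome, threshold, max_iters=2):
    """Toy parallel bit-flipping decoder: flip every bit touched by
    more than `threshold` unsatisfied checks, then recompute the
    residual syndrome.  Returns (error estimate, success flag)."""
    e = np.zeros(H.shape[1], dtype=np.uint8)  # current error estimate
    s = syndrome.copy()
    for _ in range(max_iters):
        upc = H.T @ s                  # unsatisfied-check count per bit
        flips = (upc > threshold).astype(np.uint8)
        if not flips.any():
            break
        e ^= flips
        s = (syndrome + H @ e) % 2     # residual syndrome
    return e, not s.any()

# [7,4] Hamming parity-check matrix; a single error on the last bit
# (which sits in all three checks) is corrected in one iteration.
H = np.array([[0, 0, 0, 1, 1, 1, 1],
              [0, 1, 1, 0, 0, 1, 1],
              [1, 0, 1, 0, 1, 0, 1]], dtype=np.uint8)
e_hat, ok = parallel_bit_flip(H, H[:, 6].copy(), threshold=2)
```

The DFR the paper estimates is the probability that such a loop terminates with a nonzero residual syndrome (or the wrong error estimate).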
arXiv Detail & Related papers (2024-01-30T11:40:24Z) - Tensor Network Decoding Beyond 2D [2.048226951354646]
We introduce several techniques to generalize tensor network decoding to higher dimensions.
We numerically demonstrate that the decoding accuracy of our approach outperforms state-of-the-art decoders on the 3D surface code.
arXiv Detail & Related papers (2023-10-16T18:00:02Z) - Machine Learning-Aided Efficient Decoding of Reed-Muller Subcodes [59.55193427277134]
Reed-Muller (RM) codes achieve the capacity of general binary-input memoryless symmetric channels.
RM codes only admit limited sets of rates.
Efficient decoders are available for RM codes at finite lengths.
arXiv Detail & Related papers (2023-01-16T04:11:14Z) - Dense Coding with Locality Restriction for Decoder: Quantum Encoders vs. Super-Quantum Encoders [67.12391801199688]
We investigate dense coding by imposing various locality restrictions on our decoder.
In this task, the sender Alice and the receiver Bob share an entangled state.
arXiv Detail & Related papers (2021-09-26T07:29:54Z) - Local tensor-network codes [0.0]
We show how to write some topological codes, including the surface code and colour code, as simple tensor-network codes.
We prove that this method is efficient in the case of holographic codes.
arXiv Detail & Related papers (2021-09-24T14:38:06Z) - KO codes: Inventing Nonlinear Encoding and Decoding for Reliable Wireless Communication via Deep-learning [76.5589486928387]
Landmark codes underpin reliable physical layer communication, e.g., Reed-Muller, BCH, Convolution, Turbo, LDPC and Polar codes.
In this paper, we construct KO codes, a computationally efficient family of deep-learning driven (encoder, decoder) pairs.
KO codes beat state-of-the-art Reed-Muller and Polar codes, under the low-complexity successive cancellation decoding.
arXiv Detail & Related papers (2021-08-29T21:08:30Z) - Trellis Decoding For Qudit Stabilizer Codes And Its Application To Qubit Topological Codes [3.9962751777898955]
We show that trellis decoders have strong structure, extend the results using classical coding theory as a guide, and demonstrate a canonical form from which the structural properties of the decoding graph may be computed.
The modified decoder works for any stabilizer code $S$ and separates into two parts: a one-time, offline stage which builds a compact, graphical representation of the normalizer of the code, $S^\perp$, and a quick, parallel, online computation using the Viterbi algorithm.
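The online step is a standard minimum-cost Viterbi sweep over the trellis sections. A generic sketch (a plain shortest-path Viterbi, not the paper's qudit-stabilizer construction; the trellis encoding as edge lists is an assumption made here for illustration):

```python
def viterbi(sections):
    """Minimum-cost path through a trellis.  `sections` is a list of
    edge lists; each edge is (src_state, dst_state, label, cost).
    Returns (total_cost, labels) of the cheapest path starting from
    state 0 and ending in any surviving final state."""
    best = {0: (0.0, [])}  # state -> (cost so far, labels so far)
    for edges in sections:
        nxt = {}
        for src, dst, label, cost in edges:
            if src not in best:
                continue  # state unreachable at this section
            c = best[src][0] + cost
            if dst not in nxt or c < nxt[dst][0]:
                nxt[dst] = (c, best[src][1] + [label])
        best = nxt
    return min(best.values(), key=lambda t: t[0])

# Two-section toy trellis over states {0, 1}; cheapest path is 0->0->0.
sections = [
    [(0, 0, 0, 0.1), (0, 1, 1, 0.9)],
    [(0, 0, 0, 0.2), (1, 0, 1, 0.3), (0, 1, 1, 0.8), (1, 1, 0, 0.4)],
]
cost, labels = viterbi(sections)
```

Per the abstract, the paper's contribution is the structure of that trellis (built offline from $S^\perp$), with the sweep itself parallelisable across states.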
arXiv Detail & Related papers (2021-06-15T16:01:42Z) - Tensor-network codes [0.0]
We introduce tensor-network stabilizer codes which come with a natural tensor-network decoder.
We generalize holographic codes beyond those constructed from perfect or block-perfect isometries.
For holographic codes the exact tensor-network decoder is efficient, with a complexity that is polynomial in the number of physical qubits.
arXiv Detail & Related papers (2020-09-22T05:44:50Z) - Pruning Neural Belief Propagation Decoders [77.237958592189]
We introduce a method to tailor an overcomplete parity-check matrix to (neural) BP decoding using machine learning.
We achieve performance within 0.27 dB and 1.5 dB of the ML performance while reducing the complexity of the decoder.
arXiv Detail & Related papers (2020-01-21T12:05:46Z) - Deep Q-learning decoder for depolarizing noise on the toric code [0.0]
We present an AI-based decoding agent for quantum error correction of depolarizing noise on the toric code.
The agent is trained using deep reinforcement learning (DRL), where an artificial neural network encodes the state-action Q-values of error-correcting $X$, $Y$, and $Z$ Pauli operations.
We argue that the DRL-type decoder provides a promising framework for future practical error correction of topological codes.
arXiv Detail & Related papers (2019-12-30T13:27:07Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.