Refined Belief Propagation Decoding of Sparse-Graph Quantum Codes
- URL: http://arxiv.org/abs/2002.06502v2
- Date: Wed, 22 Jul 2020 10:25:26 GMT
- Title: Refined Belief Propagation Decoding of Sparse-Graph Quantum Codes
- Authors: Kao-Yueh Kuo and Ching-Yi Lai
- Abstract summary: We propose a refined BP decoding algorithm for quantum codes with complexity roughly the same as binary BP.
For a given error syndrome, this algorithm decodes to the same output as conventional quaternary BP, but the messages passed between nodes are single-valued.
Message strength normalization can naturally be applied to these single-valued messages to improve the performance.
- Score: 4.340338299803562
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Quantum stabilizer codes constructed from sparse matrices have good
performance and can be efficiently decoded by belief propagation (BP). A
conventional BP decoding algorithm treats binary stabilizer codes as additive
codes over GF(4). This algorithm has a relatively complex process of handling
check-node messages, which incurs higher decoding complexity. Moreover, BP
decoding of a stabilizer code usually suffers a performance loss due to the
many short cycles in the underlying Tanner graph. In this paper, we propose a
refined BP decoding algorithm for quantum codes with complexity roughly the
same as binary BP. For a given error syndrome, this algorithm decodes to the
same output as the conventional quaternary BP, but the messages passed between
nodes are single-valued, unlike in quaternary BP, where multivalued messages
are required. Furthermore, the technique of message
strength normalization can naturally be applied to these single-valued messages
to improve the performance. Another observation is that the message-update
schedule affects the performance of BP decoding against short cycles. We show
that running BP with message strength normalization according to a serial
schedule (or other schedules) may significantly improve the decoding
performance and lower the error floor in computer simulations.
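The abstract bundles three algorithmic ideas: syndrome-based BP with scalar (single-valued) messages, message strength normalization, and a serial update schedule. The Python sketch below is our own rough illustration of how the three fit together, not the authors' algorithm: it works in the simpler binary (GF(2)) syndrome-decoding setting, and the function name `syndrome_bp_decode`, the normalization factor `alpha`, and the toy Hamming-code usage are all assumptions, whereas the paper achieves the same scalar-message structure for quaternary (GF(4)) stabilizer codes.

```python
import numpy as np

def syndrome_bp_decode(H, syndrome, p=0.05, alpha=0.8, max_iter=50):
    """Serial-schedule syndrome BP with normalized check-to-variable messages.

    H        : (m, n) binary parity-check matrix
    syndrome : length-m binary syndrome vector
    p        : assumed i.i.d. bit-flip probability (channel prior)
    alpha    : message strength normalization factor (alpha < 1 damps
               messages made overconfident by short cycles)
    """
    m, n = H.shape
    prior = np.log((1 - p) / p)              # prior LLR of "no error"
    msg_cv = np.zeros((m, n))                # scalar check-to-variable messages
    checks = [np.flatnonzero(H[i]) for i in range(m)]
    e_hat = np.zeros(n, dtype=int)
    for _ in range(max_iter):
        # Serial schedule: update one check at a time, so each check sees
        # the messages just refreshed by the checks before it.
        for i in range(m):
            vs = checks[i]
            # extrinsic variable-to-check LLRs (exclude check i's own message)
            lam = prior + msg_cv[:, vs].sum(axis=0) - msg_cv[i, vs]
            t = np.tanh(np.clip(lam, -30.0, 30.0) / 2)
            sign = -1.0 if syndrome[i] else 1.0   # syndrome bit flips parity
            for k, v in enumerate(vs):
                rest = sign * np.prod(np.delete(t, k))
                raw = 2 * np.arctanh(np.clip(rest, -0.999999, 0.999999))
                msg_cv[i, v] = alpha * raw        # message normalization
        # hard decision and early stop once the syndrome is matched
        e_hat = ((prior + msg_cv.sum(axis=0)) < 0).astype(int)
        if np.array_equal((H @ e_hat) % 2, np.asarray(syndrome)):
            return e_hat, True
    return e_hat, False

# Toy usage: parity checks of the [7,4] Hamming code and a weight-1 error.
H = np.array([[1, 1, 0, 1, 1, 0, 0],
              [1, 0, 1, 1, 0, 1, 0],
              [0, 1, 1, 1, 0, 0, 1]])
e = np.array([0, 0, 0, 1, 0, 0, 0])
e_hat, ok = syndrome_bp_decode(H, (H @ e) % 2)   # ok: syndrome matched
```

Setting `alpha = 1` recovers the plain sum-product update; values below one damp the overconfidence that short cycles in the Tanner graph induce, which is exactly the failure mode the abstract targets.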
Related papers
- An almost-linear time decoding algorithm for quantum LDPC codes under circuit-level noise [0.562479170374811]
We introduce the belief propagation plus ordered Tanner forest (BP+OTF) algorithm as an almost-linear time decoder for quantum low-density parity-check codes.
We show that the BP+OTF decoder achieves logical error suppression within an order of magnitude of state-of-the-art inversion-based decoders.
arXiv Detail & Related papers (2024-09-02T19:50:57Z)
- Improved Belief Propagation Decoding Algorithms for Surface Codes [5.916355710767515]
Belief propagation (BP) is notable for its nearly linear time complexity.
BP's decoding accuracy without post-processing is unsatisfactory in most situations.
This article focuses on improving the decoding accuracy of BP over GF(4) for surface codes.
arXiv Detail & Related papers (2024-07-16T09:03:06Z)
- Belief Propagation Decoding of Quantum LDPC Codes with Guided Decimation [55.8930142490617]
We propose a decoder for QLDPC codes based on BP guided decimation (BPGD).
BPGD significantly reduces the BP failure rate due to non-convergence (a decimation-loop sketch follows this entry).
arXiv Detail & Related papers (2023-12-18T05:58:07Z)
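The guided-decimation loop named above admits a compact description: run BP, and whenever it fails to converge, freeze ("decimate") the variable with the most confident posterior and rerun. The sketch below is our reading of that summary, not the cited paper's exact procedure; `bp_run`, `n`, and `max_decimations` are hypothetical names for a generic syndrome-BP routine and its parameters.

```python
def bp_guided_decimation(bp_run, n, max_decimations=10):
    """Sketch of BP guided decimation (BPGD) around a generic BP routine.

    bp_run(frozen) -> (llr, converged): any syndrome-BP routine that accepts
    a dict {variable: fixed value} of decimated variables and returns
    posterior LLRs plus a convergence flag.
    """
    frozen = {}
    converged = False
    for _ in range(max_decimations):
        llr, converged = bp_run(frozen)
        if converged:
            break
        free = [v for v in range(n) if v not in frozen]
        if not free:
            break
        # Decimate: permanently fix the most reliable unfrozen variable.
        v_star = max(free, key=lambda v: abs(llr[v]))
        frozen[v_star] = int(llr[v_star] < 0)   # 1 = error on that variable
    return frozen, converged
```

Freezing one variable per round localizes the non-convergence failure mode at the cost of up to `max_decimations` additional BP runs.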
- Graph Neural Networks for Enhanced Decoding of Quantum LDPC Codes [6.175503577352742]
We propose a differentiable iterative decoder for quantum low-density parity-check (LDPC) codes.
The proposed algorithm is composed of classical belief propagation (BP) decoding stages and intermediate graph neural network (GNN) layers.
arXiv Detail & Related papers (2023-10-26T19:56:25Z)
- Neural Belief Propagation Decoding of Quantum LDPC Codes Using Overcomplete Check Matrices [60.02503434201552]
We propose to decode QLDPC codes based on a check matrix with redundant rows, generated from linear combinations of the rows in the original check matrix.
This approach yields a significant improvement in decoding performance with the additional advantage of very low decoding latency (a small construction sketch follows this entry).
arXiv Detail & Related papers (2022-12-20T13:41:27Z)
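The redundant-row idea above is simple to reproduce: any GF(2) linear combination of rows of a check matrix is itself a valid check, so appending such combinations enlarges the Tanner graph without changing the code. A minimal sketch under that assumption (the cited paper selects rows more deliberately than the random pairing used here):

```python
import numpy as np

rng = np.random.default_rng(0)

def overcomplete_checks(H, extra_rows):
    """Append `extra_rows` GF(2) sums of random row pairs of H.

    Every appended row annihilates the same code space as the originals,
    so BP gains extra message paths at no cost in code structure.
    """
    m, _ = H.shape
    new_rows = []
    for _ in range(extra_rows):
        i, j = rng.choice(m, size=2, replace=False)
        new_rows.append((H[i] + H[j]) % 2)    # GF(2) combination of checks
    return np.vstack([H, np.array(new_rows)])
```

The decoder then runs (neural) BP on the taller matrix; the syndrome bit for a redundant row is the same GF(2) combination of the original syndrome bits.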
- A Scalable Graph Neural Network Decoder for Short Block Codes [49.25571364253986]
We propose a novel decoding algorithm for short block codes based on an edge-weighted graph neural network (EW-GNN).
The EW-GNN decoder operates on the Tanner graph with an iterative message-passing structure.
We show that the EW-GNN decoder outperforms the BP and deep-learning-based BP methods in terms of the decoding error rate.
arXiv Detail & Related papers (2022-11-13T17:13:12Z)
- Graph Neural Networks for Channel Decoding [71.15576353630667]
We showcase competitive decoding performance for various coding schemes, such as low-density parity-check (LDPC) and BCH codes.
The idea is to let a neural network (NN) learn a generalized message passing algorithm over a given graph.
We benchmark our proposed decoder against state-of-the-art in conventional channel decoding as well as against recent deep learning-based results.
arXiv Detail & Related papers (2022-07-29T15:29:18Z)
- Log-domain decoding of quantum LDPC codes over binary finite fields [4.340338299803562]
We study the decoding of quantum low-density parity-check (LDPC) codes over binary finite fields GF$(q=2^l)$ by the sum-product algorithm, also known as belief propagation (BP).
We show that scalar messages suffice for BP decoding of nonbinary quantum codes, rather than the vector messages required by conventional BP.
arXiv Detail & Related papers (2021-04-01T07:15:41Z)
- Refined Belief-Propagation Decoding of Quantum Codes with Scalar Messages [4.340338299803562]
Codes based on sparse matrices have good performance and can be efficiently decoded by belief propagation (BP).
BP decoding of stabilizer codes suffers a performance loss from the short cycles in the underlying Tanner graph.
We show that running BP with message normalization according to a serial schedule may significantly improve the decoding performance and lower the error floor in computer simulations.
arXiv Detail & Related papers (2021-02-14T10:29:58Z)
- On Sparsifying Encoder Outputs in Sequence-to-Sequence Models [90.58793284654692]
We take the Transformer as the testbed and introduce a layer of gates between the encoder and the decoder.
The gates are regularized using the expected value of the sparsity-inducing $L_0$ penalty.
We investigate the effects of this sparsification on two machine translation and two summarization tasks.
arXiv Detail & Related papers (2020-04-24T16:57:52Z)
- Pruning Neural Belief Propagation Decoders [77.237958592189]
We introduce a method to tailor an overcomplete parity-check matrix to (neural) BP decoding using machine learning.
We achieve performance within 0.27 dB and 1.5 dB of maximum-likelihood (ML) performance while reducing the complexity of the decoder.
arXiv Detail & Related papers (2020-01-21T12:05:46Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the generated content (including all information) and is not responsible for any consequences of its use.