Refined Belief-Propagation Decoding of Quantum Codes with Scalar
Messages
- URL: http://arxiv.org/abs/2102.07122v1
- Date: Sun, 14 Feb 2021 10:29:58 GMT
- Title: Refined Belief-Propagation Decoding of Quantum Codes with Scalar
Messages
- Authors: Kao-Yueh Kuo and Ching-Yi Lai
- Abstract summary: Codes based on sparse matrices have good performance and can be efficiently decoded by belief propagation (BP).
BP decoding of stabilizer codes suffers a performance loss from the short cycles in the underlying Tanner graph.
We show that running BP with message normalization according to a serial schedule may significantly improve the decoding performance and lower the error floor in computer simulations.
- Score: 4.340338299803562
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Codes based on sparse matrices have good performance and can be efficiently
decoded by belief-propagation (BP). Decoding binary stabilizer codes needs a
quaternary BP for (additive) codes over GF(4), which has a higher check-node
complexity compared to a binary BP for codes over GF(2). Moreover, BP decoding
of stabilizer codes suffers a performance loss from the short cycles in the
underlying Tanner graph. In this paper, we propose a refined BP algorithm for
decoding quantum codes by passing scalar messages. For a given error syndrome,
this algorithm decodes to the same output as the conventional quaternary BP but
with a check-node complexity the same as binary BP. As every message is a
scalar, the message normalization can be naturally applied to improve the
performance. Another observation is that the message-update schedule affects
the BP decoding performance against short cycles. We show that running BP with
message normalization according to a serial schedule (or other schedules) may
significantly improve the decoding performance and lower the error floor in
computer simulations.
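The serial schedule and message normalization described above can be sketched in a minimal binary setting. The following is an illustrative sketch only, not the paper's quaternary-equivalent scalar-message algorithm: it runs normalized min-sum syndrome decoding over a binary check matrix, updating one check at a time (serial schedule) so each check immediately sees its neighbors' freshest messages, with `alpha` playing the role of the message normalization factor. All names and parameter values are assumptions for illustration.

```python
import numpy as np

def serial_bp_decode(H, syndrome, p=0.05, alpha=0.8, max_iter=30):
    """Normalized min-sum BP syndrome decoding with a serial schedule.

    H: binary parity-check matrix (m x n), syndrome: length-m binary vector.
    alpha is the message normalization factor; p is the assumed error rate.
    Illustrative sketch; not the paper's quaternary-equivalent algorithm.
    """
    m, n = H.shape
    llr0 = np.log((1 - p) / p)        # prior log-likelihood ratio per bit
    C = np.zeros((m, n))              # check-to-variable messages (scalars)
    V = np.zeros((m, n))              # variable-to-check messages (scalars)
    e_hat = np.zeros(n, dtype=int)
    for _ in range(max_iter):
        for i in range(m):            # serial schedule: one check at a time
            idx = np.flatnonzero(H[i])
            # variable-to-check messages using the latest C of other checks
            for j in idx:
                V[i, j] = llr0 + C[:, j].sum() - C[i, j]
            # min-sum check update with the syndrome sign, then normalize
            for j in idx:
                others = idx[idx != j]
                sign = (-1) ** syndrome[i] * np.prod(np.sign(V[i, others]))
                C[i, j] = alpha * sign * np.min(np.abs(V[i, others]))
        total = llr0 + C.sum(axis=0)  # posterior LLR per bit
        e_hat = (total < 0).astype(int)
        if np.array_equal(H @ e_hat % 2, syndrome):
            return e_hat, True        # estimated error matches the syndrome
    return e_hat, False
```

Because every message here is a single scalar LLR, scaling by `alpha` applies directly, which is the convenience the abstract attributes to scalar messages.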
Related papers
- Improved Belief Propagation Decoding Algorithms for Surface Codes [5.916355710767515]
Belief propagation (BP) is notable for its nearly linear time complexity.
BP's decoding accuracy without post-processing is unsatisfactory in most situations.
This article focuses on improving the decoding accuracy of BP over GF(4) for surface codes.
arXiv Detail & Related papers (2024-07-16T09:03:06Z)
- Belief Propagation Decoding of Quantum LDPC Codes with Guided Decimation [55.8930142490617]
We propose a decoder for QLDPC codes based on BP guided decimation (BPGD).
BPGD significantly reduces the BP failure rate due to non-convergence.
arXiv Detail & Related papers (2023-12-18T05:58:07Z)
- Graph Neural Networks for Enhanced Decoding of Quantum LDPC Codes [6.175503577352742]
We propose a differentiable iterative decoder for quantum low-density parity-check (LDPC) codes.
The proposed algorithm is composed of classical belief propagation (BP) decoding stages and intermediate graph neural network (GNN) layers.
arXiv Detail & Related papers (2023-10-26T19:56:25Z)
- Quaternary Neural Belief Propagation Decoding of Quantum LDPC Codes with Overcomplete Check Matrices [45.997444794696676]
Quantum low-density parity-check (QLDPC) codes are promising candidates for error correction in quantum computers.
One of the major challenges in implementing QLDPC codes in quantum computers is the lack of a universal decoder.
We first propose to decode QLDPC codes with a belief propagation (BP) decoder operating on overcomplete check matrices.
We extend the neural BP (NBP) decoder, which was originally studied for suboptimal binary BP decoding of QLDPC codes, to quaternary BP decoders.
arXiv Detail & Related papers (2023-08-16T08:24:06Z)
- Neural Belief Propagation Decoding of Quantum LDPC Codes Using Overcomplete Check Matrices [60.02503434201552]
We propose to decode QLDPC codes based on a check matrix with redundant rows, generated from linear combinations of the rows in the original check matrix.
This approach yields a significant improvement in decoding performance with the additional advantage of very low decoding latency.
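The redundant-row idea above can be sketched concretely. The snippet below is an assumption-laden illustration: it appends extra checks formed as GF(2) sums of row pairs of a binary check matrix. The new rows lie in the row space of the original matrix, so they impose no new constraints on the code, but they give BP additional message-passing paths; the function name and pairwise construction are illustrative choices, not the papers' exact procedure.

```python
import numpy as np
from itertools import combinations

def overcomplete_checks(H, max_extra=None):
    """Append redundant checks formed as GF(2) sums of row pairs of H.

    Illustrative sketch of an overcomplete check matrix: each added row
    is a linear combination of original rows, so it is satisfied by every
    codeword, but it adds edges (and message paths) to the Tanner graph.
    """
    extra = []
    for a, b in combinations(range(H.shape[0]), 2):
        row = (H[a] + H[b]) % 2       # GF(2) sum of two original checks
        if row.any():                  # skip the all-zero row
            extra.append(row)
        if max_extra is not None and len(extra) >= max_extra:
            break
    return np.vstack([H] + extra) if extra else H.copy()
```

The syndrome of each added check is the XOR of the corresponding original syndrome bits, so a decoder can use the extended matrix without any extra measurements.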
arXiv Detail & Related papers (2022-12-20T13:41:27Z)
- Graph Neural Networks for Channel Decoding [71.15576353630667]
We showcase competitive decoding performance for various coding schemes, such as low-density parity-check (LDPC) and BCH codes.
The idea is to let a neural network (NN) learn a generalized message passing algorithm over a given graph.
We benchmark our proposed decoder against state-of-the-art in conventional channel decoding as well as against recent deep learning-based results.
arXiv Detail & Related papers (2022-07-29T15:29:18Z)
- Log-domain decoding of quantum LDPC codes over binary finite fields [4.340338299803562]
We study the decoding of quantum low-density parity-check (LDPC) codes over binary finite fields GF$(q=2^l)$ by the sum-product algorithm, also known as belief propagation (BP).
We show that scalar messages suffice for BP decoding of nonbinary quantum codes, rather than vector messages necessary for the conventional BP.
arXiv Detail & Related papers (2021-04-01T07:15:41Z)
- On Sparsifying Encoder Outputs in Sequence-to-Sequence Models [90.58793284654692]
We take Transformer as the testbed and introduce a layer of gates in-between the encoder and the decoder.
The gates are regularized using the expected value of the sparsity-inducing $L_0$ penalty.
We investigate the effects of this sparsification on two machine translation and two summarization tasks.
arXiv Detail & Related papers (2020-04-24T16:57:52Z)
- Refined Belief Propagation Decoding of Sparse-Graph Quantum Codes [4.340338299803562]
We propose a refined BP decoding algorithm for quantum codes with complexity roughly the same as binary BP.
For a given error syndrome, this algorithm decodes to the same output as the conventional quaternary BP, but the passed node-to-node messages are single-valued.
Message strength normalization can naturally be applied to these single-valued messages to improve the performance.
arXiv Detail & Related papers (2020-02-16T03:51:59Z)
- Pruning Neural Belief Propagation Decoders [77.237958592189]
We introduce a method to tailor an overcomplete parity-check matrix to (neural) BP decoding using machine learning.
We achieve performance within 0.27 dB and 1.5 dB of the ML performance while reducing the complexity of the decoder.
arXiv Detail & Related papers (2020-01-21T12:05:46Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.