DeepPolar: Inventing Nonlinear Large-Kernel Polar Codes via Deep Learning
- URL: http://arxiv.org/abs/2402.08864v2
- Date: Wed, 5 Jun 2024 02:05:13 GMT
- Title: DeepPolar: Inventing Nonlinear Large-Kernel Polar Codes via Deep Learning
- Authors: S Ashwin Hebbar, Sravan Kumar Ankireddy, Hyeji Kim, Sewoong Oh, Pramod Viswanath
- Abstract summary: Polar codes have emerged as the state-of-the-art error-correction code for short-to-medium block length regimes.
DeepPolar codes extend the conventional Polar coding framework by utilizing a larger kernel size and parameterizing these kernels and matched decoders through neural networks.
Our results demonstrate that these data-driven codes effectively leverage the benefits of a larger kernel size, resulting in enhanced reliability when compared to both existing neural codes and conventional Polar codes.
- Score: 36.10365210143751
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Progress in designing channel codes has been driven by human ingenuity and, fittingly, has been sporadic. Polar codes, developed on the foundation of Arikan's polarization kernel, represent the latest breakthrough in coding theory and have emerged as the state-of-the-art error-correction code for short-to-medium block length regimes. In an effort to automate the invention of good channel codes, especially in this regime, we explore a novel, non-linear generalization of Polar codes, which we call DeepPolar codes. DeepPolar codes extend the conventional Polar coding framework by utilizing a larger kernel size and parameterizing these kernels and matched decoders through neural networks. Our results demonstrate that these data-driven codes effectively leverage the benefits of a larger kernel size, resulting in enhanced reliability when compared to both existing neural codes and conventional Polar codes.
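The abstract describes the core mechanism at a high level: replace Arikan's 2x2 kernel with a larger, learned nonlinear kernel and pair it with a matched neural decoder. As a purely illustrative aid (not the authors' implementation; the module names, layer sizes, interleaving, and normalization below are assumptions inferred only from the abstract), a minimal PyTorch sketch of such a large-kernel neural encoder might look like this:

```python
# Hypothetical sketch (not the authors' code): a DeepPolar-style encoder that
# replaces the 2x2 Arikan kernel with a learned nonlinear kernel of size `ell`,
# applied recursively over a length ell**depth block. Rate handling via frozen
# positions is omitted for brevity.
import torch
import torch.nn as nn


class NeuralKernel(nn.Module):
    """Maps ell real-valued inputs to ell coded outputs (a nonlinear kernel)."""

    def __init__(self, ell: int, hidden: int = 64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(ell, hidden), nn.SiLU(),
            nn.Linear(hidden, hidden), nn.SiLU(),
            nn.Linear(hidden, ell),
        )

    def forward(self, x):                      # x: (batch, ell)
        return self.net(x)


class DeepPolarStyleEncoder(nn.Module):
    """Applies a learned kernel recursively, mirroring the Plotkin/Arikan
    recursion of conventional polar codes but with kernel size ell > 2."""

    def __init__(self, ell: int, depth: int):
        super().__init__()
        self.ell = ell
        self.kernels = nn.ModuleList([NeuralKernel(ell) for _ in range(depth)])

    def forward(self, u):                      # u: (batch, ell**depth), +/-1 bits
        batch, n = u.shape
        x = u
        for kernel in self.kernels:
            # Transform each group of ell adjacent positions with the kernel.
            x = kernel(x.reshape(-1, self.ell)).reshape(batch, -1, self.ell)
            # Interleave so the next level mixes different groups (an assumption).
            x = x.transpose(1, 2).reshape(batch, n)
        # Normalize to satisfy an average transmit-power constraint.
        return (x - x.mean(dim=1, keepdim=True)) / (x.std(dim=1, keepdim=True) + 1e-8)


# Toy usage: ell = 4, depth = 3 gives block length 64.
encoder = DeepPolarStyleEncoder(ell=4, depth=3)
message = torch.randint(0, 2, (8, 64)).float() * 2 - 1   # random +/-1 message bits
codeword = encoder(message)
print(codeword.shape)                                     # torch.Size([8, 64])
```

In a full system of this kind, a matched neural decoder would be trained jointly with the encoder over a noisy channel model, for example by minimizing binary cross-entropy on the recovered message bits.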
Related papers
- Factor Graph Optimization of Error-Correcting Codes for Belief Propagation Decoding [62.25533750469467]
Low-Density Parity-Check (LDPC) codes possess several advantages over other families of codes.
The proposed approach is shown to outperform existing popular codes in decoding performance by orders of magnitude.
arXiv Detail & Related papers (2024-06-09T12:08:56Z) - Learning Linear Block Error Correction Codes [62.25533750469467]
We propose for the first time a unified encoder-decoder training of binary linear block codes.
We also propose a novel Transformer model in which the self-attention masking is performed in a differentiable fashion for the efficient backpropagation of the code gradient.
arXiv Detail & Related papers (2024-05-07T06:47:12Z) - Flexible polar encoding for information reconciliation in QKD [2.627883025193776]
Quantum Key Distribution (QKD) enables two parties to establish a common secret key that is information-theoretically secure.
Errors that are generally attributed to the adversary's tampering with the quantum channel need to be corrected using classical communication over a public channel.
We show that the reliability sequence can be derived and used to design an encoder independent of the choice of decoder; a minimal sketch of reliability-sequence-based polar encoding appears after this related-papers list.
arXiv Detail & Related papers (2023-11-30T16:01:10Z) - Optimizing Serially Concatenated Neural Codes with Classical Decoders [8.692972779213932]
We show that a classical decoding algorithm can be applied to a non-trivial, real-valued neural code.
As the BCJR algorithm is fully differentiable, it is possible to train, or fine-tune, the neural encoder in an end-to-end fashion.
arXiv Detail & Related papers (2022-12-20T15:40:08Z) - Neural Belief Propagation Decoding of Quantum LDPC Codes Using Overcomplete Check Matrices [60.02503434201552]
We propose to decode QLDPC codes based on a check matrix with redundant rows, generated from linear combinations of the rows in the original check matrix.
This approach yields a significant improvement in decoding performance with the additional advantage of very low decoding latency.
arXiv Detail & Related papers (2022-12-20T13:41:27Z) - CRISP: Curriculum based Sequential Neural Decoders for Polar Code Family [45.74928228858547]
We introduce CRISP, a novel CurRIculum-based Sequential neural decoder for Polar codes.
We show that CRISP attains near-optimal reliability performance on the Polar(32,16) and Polar(64,22) codes.
CRISP can be readily extended to Polarization-Adjusted-Convolutional (PAC) codes, where existing SC decoders are significantly less reliable.
arXiv Detail & Related papers (2022-10-01T16:26:24Z) - KO codes: Inventing Nonlinear Encoding and Decoding for Reliable Wireless Communication via Deep-learning [76.5589486928387]
Landmark codes underpin reliable physical layer communication, e.g., Reed-Muller, BCH, Convolutional, Turbo, LDPC and Polar codes.
In this paper, we construct KO codes, a computationally efficient family of deep-learning-driven (encoder, decoder) pairs.
KO codes beat state-of-the-art Reed-Muller and Polar codes under low-complexity successive cancellation decoding.
arXiv Detail & Related papers (2021-08-29T21:08:30Z)
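As a point of reference for the reliability-sequence remark in the QKD entry above, and for the conventional Polar baseline that DeepPolar and KO codes are compared against, below is a minimal NumPy sketch of standard polar encoding. It assumes a toy reliability ordering and the non-bit-reversed convention, and is not taken from any of the papers listed.

```python
# Minimal illustration of conventional polar encoding driven by a reliability
# sequence: information bits go on the K most reliable positions, the rest are
# frozen to 0, and the block is encoded with the n-fold Kronecker power of
# Arikan's 2x2 kernel F = [[1, 0], [1, 1]] over GF(2).
import numpy as np


def polar_encode(msg_bits, block_len, reliability):
    """Encode msg_bits into a Polar(block_len, len(msg_bits)) codeword.

    reliability: indices 0..block_len-1 ordered from most to least reliable
                 (e.g. from a density-evolution or 5G NR construction; the
                 toy ordering used below is an assumption).
    """
    k = len(msg_bits)
    u = np.zeros(block_len, dtype=np.uint8)
    info_positions = sorted(reliability[:k])      # K most reliable indices
    u[info_positions] = msg_bits                  # frozen positions stay 0

    # G_N = F^{kron n}: build by repeated Kronecker products.
    F = np.array([[1, 0], [1, 1]], dtype=np.uint8)
    G = np.array([[1]], dtype=np.uint8)
    while G.shape[0] < block_len:
        G = np.kron(G, F)
    return (u @ G) % 2                            # codeword over GF(2)


# Toy Polar(8, 4) example with an assumed (non-standardized) reliability order.
reliability = [7, 6, 5, 3, 4, 2, 1, 0]
msg = np.array([1, 0, 1, 1], dtype=np.uint8)
print(polar_encode(msg, block_len=8, reliability=reliability))
```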