CRISP: Curriculum based Sequential Neural Decoders for Polar Code Family
- URL: http://arxiv.org/abs/2210.00313v3
- Date: Mon, 29 May 2023 11:55:11 GMT
- Title: CRISP: Curriculum based Sequential Neural Decoders for Polar Code Family
- Authors: S Ashwin Hebbar, Viraj Nadkarni, Ashok Vardhan Makkuva, Suma Bhat,
Sewoong Oh, Pramod Viswanath
- Abstract summary: We introduce a novel $\textbf{C}$ur$\textbf{RI}$culum based $\textbf{S}$equential neural decoder for $\textbf{P}$olar codes (CRISP).
We show that CRISP attains near-optimal reliability performance on the Polar(32,16) and Polar(64,22) codes.
CRISP can be readily extended to Polarization-Adjusted-Convolutional (PAC) codes, where existing SC decoders are significantly less reliable.
- Score: 45.74928228858547
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Polar codes are widely used state-of-the-art codes for reliable communication
that have recently been included in the 5th generation wireless standards (5G).
However, there remains room for the design of polar decoders that are both
efficient and reliable in the short blocklength regime. Motivated by recent
successes of data-driven channel decoders, we introduce a novel
$\textbf{C}$ur$\textbf{RI}$culum based $\textbf{S}$equential neural decoder for
$\textbf{P}$olar codes (CRISP). We design a principled curriculum, guided by
information-theoretic insights, to train CRISP and show that it outperforms the
successive-cancellation (SC) decoder and attains near-optimal reliability
performance on the Polar(32,16) and Polar(64,22) codes. The choice of the
proposed curriculum is critical in achieving the accuracy gains of CRISP, as we
show by comparing against other curricula. More notably, CRISP can be readily
extended to Polarization-Adjusted-Convolutional (PAC) codes, where existing SC
decoders are significantly less reliable. To the best of our knowledge, CRISP
constructs the first data-driven decoder for PAC codes and attains near-optimal
performance on the PAC(32,16) code.
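The curriculum idea above can be illustrated with a minimal sketch. One plausible reading of "curriculum based sequential" training is a left-to-right schedule in which the decoder is first trained to recover only the earliest information bits and the horizon grows stage by stage; the `curriculum_stages` helper, its `step` parameter, and the schedule below are illustrative assumptions, not taken from the paper.

```python
def curriculum_stages(k, step=4):
    """Yield the number of leading information bits the sequential decoder
    is trained to predict at each curriculum stage, ending with all k bits.

    Hypothetical schedule for illustration; the paper's actual curriculum
    is derived from information-theoretic considerations.
    """
    n_bits = step
    while n_bits < k:
        yield n_bits       # easier stage: decode only the first n_bits
        n_bits += step
    yield k                # final stage: train on the full code, e.g. k=16

# For a Polar(32,16)-style code with k=16 information bits:
stages = list(curriculum_stages(16, step=4))
# stages == [4, 8, 12, 16]
```

A training loop would iterate over these stages, masking the loss to the first `n_bits` positions before moving to the next stage.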
Related papers
- Factor Graph Optimization of Error-Correcting Codes for Belief Propagation Decoding [62.25533750469467]
Low-Density Parity-Check (LDPC) codes possess several advantages over other families of codes.
The proposed approach is shown to outperform existing popular codes in decoding performance by orders of magnitude.
arXiv Detail & Related papers (2024-06-09T12:08:56Z) - Learning Linear Block Error Correction Codes [62.25533750469467]
We propose for the first time a unified encoder-decoder training of binary linear block codes.
We also propose a novel Transformer model in which the self-attention masking is performed in a differentiable fashion for the efficient backpropagation of the code gradient.
arXiv Detail & Related papers (2024-05-07T06:47:12Z) - On Leveraging Encoder-only Pre-trained Language Models for Effective
Keyphrase Generation [76.52997424694767]
This study addresses the application of encoder-only Pre-trained Language Models (PLMs) in keyphrase generation (KPG)
With encoder-only PLMs, although KPE with Conditional Random Fields slightly excels in identifying present keyphrases, the KPG formulation renders a broader spectrum of keyphrase predictions.
We also identify a favorable parameter allocation towards model depth rather than width when employing encoder-decoder architectures with encoder-only PLMs.
arXiv Detail & Related papers (2024-02-21T18:57:54Z) - DeepPolar: Inventing Nonlinear Large-Kernel Polar Codes via Deep Learning [36.10365210143751]
Polar codes have emerged as the state-of-the-art error-correction code for short-to-medium block length regimes.
DeepPolar codes extend the conventional Polar coding framework by utilizing a larger kernel size and parameterizing these kernels and matched decoders through neural networks.
Our results demonstrate that these data-driven codes effectively leverage the benefits of a larger kernel size, resulting in enhanced reliability when compared to both existing neural codes and conventional Polar codes.
arXiv Detail & Related papers (2024-02-14T00:18:10Z) - Flexible polar encoding for information reconciliation in QKD [2.627883025193776]
Quantum Key Distribution (QKD) enables two parties to establish a common secret key that is information-theoretically secure.
Errors that are generally considered to be due to the adversary's tampering with the quantum channel need to be corrected using classical communication over a public channel.
We show that the reliability sequence can be derived and used to design an encoder independent of the choice of decoder.
arXiv Detail & Related papers (2023-11-30T16:01:10Z) - Graph Neural Networks for Channel Decoding [71.15576353630667]
We showcase competitive decoding performance for various coding schemes, such as low-density parity-check (LDPC) and BCH codes.
The idea is to let a neural network (NN) learn a generalized message passing algorithm over a given graph.
We benchmark our proposed decoder against state-of-the-art in conventional channel decoding as well as against recent deep learning-based results.
arXiv Detail & Related papers (2022-07-29T15:29:18Z) - Scalable Polar Code Construction for Successive Cancellation List
Decoding: A Graph Neural Network-Based Approach [11.146177972345138]
This paper first maps a polar code to a heterogeneous graph called the polar-code-construction message-passing graph.
Next, a graph-neural-network-based iterative message-passing algorithm is proposed which aims to find a PCCMP graph that corresponds to the polar code.
Numerical experiments show that IMP-based polar-code constructions outperform classical constructions under CA-SCL decoding.
arXiv Detail & Related papers (2022-07-03T19:27:43Z) - KO codes: Inventing Nonlinear Encoding and Decoding for Reliable
Wireless Communication via Deep-learning [76.5589486928387]
Landmark codes underpin reliable physical layer communication, e.g., Reed-Muller, BCH, Convolutional, Turbo, LDPC and Polar codes.
In this paper, we construct KO codes, a computationally efficient family of deep-learning driven (encoder, decoder) pairs.
KO codes beat state-of-the-art Reed-Muller and Polar codes, under the low-complexity successive cancellation decoding.
arXiv Detail & Related papers (2021-08-29T21:08:30Z) - Combining hard and soft decoders for hypergraph product codes [0.3326320568999944]
Hypergraph product codes are constant-rate quantum low-density parity-check (LDPC) codes equipped with a linear-time decoder called small-set-flip (SSF).
This decoder displays sub-optimal performance in practice and requires very large error correcting codes to be effective.
We present new hybrid decoders that combine the belief propagation (BP) algorithm with the SSF decoder.
arXiv Detail & Related papers (2020-04-23T14:48:05Z)
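Several of the papers above (the GNN channel decoder, the hybrid BP/SSF decoder) build on classical message passing over a code's factor graph. As background, a minimal sketch of the standard min-sum check-node update that such neural decoders learn to generalize or parameterize; the function name is ours, and this is a generic textbook rule, not any specific paper's method.

```python
def check_node_update(incoming):
    """Min-sum check-node update in belief-propagation decoding.

    `incoming` is a list of variable-to-check log-likelihood ratios (LLRs)
    on the edges of one check node. The outgoing message on each edge
    combines all *other* incoming messages: the product of their signs
    times the minimum of their magnitudes.
    """
    out = []
    for i in range(len(incoming)):
        others = incoming[:i] + incoming[i + 1:]
        sign = 1.0
        for m in others:
            sign *= 1.0 if m >= 0 else -1.0
        out.append(sign * min(abs(m) for m in others))
    return out

# One check node with three connected variables:
msgs = check_node_update([2.0, -0.5, 3.0])
# msgs == [-0.5, 2.0, -0.5]
```

A learned decoder typically replaces or reweights this fixed rule with trainable parameters while keeping the same graph structure.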
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information (including all information) and is not responsible for any consequences.