Nested Construction of Polar Codes via Transformers
- URL: http://arxiv.org/abs/2401.17188v1
- Date: Tue, 30 Jan 2024 17:17:43 GMT
- Title: Nested Construction of Polar Codes via Transformers
- Authors: Sravan Kumar Ankireddy, S Ashwin Hebbar, Heping Wan, Joonyoung Cho,
Charlie Zhang
- Abstract summary: We propose using a sequence modeling framework to iteratively construct a polar code for any given length and rate under various channel conditions.
Simulations show that polar codes designed via sequential modeling using transformers outperform both 5G-NR sequence and Density Evolution based approaches for both AWGN and Rayleigh fading channels.
- Score: 3.2841640957249285
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Tailoring polar code construction for decoding algorithms beyond successive
cancellation has remained a topic of significant interest in the field.
However, despite the inherent nested structure of polar codes, the use of
sequence models in polar code construction is understudied. In this work, we
propose using a sequence modeling framework to iteratively construct a polar
code for any given length and rate under various channel conditions.
Simulations show that polar codes designed via sequential modeling using
transformers outperform both 5G-NR sequence and Density Evolution based
approaches for both AWGN and Rayleigh fading channels.
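The nested structure the abstract builds on — a single reliability ordering of bit-channels from which any rate is obtained by taking a prefix — can be illustrated with the classical Bhattacharyya-parameter construction for the binary erasure channel. This is a textbook baseline, not the paper's transformer-based method; the function names and the erasure prior `z0 = 0.5` are illustrative choices:

```python
def bhattacharyya_ranking(n_log2: int, z0: float = 0.5) -> list:
    """Rank the N = 2**n_log2 synthesized bit-channels by reliability
    using the BEC Bhattacharyya recursion:
      Z(W-) = 2Z - Z^2  (degraded child), Z(W+) = Z^2  (upgraded child).
    Returns channel indices from most to least reliable."""
    z = [z0]
    for _ in range(n_log2):
        # each channel splits into a minus (worse) and plus (better) child
        z = [c for w in z for c in (2 * w - w * w, w * w)]
    # smaller Z means a more reliable bit-channel
    return sorted(range(len(z)), key=lambda i: z[i])

def information_set(seq: list, k: int) -> set:
    """Nested construction: a rate-k/N code takes the first k indices
    of the reliability sequence."""
    return set(seq[:k])

seq = bhattacharyya_ranking(3)            # N = 8
info_half = information_set(seq, 4)       # rate-1/2 information set
info_quarter = information_set(seq, 2)    # rate-1/4 set is nested inside it
assert info_quarter <= info_half
```

The nesting is what makes a single learned (or standardized, as in 5G-NR) sequence reusable across rates: lowering the rate only freezes additional bit-channels, never swaps one in.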
Related papers
- Towards Training Without Depth Limits: Batch Normalization Without
Gradient Explosion [83.90492831583997]
We show that a batch-normalized network can keep the optimal signal propagation properties, but avoid exploding gradients in depth.
We use a Multi-Layer Perceptron (MLP) with linear activations and batch normalization that provably has bounded gradients at any depth.
We also design an activation shaping scheme that empirically achieves the same properties for certain non-linear activations.
arXiv Detail & Related papers (2023-10-03T12:35:02Z) - Deep Polar Codes [19.265010348250897]
We introduce a novel class of pre-transformed polar codes, termed deep polar codes.
We first present a deep polar encoder that harnesses a series of multi-layered polar transformations with varying sizes.
Our encoding method offers flexibility in rate-profiling, embracing a wide range of code rates and blocklengths.
arXiv Detail & Related papers (2023-08-06T03:29:18Z) - Complexity Matters: Rethinking the Latent Space for Generative Modeling [65.64763873078114]
In generative modeling, numerous successful approaches leverage a low-dimensional latent space, e.g., Stable Diffusion.
In this study, we aim to shed light on this under-explored topic by rethinking the latent space from the perspective of model complexity.
arXiv Detail & Related papers (2023-07-17T07:12:29Z) - Denoising Diffusion Error Correction Codes [92.10654749898927]
Recently, neural decoders have demonstrated their advantage over classical decoding techniques.
However, state-of-the-art neural decoders suffer from high complexity and lack the iterative scheme characteristic of many legacy decoders.
We propose to employ denoising diffusion models for the soft decoding of linear codes at arbitrary block lengths.
arXiv Detail & Related papers (2022-09-16T11:00:50Z) - Scalable Polar Code Construction for Successive Cancellation List
Decoding: A Graph Neural Network-Based Approach [11.146177972345138]
This paper first maps a polar code to a heterogeneous graph called the polar-code-construction message-passing (PCCMP) graph.
Next, a graph-neural-network-based iterative message-passing (IMP) algorithm is proposed which aims to find a PCCMP graph that corresponds to the polar code.
Numerical experiments show that IMP-based polar-code constructions outperform classical constructions under CRC-aided successive-cancellation list (CA-SCL) decoding.
arXiv Detail & Related papers (2022-07-03T19:27:43Z) - Hybrid Neural Coded Modulation: Design and Training Methods [16.778378666167026]
The inner code is designed using a deep neural network (DNN) that takes the channel-coded bits and outputs modulated symbols.
The resulting constellations are shown to outperform the conventional quadrature amplitude modulation (QAM) based coding scheme for modulation order 16 and 64 with 5G standard LDPC codes.
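The conventional Gray-mapped square QAM baseline that the learned constellations are compared against can be sketched as follows. This is the standard textbook construction, unrelated to the paper's DNN encoder; the function name `square_qam` and the unnormalized ±1, ±3, ... levels are illustrative choices:

```python
import math

def square_qam(M: int) -> dict:
    """Gray-mapped square M-QAM (M a power of 4) with unnormalized
    amplitude levels ..., -3, -1, +1, +3, ...
    The upper half of each label's bits selects the in-phase level,
    the lower half the quadrature level."""
    L = math.isqrt(M)
    assert L * L == M and (L & (L - 1)) == 0, "M must be a power of 4"
    # Gray-order the levels so adjacent amplitudes differ in one bit
    gray = [i ^ (i >> 1) for i in range(L)]
    level = {g: 2 * i - (L - 1) for i, g in enumerate(gray)}
    k = L.bit_length() - 1            # bits per axis
    return {b: complex(level[b >> k], level[b & (L - 1)])
            for b in range(M)}

const16 = square_qam(16)              # modulation order 16, as in the paper
assert len(const16) == 16
```

Gray mapping ensures nearest-neighbor symbol errors flip only one bit, which is precisely the structure a learned constellation is free to abandon when jointly optimized with the outer LDPC code.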
arXiv Detail & Related papers (2022-02-04T05:04:15Z) - Do Generative Models Know Disentanglement? Contrastive Learning is All
You Need [59.033559925639075]
We propose an unsupervised and model-agnostic method: Disentanglement via Contrast (DisCo) in the Variation Space.
DisCo achieves the state-of-the-art disentanglement given pretrained non-disentangled generative models, including GAN, VAE, and Flow.
arXiv Detail & Related papers (2021-02-21T08:01:20Z) - Intermediate Layer Optimization for Inverse Problems using Deep
Generative Models [86.29330440222199]
ILO is a novel optimization algorithm for solving inverse problems with deep generative models.
We empirically show that our approach outperforms state-of-the-art methods introduced in StyleGAN-2 and PULSE for a wide range of inverse problems.
arXiv Detail & Related papers (2021-02-15T06:52:22Z) - Positional Encoding as Spatial Inductive Bias in GANs [97.6622154941448]
SinGAN shows impressive capability in learning internal patch distribution despite its limited effective receptive field.
In this work, we show that such capability, to a large extent, is brought by the implicit positional encoding when using zero padding in the generators.
We propose a new multi-scale training strategy and demonstrate its effectiveness in the state-of-the-art unconditional generator StyleGAN2.
arXiv Detail & Related papers (2020-12-09T18:27:16Z) - Construction of Polar Codes with Reinforcement Learning [13.977646909897796]
This paper formulates the polar-code construction problem for the successive-cancellation list (SCL) decoder as a maze-traversing game.
The proposed method provides a novel technique for polar-code construction that no longer depends on sorting and selecting bit-channels by reliability.
arXiv Detail & Related papers (2020-09-19T17:59:02Z) - Multilevel Polarization for Quantum Channels [5.607676459156789]
We consider the quantum polar code construction using the same channel combining and splitting procedure as in [1], but with a fixed two-qubit Clifford unitary.
For the family of Pauli channels, we show that polarization happens in multi-levels, where synthesized quantum virtual channels tend to become completely noisy, half-noisy, or noiseless.
We present a quantum polar code exploiting the multilevel nature of polarization, and provide an efficient decoding for this code.
arXiv Detail & Related papers (2020-06-22T22:33:03Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences arising from its use.