Deep Extended Feedback Codes
- URL: http://arxiv.org/abs/2105.01365v1
- Date: Tue, 4 May 2021 08:41:14 GMT
- Title: Deep Extended Feedback Codes
- Authors: Anahid Robert Safavi, Alberto G. Perotti, Branislav M. Popovic, Mahdi Boloursaz Mashhadi, Deniz Gunduz
- Abstract summary: The encoder in the DEF architecture transmits an information message followed by a sequence of parity symbols.
DEF codes generalize Deepcode in several ways to provide better error correction capability.
Performance evaluations show that DEF codes outperform other DNN-based codes for channels with feedback.
- Score: 9.112162560071937
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: A new deep-neural-network (DNN) based error correction encoder architecture
for channels with feedback, called Deep Extended Feedback (DEF), is presented
in this paper. The encoder in the DEF architecture transmits an information
message followed by a sequence of parity symbols which are generated based on
the message as well as the observations of the past forward channel outputs
sent to the transmitter through a feedback channel. DEF codes generalize
Deepcode [1] in several ways: parity symbols are generated based on
forward-channel output observations over longer time intervals in order to
provide better error correction capability; and high-order modulation formats
are deployed in the encoder so as to achieve increased spectral efficiency.
Performance evaluations show that DEF codes outperform other DNN-based
codes for channels with feedback.
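To make the encoder structure concrete, here is a minimal sketch, assuming a GRU-based recurrence, a window of the last W feedback observations per step, and illustrative layer sizes; none of these choices are taken from the paper, and the module is not the authors' implementation.

```python
# A minimal sketch (not the authors' code) of a DEF-style encoder: parity
# symbols are produced by an RNN that sees the message bits together with a
# sliding window of past feedback observations, then mapped to the two real
# components of a higher-order constellation point. All names, sizes, and
# the window length are illustrative assumptions.
import torch
import torch.nn as nn

class DEFStyleEncoder(nn.Module):
    def __init__(self, msg_len=50, window=3, hidden=64):
        super().__init__()
        self.window = window
        # Input per step: the message bits plus the last `window` feedback samples.
        self.rnn = nn.GRU(msg_len + window, hidden, batch_first=True)
        # Map the RNN state to I/Q components of one parity symbol.
        self.to_symbol = nn.Linear(hidden, 2)

    def forward(self, message, feedback):
        # message:  (batch, msg_len)  information bits in {0, 1}
        # feedback: (batch, T)        past forward-channel outputs seen
        #                             via the feedback link
        batch, T = feedback.shape
        padded = torch.cat([feedback.new_zeros(batch, self.window), feedback], dim=1)
        # Causal sliding window of the last `window` feedback samples per step.
        fb_win = padded.unfold(1, self.window, 1)[:, :T, :]   # (batch, T, window)
        msg_rep = message.unsqueeze(1).expand(-1, T, -1)      # (batch, T, msg_len)
        h, _ = self.rnn(torch.cat([msg_rep, fb_win], dim=-1))
        parity = self.to_symbol(h)                            # (batch, T, 2)
        # Normalize to an average power of 1 per transmitted symbol.
        return parity / parity.pow(2).mean(dim=(1, 2), keepdim=True).sqrt()

enc = DEFStyleEncoder()
parity = enc(torch.randint(0, 2, (8, 50)).float(), torch.randn(8, 10))
print(parity.shape)  # torch.Size([8, 10, 2])
```

Note that at inference the feedback at step t only exists after the previous symbol has been sent, so generation is sequential; the sketch assumes the whole feedback trace is given, as in teacher-forced training.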
Related papers
- Factor Graph Optimization of Error-Correcting Codes for Belief Propagation Decoding [62.25533750469467]
Low-Density Parity-Check (LDPC) codes possess several advantages over other families of codes.
The proposed approach is shown to improve on the decoding performance of existing popular codes by orders of magnitude.
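For context, the decoder whose factor graph is being optimized is classical belief propagation; below is a compact min-sum BP sketch over a Tanner graph. The toy parity-check matrix, LLRs, and iteration count are illustrative only, not the paper's setup.

```python
# Min-sum belief-propagation decoding on the Tanner graph of a small code.
import numpy as np

def min_sum_decode(H, llr, iters=20):
    """H: (m, n) binary parity-check matrix; llr: (n,) channel LLRs,
    positive favoring bit 0. Returns hard bit decisions."""
    m, n = H.shape
    V = np.zeros((m, n))   # variable-to-check messages
    C = np.zeros((m, n))   # check-to-variable messages
    V[H == 1] = np.repeat(llr[None, :], m, axis=0)[H == 1]
    for _ in range(iters):
        # Check-node update: sign product times minimum magnitude,
        # excluding the message on the edge being updated (extrinsic rule).
        for i in range(m):
            idx = np.flatnonzero(H[i])
            msgs = V[i, idx]
            for k, j in enumerate(idx):
                others = np.delete(msgs, k)
                C[i, j] = np.prod(np.sign(others)) * np.min(np.abs(others))
        # Variable-node update and tentative hard decision.
        total = llr + C.sum(axis=0)
        for j in range(n):
            for i in np.flatnonzero(H[:, j]):
                V[i, j] = total[j] - C[i, j]
        hard = (total < 0).astype(int)
        if not np.any(H @ hard % 2):   # all parity checks satisfied
            break
    return hard

# (7,4) Hamming code as a toy example; a real LDPC H is sparse and large.
H = np.array([[1, 1, 0, 1, 1, 0, 0],
              [1, 0, 1, 1, 0, 1, 0],
              [0, 1, 1, 1, 0, 0, 1]])
noisy_llr = np.array([2.1, -1.3, 0.4, 1.7, -0.2, 1.1, 0.9])
print(min_sum_decode(H, noisy_llr))
```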
arXiv Detail & Related papers (2024-06-09T12:08:56Z)
- Learning Linear Block Error Correction Codes [62.25533750469467]
We propose for the first time a unified encoder-decoder training of binary linear block codes.
We also propose a novel Transformer model in which the self-attention masking is performed in a differentiable fashion for the efficient backpropagation of the code gradient.
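One plausible reading of "differentiable self-attention masking" is a soft mask that enters the softmax as a log-bias instead of a hard -inf mask, so gradients can flow through the masking decision. The sketch below illustrates that idea only; the shapes and the mask's origin are assumptions, not the paper's architecture.

```python
# Soft, differentiable attention masking: mask values in (0, 1] are added
# to the attention scores in log-space, so the mask stays in the gradient
# path instead of hard-zeroing entries.
import torch

def soft_masked_attention(q, k, v, soft_mask, eps=1e-9):
    # q, k, v: (T, d); soft_mask: (T, T) with entries in (0, 1]
    scores = q @ k.T / (q.shape[-1] ** 0.5)
    attn = torch.softmax(scores + torch.log(soft_mask + eps), dim=-1)
    return attn @ v

T, d = 6, 8
q, k, v = (torch.randn(T, d) for _ in range(3))
mask = torch.sigmoid(torch.randn(T, T, requires_grad=True))
out = soft_masked_attention(q, k, v, mask)
out.sum().backward()   # gradients flow back through the soft mask
```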
arXiv Detail & Related papers (2024-05-07T06:47:12Z)
- Joint Channel Estimation and Feedback with Masked Token Transformers in Massive MIMO Systems [74.52117784544758]
This paper proposes an encoder-decoder based network that unveils the intrinsic frequency-domain correlation within the CSI matrix.
The entire encoder-decoder network is utilized for channel compression.
Our method outperforms state-of-the-art channel estimation and feedback techniques in joint tasks.
arXiv Detail & Related papers (2023-06-08T06:15:17Z)
- Robust Non-Linear Feedback Coding via Power-Constrained Deep Learning [7.941112438865385]
We develop a new family of non-linear feedback codes that greatly enhance robustness to channel noise.
Our autoencoder-based architecture is designed to learn codes based on consecutive blocks of bits.
We show that our scheme outperforms state-of-the-art feedback codes by wide margins over practical forward and feedback noise regimes.
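A key ingredient of power-constrained autoencoder codes like this is a normalization layer that forces the learned symbols to meet the power budget. A minimal sketch follows, assuming the common per-batch-statistics variant; the paper's exact normalization may differ.

```python
# Power-constraint layer for an autoencoder-based feedback code: raw encoder
# outputs are rescaled so the block meets an average power budget P.
import torch

def power_normalize(x, P=1.0, eps=1e-8):
    """x: (batch, block_len) raw encoder outputs -> symbols whose average
    power per channel use is P."""
    scale = torch.sqrt(P / (x.pow(2).mean() + eps))
    return x * scale

symbols = power_normalize(torch.randn(32, 16) * 3.0)
print(symbols.pow(2).mean())  # ~1.0
```

Without such a layer the network can trade unbounded transmit power for robustness during training, so the constraint has to be enforced architecturally.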
arXiv Detail & Related papers (2023-04-25T22:21:26Z)
- Deep Joint Source-Channel Coding with Iterative Source Error Correction [11.41076729592696]
We propose an iterative source error correction (ISEC) decoding scheme for deep-learning-based joint source-channel coding (DeepJSCC).
Given a noisy codeword received through the channel, we use a DeepJSCC encoder and decoder pair to update the codeword iteratively.
The proposed scheme produces more reliable source reconstruction results compared to the baseline when the channel noise characteristics do not match the ones used during training.
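The iterative loop can be sketched as follows. The encoder and decoder here are untrained stand-ins for a trained DeepJSCC pair, and the blend factor is an assumption, so the sketch shows only the control flow of decode/re-encode refinement, not the paper's exact update rule.

```python
# Schematic decode/re-encode refinement loop in the spirit of ISEC: the
# received noisy codeword is repeatedly decoded, re-encoded, and blended
# back with the observation. Structural illustration only.
import torch
import torch.nn as nn

encoder = nn.Linear(8, 16)   # stand-in for a trained DeepJSCC encoder
decoder = nn.Linear(16, 8)   # stand-in for a trained DeepJSCC decoder

source = torch.randn(1, 8)
received = encoder(source) + 0.1 * torch.randn(1, 16)   # noisy codeword

y, alpha = received.clone(), 0.5
with torch.no_grad():
    for _ in range(10):
        estimate = decoder(y)                                    # decode current estimate
        y = alpha * encoder(estimate) + (1 - alpha) * received   # re-encode and blend
    reconstruction = decoder(y)
print(reconstruction.shape)  # torch.Size([1, 8])
```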
arXiv Detail & Related papers (2023-02-17T22:50:58Z)
- A Scalable Graph Neural Network Decoder for Short Block Codes [49.25571364253986]
We propose a novel decoding algorithm for short block codes based on an edge-weighted graph neural network (EW-GNN).
The EW-GNN decoder operates on the Tanner graph with an iterative message-passing structure.
We show that the EW-GNN decoder outperforms the BP and deep-learning-based BP methods in terms of the decoding error rate.
arXiv Detail & Related papers (2022-11-13T17:13:12Z)
- Feedback is Good, Active Feedback is Better: Block Attention Active Feedback Codes [13.766611137136168]
We show that Generalized Block Attention Feedback (GBAF) codes can also be used for channels with active feedback.
We implement a pair of transformer architectures, at the transmitter and the receiver, which interact with each other sequentially.
We achieve a new state-of-the-art BLER performance, especially in the low SNR regime.
arXiv Detail & Related papers (2022-11-03T11:44:06Z)
- Denoising Diffusion Error Correction Codes [92.10654749898927]
Recently, neural decoders have demonstrated their advantage over classical decoding techniques.
Recent state-of-the-art neural decoders suffer from high complexity and lack the important iterative scheme characteristic of many legacy decoders.
We propose to employ denoising diffusion models for the soft decoding of linear codes at arbitrary block lengths.
arXiv Detail & Related papers (2022-09-16T11:00:50Z)
- Graph Neural Networks for Channel Decoding [71.15576353630667]
We showcase competitive decoding performance for various coding schemes, such as low-density parity-check (LDPC) and BCH codes.
The idea is to let a neural network (NN) learn a generalized message passing algorithm over a given graph.
We benchmark our proposed decoder against state-of-the-art in conventional channel decoding as well as against recent deep learning-based results.
arXiv Detail & Related papers (2022-07-29T15:29:18Z)
- DRF Codes: Deep SNR-Robust Feedback Codes [2.6074034431152344]
We present a new deep-neural-network (DNN) based error correction code for fading channels with output feedback, called deep SNR-robust feedback (DRF) code.
We show that DRF codes significantly outperform the state of the art in terms of both SNR robustness and error rate in an additive white Gaussian noise (AWGN) channel with feedback.
arXiv Detail & Related papers (2021-12-22T10:47:25Z)