Graph Neural Networks for Channel Decoding
- URL: http://arxiv.org/abs/2207.14742v1
- Date: Fri, 29 Jul 2022 15:29:18 GMT
- Title: Graph Neural Networks for Channel Decoding
- Authors: Sebastian Cammerer, Jakob Hoydis, Fayçal Aït Aoudia, and Alexander Keller
- Abstract summary: We showcase competitive decoding performance for various coding schemes, such as low-density parity-check (LDPC) and BCH codes.
The idea is to let a neural network (NN) learn a generalized message passing algorithm over a given graph.
We benchmark our proposed decoder against the state of the art in conventional channel decoding as well as against recent deep learning-based results.
- Score: 71.15576353630667
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: In this work, we propose a fully differentiable graph neural network
(GNN)-based architecture for channel decoding and showcase competitive decoding
performance for various coding schemes, such as low-density parity-check (LDPC)
and BCH codes. The idea is to let a neural network (NN) learn a generalized
message passing algorithm over a given graph that represents the forward error
correction (FEC) code structure by replacing node and edge message updates with
trainable functions. Contrary to many other deep learning-based decoding
approaches, the proposed solution enjoys scalability to arbitrary block lengths
and the training is not limited by the curse of dimensionality. We benchmark
our proposed decoder against the state of the art in conventional channel decoding
as well as against recent deep learning-based results. For the (63,45) BCH
code, our solution outperforms weighted belief propagation (BP) decoding by
approximately 0.4 dB with significantly fewer decoding iterations, and even for
5G NR LDPC codes we observe competitive performance compared to
conventional BP decoding. For the BCH codes, the resulting GNN decoder can be
fully parametrized with only 9640 weights.
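To make the core idea concrete, below is a minimal sketch, in PyTorch, of the kind of trainable message passing the abstract describes: the node and edge updates of a Tanner-graph decoder are replaced by small MLPs. The class name GNNDecoderSketch, the MLP sizes, and the simple sum aggregation are illustrative assumptions, not the authors' published architecture.

```python
import torch
import torch.nn as nn


class GNNDecoderSketch(nn.Module):
    """Learned message passing over a Tanner graph (illustrative only)."""

    def __init__(self, hidden_dim: int = 16, num_iterations: int = 8):
        super().__init__()
        self.num_iterations = num_iterations
        # Trainable edge update: maps [variable-node state, last incoming message] to a new message
        self.edge_mlp = nn.Sequential(
            nn.Linear(2, hidden_dim), nn.ReLU(), nn.Linear(hidden_dim, 1)
        )
        # Trainable node update: combines the channel LLR with aggregated check messages
        self.node_mlp = nn.Sequential(
            nn.Linear(2, hidden_dim), nn.ReLU(), nn.Linear(hidden_dim, 1)
        )

    def forward(self, llr: torch.Tensor, edge_index: torch.Tensor) -> torch.Tensor:
        # llr: (n,) channel LLRs, one per variable node (code bit)
        # edge_index: (2, E) Tanner-graph edges as [variable_node; check_node]
        v_idx, c_idx = edge_index
        num_checks = int(c_idx.max()) + 1
        v_state = llr.clone()                     # variable-node states
        m_cv = torch.zeros(edge_index.shape[1])   # check-to-variable messages

        for _ in range(self.num_iterations):
            # Learned edge update: variable-to-check message on every edge
            edge_in = torch.stack([v_state[v_idx], m_cv], dim=-1)
            m_vc = self.edge_mlp(edge_in).squeeze(-1)

            # Aggregate at check nodes and pass extrinsic information back
            c_agg = torch.zeros(num_checks).index_add_(0, c_idx, m_vc)
            m_cv = c_agg[c_idx] - m_vc

            # Learned node update from channel LLR and aggregated check messages
            v_agg = torch.zeros_like(llr).index_add_(0, v_idx, m_cv)
            v_state = self.node_mlp(torch.stack([llr, v_agg], dim=-1)).squeeze(-1)

        return v_state  # soft bit estimates; threshold for hard decisions
```

A toy call might look like `GNNDecoderSketch()(torch.randn(7), torch.tensor([[0, 1, 2, 2, 3, 4], [0, 0, 0, 1, 1, 1]]))`; training would then minimize a bitwise binary cross-entropy between the (sigmoided) outputs and the transmitted bits.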
Related papers
- Decoding Quantum LDPC Codes Using Graph Neural Networks [52.19575718707659]
We propose a novel decoding method for Quantum Low-Density Parity-Check (QLDPC) codes based on Graph Neural Networks (GNNs).
The proposed GNN-based QLDPC decoder exploits the sparse graph structure of QLDPC codes and can be implemented as a message-passing decoding algorithm.
arXiv Detail & Related papers (2024-08-09T16:47:49Z)
- Learning Linear Block Error Correction Codes [62.25533750469467]
We propose for the first time a unified encoder-decoder training of binary linear block codes.
We also propose a novel Transformer model in which the self-attention masking is performed in a differentiable fashion for the efficient backpropagation of the code gradient.
arXiv Detail & Related papers (2024-05-07T06:47:12Z)
- Graph Neural Networks for Enhanced Decoding of Quantum LDPC Codes [6.175503577352742]
We propose a differentiable iterative decoder for quantum low-density parity-check (LDPC) codes.
The proposed algorithm is composed of classical belief propagation (BP) decoding stages and intermediate graph neural network (GNN) layers.
arXiv Detail & Related papers (2023-10-26T19:56:25Z)
- A Scalable Graph Neural Network Decoder for Short Block Codes [49.25571364253986]
We propose a novel decoding algorithm for short block codes based on an edge-weighted graph neural network (EW-GNN).
The EW-GNN decoder operates on the Tanner graph with an iterative message-passing structure.
We show that the EW-GNN decoder outperforms the BP and deep-learning-based BP methods in terms of the decoding error rate.
arXiv Detail & Related papers (2022-11-13T17:13:12Z)
- Adversarial Neural Networks for Error Correcting Codes [76.70040964453638]
We introduce a general framework to boost the performance and applicability of machine learning (ML) models.
We propose to combine ML decoders with a competing discriminator network that tries to distinguish between codewords and noisy words.
Our framework is game-theoretic, motivated by generative adversarial networks (GANs); a minimal sketch of this adversarial training idea is given after this list.
arXiv Detail & Related papers (2021-12-21T19:14:44Z)
- Cyclically Equivariant Neural Decoders for Cyclic Codes [33.63188063525036]
We propose a novel neural decoder for cyclic codes by exploiting their cyclically invariant property.
Our new decoder consistently outperforms previous neural decoders when decoding cyclic codes.
Finally, we propose a list decoding procedure that can significantly reduce the decoding error probability for BCH codes and punctured RM codes.
arXiv Detail & Related papers (2021-05-12T09:41:13Z)
- Decoding 5G-NR Communications via Deep Learning [6.09170287691728]
We propose to use Autoencoding Neural Networks (ANN) jointly with a Deep Neural Network (DNN) to construct Autoencoding Deep Neural Networks (ADNN) for demapping and decoding.
Results show that, for a given BER target, 3 dB less Signal-to-Noise Ratio (SNR) is required in Additive White Gaussian Noise (AWGN) channels.
arXiv Detail & Related papers (2020-07-15T12:00:20Z)
- Infomax Neural Joint Source-Channel Coding via Adversarial Bit Flip [41.28049430114734]
We propose a novel regularization method called Infomax Adversarial-Bit-Flip (IABF) to improve the stability and robustness of the neural joint source-channel coding scheme.
Our IABF can achieve state-of-the-art performances on both compression and error correction benchmarks and outperform the baselines by a significant margin.
arXiv Detail & Related papers (2020-04-03T10:00:02Z)
- Pruning Neural Belief Propagation Decoders [77.237958592189]
We introduce a method to tailor an overcomplete parity-check matrix to (neural) BP decoding using machine learning.
We achieve performance within 0.27 dB and 1.5 dB of maximum-likelihood (ML) performance while reducing the complexity of the decoder.
arXiv Detail & Related papers (2020-01-21T12:05:46Z)
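As noted in the adversarial-decoding entry above, here is a minimal, hedged sketch of the GAN-style idea of training a decoder against a discriminator that tries to distinguish true codewords from (still noisy) decoder outputs. The layer sizes, the 0.1 adversarial weight, and the names discriminator_loss / decoder_loss are illustrative assumptions, not that paper's exact formulation.

```python
import torch
import torch.nn as nn

n = 63  # illustrative block length

# Discriminator: scores how "codeword-like" a length-n soft word looks
discriminator = nn.Sequential(nn.Linear(n, 64), nn.ReLU(), nn.Linear(64, 1))
bce = nn.BCEWithLogitsLoss()


def discriminator_loss(codewords: torch.Tensor, decoder_logits: torch.Tensor) -> torch.Tensor:
    # True codewords should be scored as real (1), decoder outputs as fake (0)
    batch = codewords.shape[0]
    real = bce(discriminator(codewords), torch.ones(batch, 1))
    fake = bce(discriminator(torch.sigmoid(decoder_logits).detach()),
               torch.zeros(batch, 1))
    return real + fake


def decoder_loss(decoder_logits: torch.Tensor, tx_bits: torch.Tensor) -> torch.Tensor:
    # Reconstruction term plus an adversarial term that rewards fooling the discriminator
    batch = decoder_logits.shape[0]
    rec = bce(decoder_logits, tx_bits)
    adv = bce(discriminator(torch.sigmoid(decoder_logits)), torch.ones(batch, 1))
    return rec + 0.1 * adv
```

In an alternating loop one would step the discriminator on discriminator_loss and the decoder on decoder_loss, in the usual GAN fashion.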
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences.