Neural Network Decoders for Permutation Codes Correcting Different Errors
- URL: http://arxiv.org/abs/2206.03315v1
- Date: Tue, 7 Jun 2022 14:02:32 GMT
- Title: Neural Network Decoders for Permutation Codes Correcting Different Errors
- Authors: Yeow Meng Chee, Hui Zhang
- Abstract summary: Permutation codes have been extensively studied to correct different types of errors in power line communication and in rank modulation for flash memory.
We introduce neural network decoders for permutation codes that correct these errors with one-shot decoding.
- Score: 20.70208671558469
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Permutation codes have been extensively studied to correct different
types of errors arising in power line communication and in rank modulation for
flash memory. In this paper, we introduce neural network decoders for
permutation codes that correct these errors with one-shot decoding, treating
decoding as $n$ classification tasks over non-binary symbols for a code of
length $n$. These are the first general decoders able to handle any error type
in these two applications. The performance of the decoders is evaluated by
simulations under different error models.
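Since the paper's architecture is not reproduced here, the following is only a hypothetical sketch of what "one-shot decoding as $n$ classification tasks" can look like: the received word of $n$ non-binary symbols is one-hot encoded, a network maps it to one row of logits per position, and an independent softmax/argmax per position yields all decoded symbols in a single forward pass. The function name, the single dense layer, and the identity weights below are illustrative stand-ins, not the paper's trained model.

```python
import numpy as np

def one_shot_decode(received, W, b):
    """Hypothetical one-shot decoder sketch for a permutation code of length n.

    Decoding is treated as n independent classification tasks: the network
    outputs an (n, n) logit matrix, one softmax over the n symbol values
    per code position. W and b stand in for any trained network mapping the
    one-hot input to position-wise logits.
    """
    n = len(received)
    x = np.zeros(n * n)
    for i, s in enumerate(received):
        x[i * n + s] = 1.0                   # one-hot encode each symbol
    logits = (W @ x + b).reshape(n, n)       # one row of logits per position
    # numerically stable softmax per position
    z = np.exp(logits - logits.max(axis=1, keepdims=True))
    probs = z / z.sum(axis=1, keepdims=True)
    return probs.argmax(axis=1)              # predicted symbol per position

# Identity "network" just to exercise the shapes: it copies the one-hot
# input through, so the decoder returns the received word unchanged.
n = 4
W = np.eye(n * n)
b = np.zeros(n * n)
word = [2, 0, 3, 1]                          # a permutation of {0, ..., 3}
decoded = one_shot_decode(word, W, b)
```

A trained network would replace the identity weights and could map a corrupted word back to the nearest valid codeword; the point of the sketch is only the shape of the output (one classification head per position) that makes the decoding one-shot rather than iterative.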
Related papers
- On the Design and Performance of Machine Learning Based Error Correcting Decoders [3.8289109929360245]
We first consider the so-called single-label neural network (SLNN) and multi-label neural network (MLNN) decoders, which have been reported to achieve near-maximum-likelihood (ML) performance.
We then turn our attention to two transformer-based decoders: the error correction code transformer (ECCT) and the cross-attention message passing transformer (CrossMPT).
arXiv Detail & Related papers (2024-10-21T11:23:23Z)
- Learning Linear Block Error Correction Codes [62.25533750469467]
We propose for the first time a unified encoder-decoder training of binary linear block codes.
We also propose a novel Transformer model in which the self-attention masking is performed in a differentiable fashion for the efficient backpropagation of the code gradient.
arXiv Detail & Related papers (2024-05-07T06:47:12Z)
- Testing the Accuracy of Surface Code Decoders [55.616364225463066]
Large-scale, fault-tolerant quantum computations will be enabled by quantum error-correcting codes (QECCs).
This work presents the first systematic technique to test the accuracy and effectiveness of different QECC decoding schemes.
arXiv Detail & Related papers (2023-11-21T10:22:08Z)
- Minimising surface-code failures using a color-code decoder [2.5499055723658097]
We propose a decoder for the surface code that finds low-weight correction operators for errors produced by the depolarising noise model.
The decoder is obtained by mapping the syndrome of the surface code onto that of the color code.
arXiv Detail & Related papers (2023-06-28T18:04:49Z)
- The END: An Equivariant Neural Decoder for Quantum Error Correction [73.4384623973809]
We introduce a data efficient neural decoder that exploits the symmetries of the problem.
We propose a novel equivariant architecture that achieves state-of-the-art accuracy compared to previous neural decoders.
arXiv Detail & Related papers (2023-04-14T19:46:39Z)
- A Scalable Graph Neural Network Decoder for Short Block Codes [49.25571364253986]
We propose a novel decoding algorithm for short block codes based on an edge-weighted graph neural network (EW-GNN).
The EW-GNN decoder operates on the Tanner graph with an iterative message-passing structure.
We show that the EW-GNN decoder outperforms the BP and deep-learning-based BP methods in terms of the decoding error rate.
arXiv Detail & Related papers (2022-11-13T17:13:12Z)
- Pre-Training Transformer Decoder for End-to-End ASR Model with Unpaired Speech Data [145.95460945321253]
We introduce two pre-training tasks for the encoder-decoder network using acoustic units, i.e., pseudo codes.
The proposed Speech2C reduces the word error rate (WER) by a relative 19.2% over the method without decoder pre-training.
arXiv Detail & Related papers (2022-03-31T15:33:56Z)
- Adversarial Neural Networks for Error Correcting Codes [76.70040964453638]
We introduce a general framework to boost the performance and applicability of machine learning (ML) models.
We propose to combine ML decoders with a competing discriminator network that tries to distinguish between codewords and noisy words.
Our framework is game-theoretic, motivated by generative adversarial networks (GANs).
arXiv Detail & Related papers (2021-12-21T19:14:44Z)
- FAID Diversity via Neural Networks [23.394836086114413]
We propose a new approach to design the decoder diversity of finite alphabet iterative decoders (FAIDs) for Low-Density Parity Check (LDPC) codes.
The proposed decoder diversity is achieved by training a recurrent quantized neural network (RQNN) to learn/design FAIDs.
arXiv Detail & Related papers (2021-05-10T05:14:42Z)
- perm2vec: Graph Permutation Selection for Decoding of Error Correction Codes using Self-Attention [19.879263834757758]
We present a data-driven framework for permutation selection, combining domain knowledge with machine learning concepts.
This work is the first to leverage the benefits of the neural Transformer networks in physical layer communication systems.
arXiv Detail & Related papers (2020-02-06T15:42:08Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.