Deep Joint Source-Channel Coding with Iterative Source Error Correction
- URL: http://arxiv.org/abs/2302.09174v1
- Date: Fri, 17 Feb 2023 22:50:58 GMT
- Title: Deep Joint Source-Channel Coding with Iterative Source Error Correction
- Authors: Changwoo Lee, Xiao Hu, Hun-Seok Kim
- Abstract summary: We propose an iterative source error correction (ISEC) decoding scheme for deep-learning-based joint source-channel coding (Deep JSCC).
Given a noisy codeword received through the channel, we use a Deep JSCC encoder and decoder pair to update the codeword iteratively.
The proposed scheme produces more reliable source reconstruction results compared to the baseline when the channel noise characteristics do not match the ones used during training.
- Score: 11.41076729592696
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: In this paper, we propose an iterative source error correction (ISEC)
decoding scheme for deep-learning-based joint source-channel coding (Deep
JSCC). Given a noisy codeword received through the channel, we use a Deep JSCC
encoder and decoder pair to update the codeword iteratively to find a
(modified) maximum a-posteriori (MAP) solution. For efficient MAP decoding, we
utilize a neural network-based denoiser to approximate the gradient of the
log-prior density of the codeword space. Despite the non-convexity of the
optimization problem, our proposed scheme improves various distortion and
perceptual quality metrics over the conventional one-shot (non-iterative) Deep
JSCC decoding baseline. Furthermore, the proposed scheme produces more reliable
source reconstruction results compared to the baseline when the channel noise
characteristics do not match the ones used during training.
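The following is a minimal sketch of the iterative MAP-style decoding loop described in the abstract, assuming an AWGN channel and pre-trained Deep JSCC decoder and denoiser networks. The function name isec_decode, all hyperparameters, and the specific update rule (gradient ascent on a Gaussian log-likelihood term plus a denoiser-residual approximation of the log-prior gradient) are illustrative assumptions, not the authors' released implementation.
```python
# Hedged sketch of ISEC-style iterative decoding; `decoder` and `denoiser`
# are assumed to be pre-trained PyTorch modules operating on codeword tensors.
import torch

def isec_decode(y, decoder, denoiser, channel_sigma, denoiser_sigma,
                num_iters=20, step_size=0.1):
    """Iteratively refine the received codeword y toward a (modified) MAP estimate.

    y              : received noisy codeword (tensor)
    decoder        : Deep JSCC decoder mapping a codeword to a source reconstruction
    denoiser       : network whose residual approximates the gradient of the
                     log-prior density of the codeword space
    channel_sigma  : assumed channel noise standard deviation
    denoiser_sigma : noise level the denoiser was trained for (illustrative)
    """
    z = y.clone()  # initialize the estimate at the received codeword
    for _ in range(num_iters):
        # Gradient of the Gaussian log-likelihood term log p(y | z) under AWGN
        grad_likelihood = (y - z) / (channel_sigma ** 2)
        # Denoiser-residual approximation of the log-prior gradient (score estimate)
        with torch.no_grad():
            grad_prior = (denoiser(z) - z) / (denoiser_sigma ** 2)
        # Gradient-ascent step on the (modified) MAP objective
        z = z + step_size * (grad_likelihood + grad_prior)
    # Decode the refined codeword to obtain the source reconstruction
    with torch.no_grad():
        x_hat = decoder(z)
    return x_hat
```
Note that the abstract states the Deep JSCC encoder and decoder pair are both used during the iterative update; the simplified loop above omits the encoder and only applies the decoder to the final refined codeword.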
Related papers
- Factor Graph Optimization of Error-Correcting Codes for Belief Propagation Decoding [62.25533750469467]
Low-Density Parity-Check (LDPC) codes possess several advantages over other families of codes.
The proposed approach is shown to improve on the decoding performance of existing popular codes by orders of magnitude.
arXiv Detail & Related papers (2024-06-09T12:08:56Z) - Learning Linear Block Error Correction Codes [62.25533750469467]
We propose for the first time a unified encoder-decoder training of binary linear block codes.
We also propose a novel Transformer model in which the self-attention masking is performed in a differentiable fashion for the efficient backpropagation of the code gradient.
arXiv Detail & Related papers (2024-05-07T06:47:12Z) - The Rate-Distortion-Perception-Classification Tradeoff: Joint Source Coding and Modulation via Inverse-Domain GANs [4.735670734773145]
We show the existence of a strict tradeoff between channel rate, distortion, perception, and classification accuracy.
We propose two image compression methods to navigate this tradeoff: the CO algorithm and ID-GAN, a more general compression scheme.
The results demonstrate that the proposed ID-GAN algorithm balances image distortion, perception, and classification accuracy, and significantly outperforms traditional separation-based methods.
arXiv Detail & Related papers (2023-12-22T16:06:43Z) - Graph Neural Networks for Enhanced Decoding of Quantum LDPC Codes [6.175503577352742]
We propose a differentiable iterative decoder for quantum low-density parity-check (LDPC) codes.
The proposed algorithm is composed of classical belief propagation (BP) decoding stages and intermediate graph neural network (GNN) layers.
arXiv Detail & Related papers (2023-10-26T19:56:25Z) - Deep Quantum Error Correction [73.54643419792453]
Quantum error correction codes (QECC) are a key component for realizing the potential of quantum computing.
In this work, we efficiently train novel end-to-end deep quantum error decoders.
The proposed method demonstrates the power of neural decoders for QECC by achieving state-of-the-art accuracy.
arXiv Detail & Related papers (2023-01-27T08:16:26Z) - Generative Joint Source-Channel Coding for Semantic Image Transmission [29.738666406095074]
Joint source-channel coding (JSCC) schemes using deep neural networks (DNNs) provide promising results in wireless image transmission.
We propose two novel JSCC schemes that leverage the perceptual quality of deep generative models (DGMs) for wireless image transmission.
arXiv Detail & Related papers (2022-11-24T19:14:27Z) - Denoising Diffusion Error Correction Codes [92.10654749898927]
Recently, neural decoders have demonstrated their advantage over classical decoding techniques.
Recent state-of-the-art neural decoders suffer from high complexity and lack the important iterative scheme characteristic of many legacy decoders.
We propose to employ denoising diffusion models for the soft decoding of linear codes at arbitrary block lengths.
arXiv Detail & Related papers (2022-09-16T11:00:50Z) - Graph Neural Networks for Channel Decoding [71.15576353630667]
We showcase competitive decoding performance for various coding schemes, such as low-density parity-check (LDPC) and BCH codes.
The idea is to let a neural network (NN) learn a generalized message passing algorithm over a given graph.
We benchmark our proposed decoder against state-of-the-art in conventional channel decoding as well as against recent deep learning-based results.
arXiv Detail & Related papers (2022-07-29T15:29:18Z) - Plug-And-Play Learned Gaussian-mixture Approximate Message Passing [71.74028918819046]
We propose a plug-and-play compressed sensing (CS) recovery algorithm suitable for any i.i.d. source prior.
Our algorithm builds upon Borgerding's learned AMP (LAMP), yet significantly improves it by adopting a universal denoising function within the algorithm.
Numerical evaluation shows that the L-GM-AMP algorithm achieves state-of-the-art performance without any knowledge of the source prior.
arXiv Detail & Related papers (2020-11-18T16:40:45Z) - Infomax Neural Joint Source-Channel Coding via Adversarial Bit Flip [41.28049430114734]
We propose a novel regularization method called Infomax Adversarial-Bit-Flip (IABF) to improve the stability and robustness of the neural joint source-channel coding scheme.
Our IABF can achieve state-of-the-art performances on both compression and error correction benchmarks and outperform the baselines by a significant margin.
arXiv Detail & Related papers (2020-04-03T10:00:02Z)