Generative AI-Based Probabilistic Constellation Shaping With Diffusion Models
- URL: http://arxiv.org/abs/2311.09349v1
- Date: Wed, 15 Nov 2023 20:14:21 GMT
- Title: Generative AI-Based Probabilistic Constellation Shaping With Diffusion Models
- Authors: Mehdi Letafati, Samad Ali, and Matti Latva-aho
- Abstract summary: We aim to unleash the power of generative AI for PHY design of constellation symbols in communication systems.
We exploit the ``denoise-and-generate'' characteristics of denoising diffusion probabilistic models (DDPM) for probabilistic constellation shaping.
Our results show that the generative AI-based scheme outperforms a deep neural network (DNN)-based benchmark and uniform shaping.
- Score: 12.218161437914118
- License: http://creativecommons.org/licenses/by-nc-sa/4.0/
- Abstract: Diffusion models are at the vanguard of generative AI research, with renowned
solutions such as Imagen by Google Brain and DALL·E 3 by OpenAI.
Nevertheless, the potential merits of diffusion models for communication
engineering applications are not fully understood yet. In this paper, we aim to
unleash the power of generative AI for PHY design of constellation symbols in
communication systems. Although the geometry of constellations is predetermined
according to networking standards, e.g., quadrature amplitude modulation (QAM),
probabilistic shaping can design the probability of occurrence (generation) of
constellation symbols. This can help improve the information rate and decoding
performance of communication systems. We exploit the ``denoise-and-generate''
characteristics of denoising diffusion probabilistic models (DDPM) for
probabilistic constellation shaping. The key idea is to learn to generate
constellation symbols out of noise, ``mimicking'' the way the receiver performs
symbol reconstruction. In this way, the constellation symbols sent by the
transmitter and the symbols inferred (reconstructed) at the receiver are made as
similar as possible, resulting in as few mismatches as possible. Our results
show that the generative AI-based scheme outperforms a deep neural network
(DNN)-based benchmark and uniform shaping, while providing network resilience
as well as robust out-of-distribution performance under low-SNR regimes and
non-Gaussian assumptions. Numerical evaluations highlight a 30% improvement in
cosine similarity and a threefold improvement in mutual information compared to
the DNN-based approach for a 64-QAM geometry.
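For intuition, the following is a minimal, self-contained sketch (not the authors' implementation) of how the ``denoise-and-generate'' idea can be turned into probabilistic shaping: a small DDPM is trained on 2-D constellation points drawn from an assumed non-uniform target distribution, and the frequency with which its reverse-sampled points snap to 64-QAM symbols acts as the learned occurrence distribution. The network size, noise schedule, target prior, and PyTorch framing are all illustrative assumptions.

```python
import numpy as np
import torch
import torch.nn as nn

torch.manual_seed(0)
np.random.seed(0)

# 64-QAM grid, normalised to unit average energy; this is the fixed geometry.
levels = np.array([-7, -5, -3, -1, 1, 3, 5, 7], dtype=np.float32)
qam = np.array([[i, q] for i in levels for q in levels], dtype=np.float32)
qam /= np.sqrt((qam ** 2).mean(axis=0).sum())

# Assumed non-uniform target usage of symbols (Maxwell-Boltzmann-like prior),
# standing in for whatever distribution training data would impose in practice.
energy = (qam.astype(np.float64) ** 2).sum(axis=1)
p_target = np.exp(-2.0 * energy)
p_target /= p_target.sum()

def sample_symbols(n):
    idx = np.random.choice(len(qam), size=n, p=p_target)
    return torch.from_numpy(qam[idx])

# Standard DDPM ingredients: linear beta schedule and an epsilon-prediction MLP.
T = 100
betas = torch.linspace(1e-4, 0.05, T)
alphas = 1.0 - betas
abar = torch.cumprod(alphas, dim=0)

class Denoiser(nn.Module):
    def __init__(self, hidden=128):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(3, hidden), nn.SiLU(),
                                 nn.Linear(hidden, hidden), nn.SiLU(),
                                 nn.Linear(hidden, 2))
    def forward(self, x, t):
        # The (normalised) timestep is appended as a conditioning feature.
        return self.net(torch.cat([x, t[:, None].float() / T], dim=1))

model = Denoiser()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)

for step in range(2000):                       # short demo training loop
    x0 = sample_symbols(256)
    t = torch.randint(0, T, (x0.shape[0],))
    eps = torch.randn_like(x0)
    xt = abar[t, None].sqrt() * x0 + (1 - abar[t, None]).sqrt() * eps
    loss = ((model(xt, t) - eps) ** 2).mean()  # epsilon-prediction objective
    opt.zero_grad(); loss.backward(); opt.step()

@torch.no_grad()
def generate(n):
    """Denoise-and-generate: start from noise and run the reverse chain."""
    x = torch.randn(n, 2)
    for t in reversed(range(T)):
        tt = torch.full((n,), t)
        eps_hat = model(x, tt)
        mean = (x - betas[t] / (1 - abar[t]).sqrt() * eps_hat) / alphas[t].sqrt()
        x = mean + (betas[t].sqrt() * torch.randn_like(x) if t > 0 else 0.0)
    return x

# The rate at which generated points snap to each QAM symbol is the learned
# occurrence (shaping) distribution over the fixed 64-QAM geometry.
gen = generate(4000)
near = ((gen[:, None, :] - torch.from_numpy(qam)[None]) ** 2).sum(-1).argmin(1)
p_learned = torch.bincount(near, minlength=64).float() / len(gen)
print("learned symbol probabilities (first 8):", p_learned[:8].numpy().round(3))
```

In this sketch the reverse chain concentrates samples around the more frequent (lower-energy) symbols, which is the shaping effect the abstract describes; the paper itself integrates this into the end-to-end transmitter/receiver design rather than the stand-alone demo shown here.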
Related papers
- Diffusion Models for Wireless Communications [12.218161437914118]
We outline the applications of diffusion models in wireless communication systems.
The key idea is to decompose the data generation process over "denoising" steps, gradually generating samples out of noise.
We show how diffusion models can be employed for the development of resilient AI-native communication systems.
arXiv Detail & Related papers (2023-10-11T08:57:59Z)
- SatDM: Synthesizing Realistic Satellite Image with Semantic Layout Conditioning using Diffusion Models [0.0]
Denoising Diffusion Probabilistic Models (DDPMs) have demonstrated significant promise in synthesizing realistic images from semantic layouts.
In this paper, a conditional DDPM capable of taking a semantic map and generating high-quality, diverse, and correspondingly accurate satellite images is implemented.
The effectiveness of our proposed model is validated using a meticulously labeled dataset introduced within the context of this study.
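As a hedged illustration of the layout-conditioning mechanism mentioned above (not SatDM's actual architecture), a DDPM denoiser can be conditioned on a semantic map by concatenating the one-hot layout with the noisy image along the channel axis; the class count, toy convolutional stack, and omitted timestep embedding are all simplifying assumptions.

```python
import torch
import torch.nn as nn

NUM_CLASSES = 4          # semantic classes in the layout (assumed)

class CondDenoiser(nn.Module):
    def __init__(self, img_ch=3, base=32):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(img_ch + NUM_CLASSES, base, 3, padding=1), nn.SiLU(),
            nn.Conv2d(base, base, 3, padding=1), nn.SiLU(),
            nn.Conv2d(base, img_ch, 3, padding=1),
        )
    def forward(self, noisy_img, layout_onehot, t):
        # A real model would also embed the timestep t; omitted for brevity.
        return self.net(torch.cat([noisy_img, layout_onehot], dim=1))

x_t = torch.randn(2, 3, 64, 64)                         # noisy image at step t
layout = torch.randint(0, NUM_CLASSES, (2, 64, 64))     # semantic map
onehot = nn.functional.one_hot(layout, NUM_CLASSES).permute(0, 3, 1, 2).float()
eps_hat = CondDenoiser()(x_t, onehot, t=None)           # predicted noise
print(eps_hat.shape)                                    # torch.Size([2, 3, 64, 64])
```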
arXiv Detail & Related papers (2023-09-28T19:39:13Z)
- Complexity Matters: Rethinking the Latent Space for Generative Modeling [65.64763873078114]
In generative modeling, numerous successful approaches leverage a low-dimensional latent space, e.g., Stable Diffusion.
In this study, we aim to shed light on this under-explored topic by rethinking the latent space from the perspective of model complexity.
arXiv Detail & Related papers (2023-07-17T07:12:29Z)
- Radio Generation Using Generative Adversarial Networks with An Unrolled Design [18.049453261384013]
We develop a novel GAN framework for radio generation called "Radio GAN", which rests on two key ideas.
The first is learning based on sampling points, which aims to model the underlying sampling distribution of radio signals.
The second is an unrolled generator design, combined with an estimated pure-signal distribution as a prior, which can greatly reduce the learning difficulty.
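A hedged sketch of the first idea, learning a GAN directly over signal sample points, is given below; the unrolled generator design and the estimated signal prior from the paper are not reproduced, and the synthetic "radio frames" and hyper-parameters are illustrative stand-ins.

```python
import torch
import torch.nn as nn

SEQ = 128                                   # samples per generated radio frame
G = nn.Sequential(nn.Linear(64, 256), nn.ReLU(), nn.Linear(256, 2 * SEQ))
D = nn.Sequential(nn.Linear(2 * SEQ, 256), nn.LeakyReLU(0.2), nn.Linear(256, 1))
opt_g = torch.optim.Adam(G.parameters(), lr=2e-4)
opt_d = torch.optim.Adam(D.parameters(), lr=2e-4)
bce = nn.BCEWithLogitsLoss()

def real_batch(n):
    # Stand-in for recorded radio frames: noisy complex sinusoids, flattened I/Q.
    t = torch.arange(SEQ).float() / SEQ
    f = torch.rand(n, 1) * 10
    iq = torch.stack([torch.cos(2 * torch.pi * f * t),
                      torch.sin(2 * torch.pi * f * t)], dim=-1)
    return (iq + 0.05 * torch.randn_like(iq)).reshape(n, -1)

for step in range(200):
    real = real_batch(32)
    fake = G(torch.randn(32, 64))
    # Discriminator update: push real frames towards 1 and fake frames towards 0.
    d_loss = bce(D(real), torch.ones(32, 1)) + bce(D(fake.detach()), torch.zeros(32, 1))
    opt_d.zero_grad(); d_loss.backward(); opt_d.step()
    # Generator update: try to fool the discriminator.
    g_loss = bce(D(fake), torch.ones(32, 1))
    opt_g.zero_grad(); g_loss.backward(); opt_g.step()
```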
arXiv Detail & Related papers (2023-06-24T07:47:22Z)
- Explainable, Physics Aware, Trustworthy AI Paradigm Shift for Synthetic Aperture Radar [5.164409209168982]
We propose a change of paradigm for explainability in data science for the case of Synthetic Aperture Radar (SAR) data.
It aims to use explainable data transformations based on well-established models to generate inputs for AI methods.
arXiv Detail & Related papers (2023-01-09T09:22:13Z)
- DINER: Disorder-Invariant Implicit Neural Representation [33.10256713209207]
Implicit neural representation (INR) characterizes the attributes of a signal as a function of corresponding coordinates.
We propose the disorder-invariant implicit neural representation (DINER) by augmenting a hash-table to a traditional INR backbone.
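The following toy 1-D sketch conveys the hash-table idea as summarised above: every input coordinate indexes a learnable feature entry, and a small MLP fits the signal from those features rather than from raw coordinates. DINER's actual table construction and INR backbone differ; sizes and the target signal here are illustrative.

```python
import torch
import torch.nn as nn

N = 256                                                   # number of samples
signal = torch.sin(torch.linspace(0, 8 * torch.pi, N))    # toy target signal

table = nn.Embedding(N, 16)                    # learnable per-coordinate features
mlp = nn.Sequential(nn.Linear(16, 64), nn.ReLU(), nn.Linear(64, 1))
opt = torch.optim.Adam(list(table.parameters()) + list(mlp.parameters()), lr=1e-3)

idx = torch.arange(N)
for step in range(1000):
    pred = mlp(table(idx)).squeeze(-1)         # coordinate index -> feature -> value
    loss = ((pred - signal) ** 2).mean()
    opt.zero_grad(); loss.backward(); opt.step()

print(f"final MSE: {loss.item():.4f}")
```

Because each sample owns its learned feature, the fit does not depend on how the samples are arranged in coordinate space, which is the "disorder-invariant" property the summary refers to.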
arXiv Detail & Related papers (2022-11-15T03:34:24Z)
- Learning to Estimate RIS-Aided mmWave Channels [50.15279409856091]
We focus on uplink cascaded channel estimation, where known and fixed base station combining and RIS phase control matrices are considered for collecting observations.
To boost the estimation performance and reduce the training overhead, the inherent channel sparsity of mmWave channels is leveraged in the deep unfolding method.
It is verified that the proposed deep unfolding network architecture can outperform the least squares (LS) method with a relatively smaller training overhead and online computational complexity.
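A minimal sketch of deep unfolding in this spirit: ISTA iterations for a sparse recovery problem y = A h + n are unrolled into a fixed number of layers with learnable soft thresholds and trained end to end. The measurement matrix, dimensions, and sparsity level are illustrative assumptions; the paper's RIS cascaded-channel model is not reproduced.

```python
import torch
import torch.nn as nn

M, N, K = 32, 64, 4                     # measurements, dictionary atoms, sparsity
A = torch.randn(M, N) / M ** 0.5        # measurement matrix (assumed known)
step = 1.0 / torch.linalg.matrix_norm(A, 2) ** 2   # ISTA step size 1/||A||_2^2

class UnrolledISTA(nn.Module):
    def __init__(self, layers=10):
        super().__init__()
        # One learnable soft-threshold per unrolled iteration.
        self.theta = nn.Parameter(0.01 * torch.ones(layers))
    def forward(self, y):
        h = torch.zeros(y.shape[0], N)
        for theta in self.theta:
            grad = (h @ A.T - y) @ A                        # gradient of 0.5*||Ah - y||^2
            z = h - step * grad
            h = torch.sign(z) * torch.clamp(z.abs() - theta, min=0.0)  # soft threshold
        return h

def batch(n):
    h = torch.zeros(n, N)
    idx = torch.randint(0, N, (n, K))
    h.scatter_(1, idx, torch.randn(n, K))                   # K-sparse channel vectors
    y = h @ A.T + 0.01 * torch.randn(n, M)
    return y, h

model = UnrolledISTA()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
for it in range(500):
    y, h_true = batch(64)
    loss = ((model(y) - h_true) ** 2).mean()
    opt.zero_grad(); loss.backward(); opt.step()
print(f"training MSE: {loss.item():.4f}")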
arXiv Detail & Related papers (2021-07-27T06:57:56Z)
- On the benefits of robust models in modulation recognition [53.391095789289736]
Deep Neural Networks (DNNs) using convolutional layers are state-of-the-art in many tasks in communications.
In other domains, like image classification, DNNs have been shown to be vulnerable to adversarial perturbations.
We propose a novel framework to test the robustness of current state-of-the-art models.
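To make the robustness concern concrete, here is a hedged sketch of a single FGSM attack on a toy (untrained) I/Q modulation classifier; the classifier, data, and epsilon are placeholders rather than the paper's models or test framework.

```python
import torch
import torch.nn as nn

SEQ = 128
classifier = nn.Sequential(nn.Flatten(), nn.Linear(2 * SEQ, 128), nn.ReLU(),
                           nn.Linear(128, 4))          # 4 modulation classes
loss_fn = nn.CrossEntropyLoss()

x = torch.randn(8, 2, SEQ)          # stand-in I/Q frames (batch of 8)
y = torch.randint(0, 4, (8,))       # their true modulation labels

# FGSM: one signed-gradient step of size eps applied to the input.
eps = 0.05
x_adv = x.clone().requires_grad_(True)
loss = loss_fn(classifier(x_adv), y)
loss.backward()
x_adv = (x_adv + eps * x_adv.grad.sign()).detach()

with torch.no_grad():
    clean_acc = (classifier(x).argmax(1) == y).float().mean()
    adv_acc = (classifier(x_adv).argmax(1) == y).float().mean()
print(f"clean acc {clean_acc:.2f}, adversarial acc {adv_acc:.2f}")
```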
arXiv Detail & Related papers (2021-03-27T19:58:06Z)
- Spatial Dependency Networks: Neural Layers for Improved Generative Image Modeling [79.15521784128102]
We introduce a novel neural network for building image generators (decoders) and apply it to variational autoencoders (VAEs).
In our spatial dependency networks (SDNs), feature maps at each level of a deep neural net are computed in a spatially coherent way.
We show that augmenting the decoder of a hierarchical VAE with spatial dependency layers considerably improves density estimation.
arXiv Detail & Related papers (2021-03-16T07:01:08Z)
- Denoising Diffusion Probabilistic Models [91.94962645056896]
We present high quality image synthesis results using diffusion probabilistic models.
Our best results are obtained by training on a weighted variational bound designed according to a novel connection between diffusion probabilistic models and denoising score matching with Langevin dynamics.
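For reference, the simplified noise-prediction objective commonly associated with this weighted variational bound (standard in the DDPM literature, reproduced here as background rather than taken from this listing) can be written as:

```latex
% Simplified DDPM training objective (noise-prediction form), where
% \bar{\alpha}_t = \prod_{s \le t} \alpha_s and
% x_t = \sqrt{\bar{\alpha}_t}\, x_0 + \sqrt{1 - \bar{\alpha}_t}\, \epsilon.
L_{\mathrm{simple}}(\theta) =
  \mathbb{E}_{t,\, x_0,\, \epsilon}\left[
    \bigl\lVert \epsilon - \epsilon_\theta\bigl(\sqrt{\bar{\alpha}_t}\, x_0
      + \sqrt{1-\bar{\alpha}_t}\, \epsilon,\; t\bigr) \bigr\rVert^2
  \right]
```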
arXiv Detail & Related papers (2020-06-19T17:24:44Z)
- Data-Driven Symbol Detection via Model-Based Machine Learning [117.58188185409904]
We review a data-driven framework for symbol detection design which combines machine learning (ML) and model-based algorithms.
In this hybrid approach, well-known channel-model-based algorithms are augmented with ML-based algorithms to remove their channel-model dependence.
Our results demonstrate that these techniques can yield near-optimal performance of model-based algorithms without knowing the exact channel input-output statistical relationship.
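One concrete instance of this hybrid recipe is a ViterbiNet-style detector, sketched below under simplifying assumptions: the Viterbi dynamic program is kept as the model-based backbone, while a small network trained on data supplies the branch metrics, so no explicit channel law is needed at detection time. The 2-tap BPSK channel, network, and trellis sizes are illustrative, not the paper's exact setup.

```python
import torch
import torch.nn as nn

torch.manual_seed(0)
H = torch.tensor([1.0, 0.6])   # "unknown" 2-tap ISI channel (used only to simulate data)
NOISE_STD = 0.3

def transmit(bits):
    s = 2.0 * bits - 1.0                                   # BPSK symbols
    s_prev = torch.cat([torch.zeros(1), s[:-1]])
    y = H[0] * s + H[1] * s_prev + NOISE_STD * torch.randn_like(s)
    prev_bits = torch.cat([torch.zeros(1, dtype=torch.long), bits[:-1]])
    return y, 2 * bits + prev_bits                         # branch index (s_k, s_{k-1})

# ML part: learn which (current, previous) bit pair produced each received sample.
net = nn.Sequential(nn.Linear(1, 32), nn.ReLU(), nn.Linear(32, 4))
opt = torch.optim.Adam(net.parameters(), lr=1e-2)
for it in range(500):
    bits = torch.randint(0, 2, (512,))
    y, branch = transmit(bits)
    loss = nn.functional.cross_entropy(net(y[:, None]), branch)
    opt.zero_grad(); loss.backward(); opt.step()

# Model-based part: a standard Viterbi dynamic program over the 2-state trellis,
# with the network's log-probabilities used as branch metrics.
@torch.no_grad()
def viterbi_detect(y):
    logp = net(y[:, None]).log_softmax(dim=1)              # (T, 4) learned metrics
    n = len(y)
    metric = torch.zeros(2)                                # per previous-bit state
    back = torch.zeros(n, 2, dtype=torch.long)
    for k in range(n):
        new = torch.empty(2)
        for s_k in (0, 1):
            cand = metric + logp[k, 2 * s_k + torch.arange(2)]
            back[k, s_k] = cand.argmax()
            new[s_k] = cand.max()
        metric = new
    bits_hat = torch.zeros(n, dtype=torch.long)
    state = metric.argmax()
    for k in reversed(range(n)):                           # trace the best path back
        bits_hat[k] = state
        state = back[k, state]
    return bits_hat

bits = torch.randint(0, 2, (2000,))
y, _ = transmit(bits)
print(f"bit error rate: {(viterbi_detect(y) != bits).float().mean():.3f}")
```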
arXiv Detail & Related papers (2020-02-14T06:58:27Z)