A Signal Detection Scheme Based on Deep Learning in OFDM Systems
- URL: http://arxiv.org/abs/2107.13423v1
- Date: Sat, 24 Jul 2021 04:25:46 GMT
- Title: A Signal Detection Scheme Based on Deep Learning in OFDM Systems
- Authors: Guangliang Pan, Zitong Liu, Wei Wang, Minglei Li
- Abstract summary: We develop DDLSD, a Data-driven Deep Learning scheme for Signal Detection in OFDM systems.
We show that the DDLSD scheme outperforms traditional methods in both channel estimation and signal detection performance.
- Score: 5.260367962320027
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Channel estimation and signal detection are essential steps to ensure the
quality of end-to-end communication in orthogonal frequency-division
multiplexing (OFDM) systems. In this paper, we develop a DDLSD approach, i.e.,
Data-driven Deep Learning for Signal Detection in OFDM systems. First, the OFDM
system model is established. Then, a long short-term memory (LSTM) network is
introduced into the OFDM system model. Wireless channel data are generated
through simulation, and the preprocessed time-series features are fed into
the LSTM for offline training. Finally, the trained model is used for online
recovery of the transmitted signals. The difference between this scheme and
existing OFDM receivers is that explicit channel state information (CSI)
estimation is replaced by implicit CSI estimation, and the transmitted
symbols are recovered directly. Simulation results show that the DDLSD scheme
outperforms traditional methods in both channel estimation and signal
detection performance.
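
As a rough illustration of the DDLSD pipeline described in the abstract, the sketch below shows an LSTM that maps a received OFDM frame (one pilot symbol followed by one data symbol, with real and imaginary parts concatenated) directly to soft estimates of the transmitted bits, trained offline on simulated data. The 64-subcarrier grid, the two-layer LSTM, the 16-bit output group, and the training hyperparameters are illustrative assumptions, not the authors' reported configuration.

```python
# Minimal sketch of the DDLSD idea, assuming an LSTM that maps a received OFDM
# frame (pilot symbol + data symbol, real and imaginary parts concatenated)
# directly to soft bit estimates; sizes and hyperparameters are assumptions.
import torch
import torch.nn as nn

NUM_SC = 64          # assumed number of OFDM subcarriers
BITS_PER_GROUP = 16  # assumed number of bits recovered per forward pass

class LSTMDetector(nn.Module):
    def __init__(self, hidden=128):
        super().__init__()
        # Each OFDM symbol is one time step of 2 * NUM_SC real-valued features.
        self.lstm = nn.LSTM(input_size=2 * NUM_SC, hidden_size=hidden,
                            num_layers=2, batch_first=True)
        self.head = nn.Linear(hidden, BITS_PER_GROUP)

    def forward(self, rx_frame):
        # rx_frame: (batch, 2 time steps = pilot + data, 2 * NUM_SC features)
        out, _ = self.lstm(rx_frame)
        # Sigmoid on the last time step gives soft bit estimates in [0, 1];
        # thresholding at 0.5 yields hard decisions during online detection.
        return torch.sigmoid(self.head(out[:, -1, :]))

# Offline training on simulated channel data (placeholder tensors here); the
# trained model is then reused online with no explicit channel estimation step.
model = LSTMDetector()
optim = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.BCELoss()

rx = torch.randn(32, 2, 2 * NUM_SC)                        # simulated received frames
bits = torch.randint(0, 2, (32, BITS_PER_GROUP)).float()   # transmitted bits
optim.zero_grad()
loss = loss_fn(model(rx), bits)
loss.backward()
optim.step()
```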
Related papers
- An ML-assisted OTFS vs. OFDM adaptable modem [1.8492669447784602]
OTFS and OFDM waveforms enjoy the benefits of the reuse of legacy architectures, simplicity of receiver design, and low-complexity detection.
We propose a deep neural network (DNN)-based adaptation scheme to switch between using either an OTFS or OFDM signal processing chain at the transmitter and receiver for optimal mean-squared-error (MSE) performance.
arXiv Detail & Related papers (2023-09-04T02:33:44Z)
- An Efficient Machine Learning-based Channel Prediction Technique for OFDM Sub-Bands [0.0]
We propose an efficient machine learning (ML)-based technique for channel prediction in OFDM sub-bands.
The novelty of the proposed approach lies in training on channel fading samples to estimate future channel behaviour under selective fading.
arXiv Detail & Related papers (2023-05-31T09:41:27Z)
- On Neural Architectures for Deep Learning-based Source Separation of Co-Channel OFDM Signals [104.11663769306566]
We study the single-channel source separation problem involving orthogonal frequency-division multiplexing (OFDM) signals.
We propose critical domain-informed modifications to the network parameterization, based on insights from OFDM structures.
arXiv Detail & Related papers (2023-03-11T16:29:13Z)
- Model-based Deep Learning Receiver Design for Rate-Splitting Multiple Access [65.21117658030235]
This work proposes a novel design for a practical RSMA receiver based on model-based deep learning (MBDL) methods.
The MBDL receiver is evaluated in terms of uncoded Symbol Error Rate (SER), throughput performance through Link-Level Simulations (LLS) and average training overhead.
Results reveal that the MBDL receiver outperforms the SIC receiver with imperfect CSIR by a significant margin.
arXiv Detail & Related papers (2022-05-02T12:23:55Z)
- MIMO Channel Estimation using Score-Based Generative Models [1.6752182911522517]
We introduce a novel approach for channel estimation using deep score-based generative models.
These models are trained to estimate the gradient of the log-prior distribution and can be used to iteratively refine estimates given observed measurements of a signal (a minimal sketch of this iterative refinement appears after this list).
arXiv Detail & Related papers (2022-04-14T17:23:58Z)
- Channel Estimation Based on Machine Learning Paradigm for Spatial Modulation OFDM [0.0]
Deep neural network (DNN) is integrated with spatial modulation-orthogonal frequency division multiplexing (SM-OFDM) technique for end-to-end data detection over Rayleigh fading channel.
This proposed system directly demodulates the received symbols, leaving the channel estimation done only implicitly.
arXiv Detail & Related papers (2021-09-15T10:54:56Z)
- Model-Driven Deep Learning Based Channel Estimation and Feedback for Millimeter-Wave Massive Hybrid MIMO Systems [61.78590389147475]
This paper proposes a model-driven deep learning (MDDL)-based channel estimation and feedback scheme for millimeter-wave (mmWave) systems.
To reduce the uplink pilot overhead for estimating the high-dimensional channels from a limited number of radio frequency (RF) chains, we propose to jointly train the phase shift network and the channel estimator as an auto-encoder.
Numerical results show that the proposed MDDL-based channel estimation and feedback scheme outperforms the state-of-the-art approaches.
arXiv Detail & Related papers (2021-04-22T13:34:53Z)
- LoRD-Net: Unfolded Deep Detection Network with Low-Resolution Receivers [104.01415343139901]
We propose a deep detector entitled LoRD-Net for recovering information symbols from one-bit measurements.
LoRD-Net has a task-based architecture dedicated to recovering the underlying signal of interest.
We evaluate the proposed receiver architecture for one-bit signal recovery in wireless communications.
arXiv Detail & Related papers (2021-02-05T04:26:05Z)
- Deep Joint Source Channel Coding for Wireless Image Transmission with OFDM [6.799021090790035]
The proposed encoder and decoder use convolutional neural networks (CNN) and directly map the source images to complex-valued baseband samples.
The proposed model-driven machine learning approach eliminates the need for separate source and channel coding.
Our method is shown to be robust against non-linear signal clipping in OFDM for various channel conditions.
arXiv Detail & Related papers (2021-01-05T22:27:20Z)
- Data-Driven Symbol Detection via Model-Based Machine Learning [117.58188185409904]
We review a data-driven framework for symbol detection design which combines machine learning (ML) and model-based algorithms.
In this hybrid approach, well-known channel-model-based algorithms are augmented with ML-based algorithms to remove their channel-model-dependence.
Our results demonstrate that these techniques can yield near-optimal performance of model-based algorithms without knowing the exact channel input-output statistical relationship.
arXiv Detail & Related papers (2020-02-14T06:58:27Z)
- DeepSIC: Deep Soft Interference Cancellation for Multiuser MIMO Detection [98.43451011898212]
In multiuser multiple-input multiple-output (MIMO) setups, where multiple symbols are simultaneously transmitted, accurate symbol detection is challenging.
We propose a data-driven implementation of the iterative soft interference cancellation (SIC) algorithm which we refer to as DeepSIC.
DeepSIC learns to carry out joint detection from a limited set of training samples without requiring the channel to be linear.
arXiv Detail & Related papers (2020-02-08T18:31:00Z)
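
To make the iterative structure behind the DeepSIC entry above concrete, here is a minimal sketch assuming a toy setup of 4 users, 4 receive antennas, a 4-point constellation, and 3 soft interference cancellation iterations (all illustrative choices, not the paper's configuration): each (iteration, user) pair gets a small network that takes the received vector and the other users' soft estimates from the previous iteration and outputs updated soft symbol estimates.

```python
# Minimal sketch of iterative soft interference cancellation with learned
# per-user modules, in the spirit of DeepSIC; the toy setup below is assumed.
import torch
import torch.nn as nn

NUM_USERS = 4     # assumed number of users
NUM_RX = 4        # assumed number of receive antennas
CONST_SIZE = 4    # assumed constellation size (e.g. QPSK)
ITERATIONS = 3    # assumed number of SIC iterations

class SoftSICBlock(nn.Module):
    """Per-user detector: maps the received vector plus the other users' soft
    estimates from the previous iteration to a constellation probability vector."""
    def __init__(self):
        super().__init__()
        in_dim = 2 * NUM_RX + (NUM_USERS - 1) * CONST_SIZE
        self.net = nn.Sequential(nn.Linear(in_dim, 64), nn.ReLU(),
                                 nn.Linear(64, CONST_SIZE))

    def forward(self, y, p_others):
        return torch.softmax(self.net(torch.cat([y, p_others], dim=-1)), dim=-1)

class DeepSICSketch(nn.Module):
    def __init__(self):
        super().__init__()
        # A separate small network for every (iteration, user) pair.
        self.stages = nn.ModuleList([
            nn.ModuleList([SoftSICBlock() for _ in range(NUM_USERS)])
            for _ in range(ITERATIONS)])

    def forward(self, y):
        batch = y.shape[0]
        # Start from uninformative (uniform) soft estimates for all users.
        probs = [torch.full((batch, CONST_SIZE), 1.0 / CONST_SIZE)
                 for _ in range(NUM_USERS)]
        for stage in self.stages:
            # Each user's update uses the previous iteration's estimates of the others.
            probs = [stage[k](y, torch.cat([p for j, p in enumerate(probs)
                                            if j != k], dim=-1))
                     for k in range(NUM_USERS)]
        return probs  # per-user soft symbol estimates after the last iteration

y = torch.randn(8, 2 * NUM_RX)   # real/imag parts of the received vector
soft_out = DeepSICSketch()(y)
```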
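
Similarly, for the "MIMO Channel Estimation using Score-Based Generative Models" entry, the sketch below shows the kind of Langevin-style refinement loop the summary describes, with an untrained stand-in for the score network and a toy linear measurement model; the dimensions, step size, and noise level are assumptions.

```python
# Minimal sketch of score-based iterative channel refinement: combine the
# gradient of a learned log-prior (untrained stand-in network here) with the
# likelihood gradient of the measurements in a Langevin-style loop, under an
# assumed linear pilot model y = A h + n.
import torch
import torch.nn as nn

DIM = 16          # assumed real-valued (flattened) channel dimension
STEPS = 200       # assumed number of refinement steps
STEP_SIZE = 1e-3  # assumed Langevin step size
NOISE_STD = 0.1   # assumed measurement noise standard deviation

# Stand-in for a trained score network approximating grad_h log p(h).
score_net = nn.Sequential(nn.Linear(DIM, 64), nn.ReLU(), nn.Linear(64, DIM))

def refine_channel(y, A, h0):
    """Iteratively refine a channel estimate h given measurements y = A h + n."""
    h = h0.clone()
    for _ in range(STEPS):
        with torch.no_grad():
            prior_score = score_net(h)                            # learned prior term
        likelihood_score = A.t() @ (y - A @ h) / NOISE_STD ** 2   # data-fit term
        h = h + STEP_SIZE * (prior_score + likelihood_score) \
              + (2 * STEP_SIZE) ** 0.5 * torch.randn_like(h)
    return h

A = torch.randn(DIM, DIM) / DIM ** 0.5   # toy pilot/measurement matrix
h_true = torch.randn(DIM)
y = A @ h_true + NOISE_STD * torch.randn(DIM)
h_hat = refine_channel(y, A, torch.zeros(DIM))
```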
This list is automatically generated from the titles and abstracts of the papers on this site.