Fast Initial Access with Deep Learning for Beam Prediction in 5G mmWave Networks
- URL: http://arxiv.org/abs/2006.12653v1
- Date: Mon, 22 Jun 2020 22:35:17 GMT
- Title: Fast Initial Access with Deep Learning for Beam Prediction in 5G mmWave Networks
- Authors: Tarun S. Cousik, Vijay K. Shah, Jeffrey H. Reed, Tugba Erpek, Yalin E. Sagduyu
- Abstract summary: DeepIA is a deep learning solution for faster and more accurate initial access (IA) in 5G millimeter wave (mmWave) networks.
We show that DeepIA reduces the IA time by sweeping fewer beams and significantly outperforms the conventional IA's beam prediction accuracy in both line of sight (LoS) and non-line of sight (NLoS) mmWave channel conditions.
- Score: 7.879958190837517
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: This paper presents DeepIA, a deep learning solution for faster and more
accurate initial access (IA) in 5G millimeter wave (mmWave) networks when
compared to conventional IA. By utilizing a subset of beams in the IA process,
DeepIA removes the need for an exhaustive beam search thereby reducing the beam
sweep time in IA. A deep neural network (DNN) is trained to learn the complex
mapping from the received signal strengths (RSSs) collected with a reduced
number of beams to the optimal spatial beam of the receiver (among a larger set
of beams). At test time, DeepIA measures RSSs only from a small number of beams
and runs the DNN to predict the best beam for IA. We show that DeepIA reduces
the IA time by sweeping fewer beams and significantly outperforms the
conventional IA's beam prediction accuracy in both line of sight (LoS) and
non-line of sight (NLoS) mmWave channel conditions.
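As a rough illustration of the mapping DeepIA learns, the sketch below (hypothetical shapes and names, not the authors' code) feeds RSS measurements from a small set of swept beams through a small MLP to predict the best beam index in a larger codebook:

```python
# Minimal sketch of the DeepIA idea (assumed beam counts and layer sizes):
# a small MLP maps RSS measurements from a reduced set of swept beams to
# the index of the optimal beam in a larger codebook.
import numpy as np

rng = np.random.default_rng(0)

N_SWEPT = 6      # beams actually swept during IA (assumed value)
N_CODEBOOK = 24  # total beams in the receiver's codebook (assumed value)

def init_mlp(n_in, n_hidden, n_out, rng):
    """One-hidden-layer MLP with small random weights."""
    return {
        "W1": rng.normal(0, 0.1, (n_in, n_hidden)),
        "b1": np.zeros(n_hidden),
        "W2": rng.normal(0, 0.1, (n_hidden, n_out)),
        "b2": np.zeros(n_out),
    }

def predict_best_beam(params, rss):
    """Forward pass: RSS vector (dBm) from the swept beams -> predicted
    index of the best beam in the full codebook."""
    h = np.maximum(0.0, rss @ params["W1"] + params["b1"])  # ReLU
    logits = h @ params["W2"] + params["b2"]
    return int(np.argmax(logits))

params = init_mlp(N_SWEPT, 64, N_CODEBOOK, rng)
rss = rng.uniform(-90.0, -60.0, N_SWEPT)  # synthetic RSS measurements
beam = predict_best_beam(params, rss)
```

In the paper the DNN is trained on labeled (RSS, best-beam) pairs collected under LoS and NLoS channel conditions; the untrained weights here only illustrate the input/output contract.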
Related papers
- Near-field Beam training for Extremely Large-scale MIMO Based on Deep Learning [20.67122533341949]
We propose a near-field beam training method based on deep learning.
We use a convolutional neural network (CNN) to efficiently learn channel characteristics from historical data.
The proposed scheme achieves a more stable beamforming gain and significantly improves performance compared to the traditional beam training method.
arXiv Detail & Related papers (2024-06-05T13:26:25Z)
- Deep Learning and Image Super-Resolution-Guided Beam and Power Allocation for mmWave Networks [80.37827344656048]
We develop a deep learning (DL)-guided hybrid beam and power allocation approach for millimeter-wave (mmWave) networks.
We exploit the synergy of supervised learning and super-resolution technology to enable low-overhead beam- and power allocation.
arXiv Detail & Related papers (2023-05-08T05:40:54Z)
- UB3: Best Beam Identification in Millimeter Wave Systems via Pure Exploration Unimodal Bandits [7.253481390411171]
We develop an algorithm that exploits the unimodal structure of the received signal strengths of the beams to identify the best beam in a finite time.
Our algorithm is named Unimodal Bandit for Best Beam (UB3) and identifies the best beam with a high probability in a few rounds.
arXiv Detail & Related papers (2022-12-26T09:24:22Z)
- Fast Beam Alignment via Pure Exploration in Multi-armed Bandits [91.11360914335384]
We develop a bandit-based fast BA algorithm to reduce BA latency for millimeter-wave (mmWave) communications.
Our algorithm is named Two-Phase Heteroscedastic Track-and-Stop (2PHT&S).
arXiv Detail & Related papers (2022-10-23T05:57:39Z)
- Learning to Estimate RIS-Aided mmWave Channels [50.15279409856091]
We focus on uplink cascaded channel estimation, where known and fixed base station combining and RIS phase control matrices are considered for collecting observations.
To boost the estimation performance and reduce the training overhead, the inherent channel sparsity of mmWave channels is leveraged in the deep unfolding method.
It is verified that the proposed deep unfolding network architecture can outperform the least squares (LS) method with a relatively smaller training overhead and online computational complexity.
arXiv Detail & Related papers (2021-07-27T06:57:56Z)
- Adversarial Attacks on Deep Learning Based mmWave Beam Prediction in 5G and Beyond [46.34482158291128]
A deep neural network (DNN) can predict the beam that is best aligned with each UE by using the received signal strengths (RSSs) from a subset of possible narrow beams.
We present an adversarial attack by generating perturbations to manipulate the over-the-air captured RSSs as the input to the DNN.
This attack significantly reduces IA performance, fooling the DNN into choosing beams with small RSSs, and is more effective than jamming attacks with Gaussian or uniform noise.
arXiv Detail & Related papers (2021-03-25T17:25:21Z)
- Deep Learning for Fast and Reliable Initial Access in AI-Driven 6G mmWave Networks [6.097649192976533]
We present DeepIA, a framework for enabling fast and reliable initial access for AI-driven beyond-5G and 6G millimeter wave (mmWave) networks.
DeepIA reduces the beam sweep time compared to a conventional exhaustive search-based IA process by utilizing only a subset of the available beams.
We show that the beam prediction accuracy of DeepIA saturates with the number of beams used for IA and depends on the particular selection of the beams.
arXiv Detail & Related papers (2021-01-06T02:59:49Z)
- CodeVIO: Visual-Inertial Odometry with Learned Optimizable Dense Depth [83.77839773394106]
We present a lightweight, tightly-coupled deep depth network and visual-inertial odometry system.
We provide the network with previously marginalized sparse features from VIO to increase the accuracy of initial depth prediction.
We show that it can run in real-time with single-thread execution while utilizing GPU acceleration only for the network and code Jacobian.
arXiv Detail & Related papers (2020-12-18T09:42:54Z)
- StrObe: Streaming Object Detection from LiDAR Packets [73.27333924964306]
Rolling shutter LiDARs emit their measurements as a stream of packets, each covering a sector of the 360° field of view.
Modern perception algorithms wait for the full sweep to be built before processing the data, which introduces an additional latency.
In this paper we propose StrObe, a novel approach that minimizes latency by ingesting LiDAR packets and emitting a stream of detections without waiting for the full sweep to be built.
arXiv Detail & Related papers (2020-11-12T14:57:44Z)
- Beamforming Learning for mmWave Communication: Theory and Experimental Validation [23.17604790640996]
We propose a beam design technique that reduces the search time and does not require CSI while guaranteeing a minimum beamforming gain.
We evaluate the efficacy of the proposed scheme in terms of building the codebook and assessing its performance through real-life measurements.
arXiv Detail & Related papers (2019-12-28T05:46:39Z)
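Several of the bandit-based entries above exploit structure in the RSS profile rather than learning a mapping. As a simplified, hypothetical illustration of the unimodality idea behind UB3 (not the authors' algorithm, which is a pure-exploration bandit with high-probability guarantees), a ternary search over beam indices finds the peak of a noiseless unimodal RSS profile with O(log N) comparisons instead of an exhaustive sweep:

```python
# Hedged sketch of the unimodal-beam-search idea (synthetic RSS model,
# not UB3 itself): ternary search over beam indices for the RSS peak.
import numpy as np

def mean_rss(beam, true_best=17):
    """Toy unimodal RSS profile peaked at `true_best` (assumed model)."""
    return -abs(beam - true_best)  # unimodal in the beam index

def unimodal_search(measure, n_beams):
    """Ternary search for the peak of a unimodal function of beam index."""
    lo, hi = 0, n_beams - 1
    while hi - lo > 2:
        m1 = lo + (hi - lo) // 3
        m2 = hi - (hi - lo) // 3
        if measure(m1) < measure(m2):
            lo = m1 + 1  # peak lies to the right of m1
        else:
            hi = m2 - 1  # peak lies to the left of m2
    return max(range(lo, hi + 1), key=measure)

best = unimodal_search(mean_rss, 64)  # -> 17 in this noiseless toy model
```

With noisy RSS measurements, as in the actual bandit formulations, each comparison would require repeated sampling to bound the error probability; the sketch only shows why unimodality shrinks the search from N sweeps to a logarithmic number of comparisons.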
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of this information and is not responsible for any consequences of its use.