Deep Learning Prediction of Beam Coherence Time for Near-Field TeraHertz Networks
- URL: http://arxiv.org/abs/2511.01491v1
- Date: Mon, 03 Nov 2025 11:57:45 GMT
- Title: Deep Learning Prediction of Beam Coherence Time for Near-Field TeraHertz Networks
- Authors: Irched Chafaa, E. Veronica Belmega, Giacomo Bacci
- Abstract summary: As the number of antennas increases, beam alignment and beam tracking in mobile networks incur prohibitive overhead. In this letter, we introduce a novel beam coherence time for mobile THz networks, to drastically reduce the rate of beam updates. We propose a deep learning model, relying on a simple feedforward neural network with a time-dependent input, to predict the beam coherence time and adjust the beamforming on the fly with minimal overhead.
- Score: 1.9396184042930713
- License: http://creativecommons.org/licenses/by-nc-sa/4.0/
- Abstract: Large multiple antenna arrays coupled with accurate beamforming are essential in terahertz (THz) communications to ensure link reliability. However, as the number of antennas increases, beam alignment (focusing) and beam tracking in mobile networks incur prohibitive overhead. Additionally, the near-field region expands both with the size of antenna arrays and the carrier frequency, calling for adjustments in the beamforming to account for spherical wavefront instead of the conventional planar wave assumption. In this letter, we introduce a novel beam coherence time for mobile THz networks, to drastically reduce the rate of beam updates. Then, we propose a deep learning model, relying on a simple feedforward neural network with a time-dependent input, to predict the beam coherence time and adjust the beamforming on the fly with minimal overhead. Our numerical results demonstrate the effectiveness of the proposed approach by enabling higher data rates while reducing the overhead, especially at high (i.e., vehicular) mobility.
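The abstract describes a small feedforward network that takes a time-dependent input and outputs a predicted beam coherence time. A minimal numpy sketch of that idea is below; the layer sizes, input features, and softplus output head are illustrative assumptions, not the authors' exact design.

```python
import numpy as np

rng = np.random.default_rng(0)

def relu(x):
    return np.maximum(x, 0.0)

class CoherenceTimeMLP:
    """Toy feedforward predictor of beam coherence time (assumed shape)."""

    def __init__(self, in_dim=5, hidden=16):
        # Untrained random weights; in practice these would be learned
        # from channel/mobility traces.
        self.W1 = rng.normal(0.0, 0.1, (in_dim, hidden))
        self.b1 = np.zeros(hidden)
        self.W2 = rng.normal(0.0, 0.1, (hidden, 1))
        self.b2 = np.zeros(1)

    def predict(self, x):
        h = relu(x @ self.W1 + self.b1)
        z = h @ self.W2 + self.b2
        # softplus keeps the predicted coherence time strictly positive
        return np.log1p(np.exp(z))

# Assumed input layout: [time, user speed, distance to array, azimuth, elevation]
x = np.array([0.1, 30.0, 5.0, 0.2, -0.1])
t_beam = CoherenceTimeMLP().predict(x)
```

The predicted coherence time would then set how long the current beam is kept before the next alignment, which is the source of the claimed overhead reduction.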
Related papers
- Beam Prediction based on Large Language Models [51.45077318268427]
We formulate the millimeter wave (mmWave) beam prediction problem as a time series forecasting task.
We transform historical observations into text-based representations using a trainable tokenizer.
Our method harnesses the power of LLMs to predict future optimal beams.
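The framing above turns a beam-index history into a sequence a language model can consume. The paper uses a trainable tokenizer; the plain string formatting below is only a stand-in to show the input/output framing, with a naive persistence baseline for comparison.

```python
# Toy serialization of a beam-index history into a text prompt.
# The beam indices and prompt wording are illustrative, not the paper's format.
history = [12, 12, 13, 14, 14]  # past optimal beam indices (toy data)
prompt = "past beams: " + " ".join(map(str, history)) + " | next beam:"

# A naive persistence baseline the learned forecaster must beat:
# simply repeat the most recent optimal beam.
baseline_prediction = history[-1]
```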
arXiv Detail & Related papers (2024-08-16T12:40:01Z)
- Near-field Beam training for Extremely Large-scale MIMO Based on Deep Learning [20.67122533341949]
We propose a near-field beam training method based on deep learning.
We use a convolutional neural network (CNN) to efficiently learn channel characteristics from historical data.
The proposed scheme achieves a more stable beamforming gain and significantly improves performance compared to the traditional beam training method.
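The CNN in this scheme maps received channel samples to a choice over a near-field codebook. A deliberately tiny 1-D convolution sketch of that pipeline is below; the filter, codebook size, and random (untrained) weights are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)

def conv1d(x, w):
    """Valid-mode 1-D convolution (no padding)."""
    k = len(w)
    return np.array([x[i:i + k] @ w for i in range(len(x) - k + 1)])

channel = np.abs(rng.normal(size=32))              # received channel magnitudes (toy)
feat = np.maximum(conv1d(channel, rng.normal(size=4)), 0.0)  # ReLU feature map
W_out = rng.normal(size=(len(feat), 8))            # scores over 8 candidate beams
best_beam = int(np.argmax(feat @ W_out))           # pick the highest-scoring beam
```

In the near field each codebook entry would index an (angle, distance) pair rather than an angle alone, which is what distinguishes this training problem from the far-field case.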
arXiv Detail & Related papers (2024-06-05T13:26:25Z)
- Deep Learning and Image Super-Resolution-Guided Beam and Power Allocation for mmWave Networks [80.37827344656048]
We develop a deep learning (DL)-guided hybrid beam and power allocation approach for millimeter-wave (mmWave) networks.
We exploit the synergy of supervised learning and super-resolution technology to enable low-overhead beam- and power allocation.
arXiv Detail & Related papers (2023-05-08T05:40:54Z)
- Reliable Beamforming at Terahertz Bands: Are Causal Representations the Way Forward? [85.06664206117088]
Multi-user wireless systems can meet metaverse requirements by utilizing terahertz bandwidth with massive number of antennas.
Existing solutions lack proper modeling of channel dynamics, resulting in inaccurate beamforming solutions in high-mobility scenarios.
Herein, a dynamic, semantically aware beamforming solution is proposed for the first time, utilizing novel artificial intelligence algorithms in variational causal inference.
arXiv Detail & Related papers (2023-03-14T16:02:46Z)
- Fast Beam Alignment via Pure Exploration in Multi-armed Bandits [91.11360914335384]
We develop a bandit-based fast BA algorithm to reduce BA latency for millimeter-wave (mmWave) communications.
Our algorithm is named Two-Phase Heteroscedastic Track-and-Stop (2PHT&S).
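Bandit-based beam alignment treats each candidate beam as an arm and noisy SNR readings as rewards. The sketch below is a generic successive-elimination pure-exploration loop, a textbook routine standing in for the paper's 2PHT&S procedure; the mean SNRs and noise level are made up.

```python
import numpy as np

rng = np.random.default_rng(2)
true_snr = np.array([1.0, 3.0, 2.0, 5.0])  # hypothetical mean SNR per beam

def sample(beam):
    """Noisy SNR measurement for one beam sounding."""
    return true_snr[beam] + rng.normal(0.0, 0.5)

n_beams = len(true_snr)
active = list(range(n_beams))
means = np.zeros(n_beams)
counts = np.zeros(n_beams)

for _ in range(200):
    for b in active:                       # sound every surviving beam once
        counts[b] += 1
        means[b] += (sample(b) - means[b]) / counts[b]
    # Confidence radius shrinks as samples accumulate
    radius = np.sqrt(2.0 * np.log(200) / counts[active[0]])
    leader = max(active, key=lambda b: means[b])
    # Drop beams whose upper bound falls below the leader's lower bound
    active = [b for b in active if means[b] + radius >= means[leader] - radius]
    if len(active) == 1:
        break

best_beam = max(active, key=lambda b: means[b])
```

Stopping as soon as one beam survives is what cuts alignment latency relative to an exhaustive sweep.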
arXiv Detail & Related papers (2022-10-23T05:57:39Z)
- Federated Learning for THz Channel Estimation [44.058714794775995]
This paper addresses two major challenges in terahertz (THz) channel estimation: the beam-split phenomenon and computational complexity.
Data-driven techniques are known to mitigate the complexity of this problem but usually require the transmission of the datasets from the users to a central server.
In this work, we employ federated learning (FL), wherein the users transmit only the model parameters instead of the whole dataset.
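The federated setup above can be sketched with a FedAvg-style round: each user fits a local estimator on its own data and uploads only the weight vector, which the server averages. The linear channel model, data shapes, and client count below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(3)
w_true = np.array([1.0, -2.0, 0.5])  # hypothetical "true" channel parameters

def local_update(n=50):
    """One client: fit a least-squares estimator on private local data."""
    X = rng.normal(size=(n, 3))
    y = X @ w_true + 0.01 * rng.normal(size=n)
    w, *_ = np.linalg.lstsq(X, y, rcond=None)
    return w  # only this weight vector leaves the device

# Server-side aggregation: average the uploaded parameters, never the data
client_weights = [local_update() for _ in range(4)]
w_global = np.mean(client_weights, axis=0)
```

Uploading a 3-element weight vector instead of a 50-sample dataset is exactly the communication saving the summary describes, and it scales with model size rather than dataset size.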
arXiv Detail & Related papers (2022-07-13T07:57:25Z)
- Three-Way Deep Neural Network for Radio Frequency Map Generation and Source Localization [67.93423427193055]
Monitoring wireless spectrum over spatial, temporal, and frequency domains will become a critical feature in beyond-5G and 6G communication technologies.
In this paper, we present a Generative Adversarial Network (GAN) machine learning model to interpolate irregularly distributed measurements across the spatial domain.
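The interpolation task the GAN solves — filling a radio-frequency map from scattered measurements — can be illustrated with a classical inverse-distance-weighting baseline. This is only a stand-in for comparison, not the paper's model; the sample points and RSSI values are made up.

```python
import numpy as np

# Scattered measurement locations and received signal strengths (toy data)
pts = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0]])
rssi = np.array([-60.0, -70.0, -65.0])  # dBm at the measured points

def idw(q, pts, vals, eps=1e-9):
    """Inverse-distance-weighted estimate of the field at query point q."""
    d = np.linalg.norm(pts - q, axis=1)
    w = 1.0 / (d + eps)
    return float(w @ vals / w.sum())

# Query point equidistant from all three measurements -> plain average
estimate = idw(np.array([0.5, 0.5]), pts, rssi)  # -> -65.0 dBm
```

A learned generator can capture shadowing and propagation structure that this purely geometric baseline cannot, which is the motivation for the GAN.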
arXiv Detail & Related papers (2021-11-23T22:25:10Z)
- Terahertz-Band Joint Ultra-Massive MIMO Radar-Communications: Model-Based and Model-Free Hybrid Beamforming [45.257328085051974]
Wireless communications and sensing at terahertz (THz) band are investigated as promising short-range technologies.
Ultra-massive multiple-input multiple-output (UM-MIMO) antenna systems have been proposed for THz communications to compensate propagation losses.
We develop THz hybrid beamformers based on both model-based and model-free techniques for a new group-of-subarrays (GoSA) UM-MIMO structure.
arXiv Detail & Related papers (2021-02-27T21:28:34Z)
- Learning to Beamform in Heterogeneous Massive MIMO Networks [48.62625893368218]
Finding the optimal beamformers in massive multiple-input multiple-output (MIMO) networks is a well-known problem.
We propose a novel deep learning based algorithm to address this problem.
arXiv Detail & Related papers (2020-11-08T12:48:06Z)
- Deep Learning Based Antenna Selection for Channel Extrapolation in FDD Massive MIMO [54.54508321463112]
In massive multiple-input multiple-output (MIMO) systems, the large number of antennas would bring a great challenge for the acquisition of the accurate channel state information.
We utilize the neural networks (NNs) to capture the inherent connection between the uplink and downlink channel data sets and extrapolate the downlink channels from a subset of the uplink channel state information.
We study the antenna subset selection problem in order to achieve the best channel extrapolation and decrease the data size of NNs.
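The subset-selection step above can be sketched very simply: keep only the k uplink antennas most useful for extrapolation, and feed that reduced measurement to the network. Below, "most useful" is approximated greedily by received power — a deliberately crude proxy for the NN-driven criterion in the paper — and the channel is random toy data.

```python
import numpy as np

rng = np.random.default_rng(4)

# Toy uplink channel across 16 antennas (complex gains)
H_ul = rng.normal(size=16) + 1j * rng.normal(size=16)

k = 4
# Greedy proxy: keep the k antennas with the strongest uplink gains
subset = np.argsort(np.abs(H_ul))[-k:]
H_fed_to_nn = H_ul[subset]  # reduced input, shrinking the NN's data size
```

Shrinking the input from 16 to 4 antennas is the data-size reduction the summary refers to; the open question the paper studies is which subset preserves the most extrapolation accuracy.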
arXiv Detail & Related papers (2020-09-03T13:38:52Z)
- Fast Initial Access with Deep Learning for Beam Prediction in 5G mmWave Networks [7.879958190837517]
DeepIA is a deep learning solution for faster and more accurate initial access (IA) in 5G millimeter wave (mmWave) networks.
We show that DeepIA reduces the IA time by sweeping fewer beams and significantly outperforms the conventional IA's beam prediction accuracy in both line of sight (LoS) and non-line of sight (NLoS) mmWave channel conditions.
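The core DeepIA idea — sweep only a few beams and let a learned mapping infer the best beam from the full codebook — can be sketched as follows. The random linear map stands in for the trained deep network, and the codebook and sweep sizes are illustrative.

```python
import numpy as np

rng = np.random.default_rng(5)
n_beams, n_swept = 24, 6                      # full codebook vs. sparse sweep

# Sweep every 4th beam instead of all 24 (the latency saving)
swept_idx = np.arange(0, n_beams, n_beams // n_swept)
rss = rng.normal(size=n_swept)                # measured signal strengths (toy)

# Stand-in for the trained network: map 6 measurements to 24 beam scores
W = rng.normal(size=(n_swept, n_beams))
predicted_beam = int(np.argmax(rss @ W))      # predicted best narrow beam
```

Initial-access time scales with the number of beams actually sounded, so cutting the sweep from 24 to 6 beams is where the reported speedup comes from.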
arXiv Detail & Related papers (2020-06-22T22:35:17Z)
- Beamforming Learning for mmWave Communication: Theory and Experimental Validation [23.17604790640996]
We propose a beam design technique that reduces the search time and does not require CSI while guaranteeing a minimum beamforming gain.
We evaluate the efficacy of the proposed scheme in terms of building the codebook and assessing its performance through real-life measurements.
arXiv Detail & Related papers (2019-12-28T05:46:39Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences of its use.