Remote Detection of Applications for Improved Beam Tracking in mmWave/sub-THz 5G/6G Systems
- URL: http://arxiv.org/abs/2410.18637v1
- Date: Thu, 24 Oct 2024 10:55:21 GMT
- Title: Remote Detection of Applications for Improved Beam Tracking in mmWave/sub-THz 5G/6G Systems
- Authors: Alexander Shurakov, Margarita Ershova, Abdukodir Khakimov, Anatoliy Prikhodko, Evgeny Mokrov, Vyacheslav Begishev, Galina Chulkova, Yevgeni Koucheryavy, Gregory Gol'tsman
- Abstract summary: Beam tracking is an essential functionality of millimeter wave (mmWave, 30-100 GHz) and sub-terahertz (sub-THz, 100-300 GHz) 5G/6G systems.
It operates by performing antenna sweeping at both base station (BS) and user equipment (UE) sides.
In the absence of explicit signalling for the type of application at the air interface, in this paper we propose a way to remotely detect it at the BS side based on the received signal strength pattern.
- Score: 37.35086075012511
- License:
- Abstract: Beam tracking is an essential functionality of millimeter wave (mmWave, 30-100 GHz) and sub-terahertz (sub-THz, 100-300 GHz) 5G/6G systems. It operates by performing antenna sweeping at both base station (BS) and user equipment (UE) sides using Synchronization Signal Blocks (SSBs). The optimal frequency of beam tracking events is not specified by 3GPP standards and heavily depends on the micromobility properties of the applications currently utilized by the user. In the absence of explicit signalling for the type of application at the air interface, in this paper we propose a way to remotely detect it at the BS side based on the received signal strength pattern. To this aim, we first perform a multi-stage measurement campaign at 156 GHz, belonging to the sub-THz band, to obtain received signal strength traces of popular smartphone applications. Then, we proceed by applying conventional statistical Mann-Whitney tests and various machine learning (ML) based classification techniques to discriminate applications remotely. Our results show that the Mann-Whitney test can be used to differentiate between fast and slow application classes with a confidence of 0.95, inducing a class detection delay on the order of 1 s after application initialization. With the same time budget, random forest classifiers can differentiate between applications with fast and slow micromobility with 80% accuracy using the received signal strength metric only. The accuracy of detecting a specific application, however, is lower, reaching 60%. By utilizing the proposed technique, one can estimate the optimal values of the beam tracking intervals without adding signalling to the air interface.
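To make the two-stage pipeline concrete, here is a minimal sketch, assuming synthetic RSS windows in place of the paper's measured 156 GHz traces: a Mann-Whitney U test separates fast from slow micromobility classes, and a random forest classifies simple per-window RSS features. Window length, feature set, and jitter levels are illustrative assumptions, not the authors' exact setup.

```python
# Sketch of the two detection stages on synthetic received-signal-strength (RSS) data.
import numpy as np
from scipy.stats import mannwhitneyu
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

def synthetic_rss(n_windows, jitter_db):
    """Hypothetical RSS windows: larger jitter mimics fast micromobility."""
    base = -70 + rng.normal(0, 1, size=(n_windows, 100))
    return base + rng.normal(0, jitter_db, size=(n_windows, 100))

fast = synthetic_rss(200, jitter_db=4.0)   # e.g., gaming-like apps
slow = synthetic_rss(200, jitter_db=1.0)   # e.g., video-watching apps

# Stage 1: Mann-Whitney U test on RSS variability between window populations.
# Rejecting H0 at alpha = 0.05 separates the classes at 0.95 confidence.
stat, p = mannwhitneyu(np.std(fast, axis=1), np.std(slow, axis=1))
print(f"Mann-Whitney p-value: {p:.3g} (classes differ: {p < 0.05})")

# Stage 2: random forest on simple per-window RSS features.
X = np.vstack([fast, slow])
feats = np.column_stack([X.mean(axis=1), X.std(axis=1), np.ptp(X, axis=1)])
y = np.r_[np.ones(len(fast)), np.zeros(len(slow))]  # 1 = fast, 0 = slow

Xtr, Xte, ytr, yte = train_test_split(feats, y, test_size=0.3, random_state=0)
clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(Xtr, ytr)
print(f"fast/slow classification accuracy: {clf.score(Xte, yte):.2f}")
```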
Related papers
- Anomaly Detection and RFI Classification with Unsupervised Learning in Narrowband Radio Technosignature Searches [0.0]
We present GLOBULAR clustering, a signal processing method that uses HDBSCAN to reduce the false-positive rate and isolate outlier signals.
When combined with a standard narrowband signal detection and spatial filtering pipeline, GLOBULAR clustering offers significant improvements in the false-positive rate.
We benchmark our method against the Choza et al. (2024) turboSETI-only search of 97 nearby galaxies at L-band, demonstrating a false-positive hit reduction rate of 93.1% and a false-positive event reduction rate of 99.3%.
arXiv Detail & Related papers (2024-11-25T16:40:19Z)
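A toy sketch of the HDBSCAN step that GLOBULAR clustering builds on, assuming hypothetical two-dimensional hit features: points left outside every dense cluster receive the noise label -1 and become candidate outlier signals. This is not the authors' full pipeline.

```python
# Outlier isolation via HDBSCAN: noise points (label -1) are candidate signals.
import numpy as np
from sklearn.cluster import HDBSCAN  # scikit-learn >= 1.3; the hdbscan package exposes the same API

rng = np.random.default_rng(1)
# Hypothetical 2-D features per detected hit, e.g., (drift rate, frequency).
rfi = rng.normal(0, 0.05, size=(500, 2)) + [0.3, 0.7]   # dense RFI-like cluster
outliers = rng.uniform(0, 1, size=(10, 2))              # isolated candidate signals

labels = HDBSCAN(min_cluster_size=20).fit_predict(np.vstack([rfi, outliers]))
print(f"flagged as outliers (label -1): {(labels == -1).sum()} points")
```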
- What If We Had Used a Different App? Reliable Counterfactual KPI Analysis in Wireless Systems [52.499838151272016]
This paper addresses the "what-if" problem of estimating the values of key performance indicators (KPIs) that would have been obtained if a different app had been implemented by the radio access network (RAN).
We propose a conformal-prediction-based counterfactual analysis method for wireless systems that provides reliable "error bars" for the estimated KPIs, containing the true values with a user-defined probability.
arXiv Detail & Related papers (2024-09-30T18:47:26Z)
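A minimal split-conformal sketch of the "error bars" idea, with a stand-in regressor and synthetic KPI data (the paper's actual counterfactual estimator is not reproduced here): calibration residuals yield an interval half-width that covers the true KPI with probability at least 1 - alpha.

```python
# Split conformal prediction: calibrate a residual quantile, report intervals.
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor

rng = np.random.default_rng(2)
X = rng.uniform(size=(1000, 4))                             # hypothetical RAN context features
y = X @ [2.0, -1.0, 0.5, 0.0] + rng.normal(0, 0.3, 1000)    # hypothetical KPI

model = GradientBoostingRegressor().fit(X[:600], y[:600])
resid = np.abs(y[600:800] - model.predict(X[600:800]))      # calibration residuals

alpha = 0.1
n_cal = len(resid)
q = np.quantile(resid, np.ceil((n_cal + 1) * (1 - alpha)) / n_cal)  # conformal quantile

pred = model.predict(X[800:])
covered = np.mean(np.abs(y[800:] - pred) <= q)
print(f"interval half-width {q:.2f}, empirical coverage {covered:.2f} (target {1 - alpha})")
```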
- Gesture Recognition with mmWave Wi-Fi Access Points: Lessons Learned [3.5711957833616235]
We explore mmWave (60 GHz) Wi-Fi signals for gesture recognition/pose estimation.
To this end, we extract beam signal-to-noise ratios (SNRs) from the periodic beam training employed by IEEE 802.11ad devices.
A deep neural network (DNN) achieves promising results on the beam SNR task with state-of-the-art 96.7% accuracy in a single environment.
arXiv Detail & Related papers (2023-06-29T16:10:07Z)
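A small stand-in for the gesture classifier, assuming synthetic beam SNR vectors and sklearn's MLPClassifier rather than the paper's DNN, to show the shape of the task: one SNR value per trained beam sector, one label per gesture.

```python
# Gesture classification from beam-training SNR vectors (toy generative model).
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(3)
n_beams, n_gestures, n_samples = 36, 5, 1000    # e.g., one SNR per sector-sweep beam

# Each gesture perturbs a characteristic subset of beams.
prototypes = rng.normal(10, 3, size=(n_gestures, n_beams))
y = rng.integers(0, n_gestures, n_samples)
X = prototypes[y] + rng.normal(0, 1.0, size=(n_samples, n_beams))

Xtr, Xte, ytr, yte = train_test_split(X, y, test_size=0.3, random_state=0)
clf = MLPClassifier(hidden_layer_sizes=(64, 32), max_iter=500).fit(Xtr, ytr)
print(f"gesture accuracy on toy data: {clf.score(Xte, yte):.2f}")
```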
- Faster Region-Based CNN Spectrum Sensing and Signal Identification in Cluttered RF Environments [0.7734726150561088]
We optimize a faster region-based convolutional neural network (FRCNN) for 1-dimensional (1D) signal processing and electromagnetic spectrum sensing.
Results show that our method has better localization performance, and is faster than the 2D equivalent.
arXiv Detail & Related papers (2023-02-20T09:35:13Z)
- Decision Forest Based EMG Signal Classification with Low Volume Dataset Augmented with Random Variance Gaussian Noise [51.76329821186873]
We produce a model that can classify six different hand gestures from a limited number of samples and generalizes well to a wider audience.
We appeal to a set of more elementary methods, such as the use of random bounds on a signal, and aim to show the power these methods can carry in an online setting.
arXiv Detail & Related papers (2022-06-29T23:22:18Z)
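A sketch of the random-variance Gaussian-noise augmentation idea under stated assumptions (noise bounds, window size, and the ExtraTrees forest are placeholders): each training window is replicated with noise of a randomly drawn variance, so a low-volume dataset covers more of the input space.

```python
# Low-volume dataset augmentation with Gaussian noise of random variance.
import numpy as np
from sklearn.ensemble import ExtraTreesClassifier  # one decision-forest variant

rng = np.random.default_rng(4)
X_small = rng.normal(size=(60, 200))   # hypothetical: 60 EMG windows, 200 samples each
y_small = rng.integers(0, 6, 60)       # six hand gestures, as in the paper

def augment(X, y, copies=10, sigma_low=0.01, sigma_high=0.2):
    """Replicate each sample with Gaussian noise of a randomly chosen scale."""
    Xs, ys = [X], [y]
    for _ in range(copies):
        sigma = rng.uniform(sigma_low, sigma_high, size=(len(X), 1))
        Xs.append(X + rng.normal(size=X.shape) * sigma)
        ys.append(y)
    return np.vstack(Xs), np.concatenate(ys)

X_aug, y_aug = augment(X_small, y_small)        # 60 -> 660 training windows
clf = ExtraTreesClassifier(n_estimators=200).fit(X_aug, y_aug)
```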
- Low Latency Real-Time Seizure Detection Using Transfer Deep Learning [0.0]
Scalp electroencephalogram (EEG) signals inherently have a low signal-to-noise ratio.
Most popular approaches to seizure detection using deep learning do not jointly model this information or require multiple passes over the signal.
In this paper, we exploit both simultaneously by converting the multichannel signal to a grayscale image and using transfer learning to achieve high performance.
arXiv Detail & Related papers (2022-02-16T00:03:00Z)
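A sketch of the signal-to-image conversion the summary describes, assuming a simple min-max scaling: EEG channels become image rows and amplitudes become 8-bit gray levels, ready for a pretrained 2-D network via transfer learning. Window size and scaling are assumptions.

```python
# Convert a multichannel EEG window into a grayscale image.
import numpy as np

def window_to_grayscale(window: np.ndarray) -> np.ndarray:
    """window: (n_channels, n_samples) float EEG -> (n_channels, n_samples) uint8."""
    lo, hi = window.min(), window.max()
    scaled = (window - lo) / (hi - lo + 1e-12)   # normalize to [0, 1]
    return (scaled * 255).astype(np.uint8)       # one gray pixel per sample

rng = np.random.default_rng(5)
eeg = rng.normal(size=(22, 2500))                # e.g., 22 channels, 10 s at 250 Hz
img = window_to_grayscale(eeg)
print(img.shape, img.dtype)                      # (22, 2500) uint8 "image"
```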
- Real-Time GPU-Accelerated Machine Learning Based Multiuser Detection for 5G and Beyond [70.81551587109833]
Nonlinear beamforming filters can significantly outperform linear approaches in stationary scenarios with massive connectivity.
One of the main challenges comes from the real-time implementation of these algorithms.
This paper explores the acceleration of APSM-based algorithms through massive parallelization.
arXiv Detail & Related papers (2022-01-13T15:20:45Z)
- Deep Learning Based Hybrid Precoding in Dual-Band Communication Systems [34.03893373401685]
We propose a deep learning-based method that uses spatial and temporal information extracted from the sub-6GHz band to predict/track beams in the millimeter-wave (mmWave) band.
We consider a dual-band communication system operating in both the sub-6GHz and mmWave bands.
arXiv Detail & Related papers (2021-07-16T12:10:32Z)
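A minimal sketch of the dual-band idea, assuming a GRU-based predictor (the paper's exact architecture is not specified here): a sequence of sub-6GHz channel features is mapped to a mmWave beam index. Feature dimension and beam codebook size are placeholders.

```python
# Recurrent beam prediction: sub-6GHz feature sequence -> mmWave beam index.
import torch
import torch.nn as nn

class BeamPredictor(nn.Module):
    def __init__(self, feat_dim=64, hidden=128, n_beams=32):
        super().__init__()
        self.gru = nn.GRU(feat_dim, hidden, batch_first=True)  # temporal information
        self.head = nn.Linear(hidden, n_beams)                 # beam logits

    def forward(self, x):                 # x: (batch, time, feat_dim)
        out, _ = self.gru(x)
        return self.head(out[:, -1])      # predict beam from the last time step

model = BeamPredictor()
x = torch.randn(8, 10, 64)                # 8 users, 10 sub-6GHz snapshots each
logits = model(x)
beam = logits.argmax(dim=-1)              # predicted mmWave beam indices
loss = nn.CrossEntropyLoss()(logits, torch.randint(0, 32, (8,)))
loss.backward()
```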
- Optimal Sequential Detection of Signals with Unknown Appearance and Disappearance Points in Time [64.26593350748401]
The paper addresses a sequential changepoint detection problem, assuming that the duration of change may be finite and unknown.
We focus on a reliable maximin change detection criterion of maximizing the minimal probability of detection in a given time (or space) window.
The finite moving average (FMA) algorithm is applied to detecting faint streaks of satellites in optical images.
arXiv Detail & Related papers (2021-02-02T04:58:57Z)
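A sketch of a finite-moving-average detector consistent with the summary, under illustrative parameters: slide a fixed-length window over the observations and raise an alarm when the window mean crosses a threshold, matching the finite, unknown change-duration setting.

```python
# Finite moving average (FMA) detector for a finite-duration mean shift.
import numpy as np

def fma_detect(x: np.ndarray, window: int, threshold: float) -> int:
    """Return the first index where the length-`window` mean exceeds threshold, or -1."""
    means = np.convolve(x, np.ones(window) / window, mode="valid")
    hits = np.flatnonzero(means > threshold)
    return int(hits[0]) + window - 1 if hits.size else -1

rng = np.random.default_rng(6)
x = rng.normal(0, 1, 1000)
x[400:430] += 1.5                                # a faint streak: brief mean shift
print(fma_detect(x, window=25, threshold=0.8))   # alarm index near sample ~420
```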
- Detection of gravitational-wave signals from binary neutron star mergers using machine learning [52.77024349608834]
We introduce a novel neural-network based machine learning algorithm that uses time series strain data from gravitational-wave detectors.
We find an improvement by a factor of 6 in sensitivity to signals with signal-to-noise ratio below 25.
A conservative estimate indicates that our algorithm introduces on average 10.2 s of latency between signal arrival and generating an alert.
arXiv Detail & Related papers (2020-06-02T10:20:11Z)