Federated Learning for Hybrid Beamforming in mm-Wave Massive MIMO
- URL: http://arxiv.org/abs/2005.09969v3
- Date: Sat, 22 Aug 2020 08:09:03 GMT
- Title: Federated Learning for Hybrid Beamforming in mm-Wave Massive MIMO
- Authors: Ahmet M. Elbir and Sinem Coleri
- Abstract summary: We introduce a federated learning (FL) based framework for hybrid beamforming, where the model training is performed at the base station.
We design a convolutional neural network, in which the input is the channel data, yielding the analog beamformers at the output.
FL is demonstrated to be more tolerant to imperfections and corruptions in the channel data and to incur lower transmission overhead than CML.
- Score: 12.487990897680422
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Machine learning for hybrid beamforming has been extensively studied by using
centralized machine learning (CML) techniques, which require the training of a
global model with a large dataset collected from the users. However, the
transmission of the whole dataset from the users to the base station (BS)
is prohibitive due to the limited communication bandwidth, and it raises
privacy concerns. In this work, we introduce a federated learning (FL) based
framework for hybrid beamforming, where the model training is performed at the
BS by collecting only the gradients from the users. We design a convolutional
neural network, in which the input is the channel data, yielding the analog
beamformers at the output. Via numerical simulations, FL is demonstrated to be
more tolerant to imperfections and corruptions in the channel data and to
incur lower transmission overhead than CML.
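A minimal sketch of the training loop the abstract describes: each user computes gradients of a small CNN that maps channel data to analog beamformer parameters on its own local data, and the BS aggregates only those gradients. The `BeamformerCNN` architecture, tensor shapes, and loss below are illustrative assumptions, not the paper's exact design.

```python
import torch
import torch.nn as nn

class BeamformerCNN(nn.Module):
    """Toy CNN: channel matrix in, analog beamformer phases out (assumed design)."""
    def __init__(self, n_ant=64, n_rf=8):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(2, 16, 3, padding=1), nn.ReLU(),    # 2 = real/imag parts
            nn.Flatten(),
            nn.Linear(16 * n_ant * n_rf, n_ant * n_rf),   # RF beamformer phases
        )

    def forward(self, h):
        return self.net(h)

def local_gradients(model, h, target, loss_fn=nn.MSELoss()):
    """One user's contribution: gradients computed on its local channel data only."""
    model.zero_grad()
    loss_fn(model(h), target).backward()
    return [p.grad.clone() for p in model.parameters()]

def fl_round(model, user_batches, lr=1e-3):
    """BS side: average the users' gradients, then take one SGD step."""
    grads = [local_gradients(model, h, t) for h, t in user_batches]
    with torch.no_grad():
        for i, p in enumerate(model.parameters()):
            p -= lr * torch.stack([g[i] for g in grads]).mean(dim=0)

model = BeamformerCNN()
# Synthetic stand-in for each user's (channel, target beamformer) pairs.
users = [(torch.randn(4, 2, 64, 8), torch.randn(4, 64 * 8)) for _ in range(3)]
for _ in range(10):
    fl_round(model, users)
```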
Related papers
- Semi-Federated Learning: Convergence Analysis and Optimization of A Hybrid Learning Framework [70.83511997272457]
We propose a semi-federated learning (SemiFL) paradigm to leverage both the base station (BS) and devices for a hybrid implementation of centralized learning (CL) and FL.
We propose a two-stage algorithm to solve the resulting intractable optimization problem, providing closed-form solutions for the beamformers.
arXiv Detail & Related papers (2023-10-04T03:32:39Z)
- In Situ Framework for Coupling Simulation and Machine Learning with Application to CFD [51.04126395480625]
Recent years have seen many successful applications of machine learning (ML) to facilitate fluid dynamic computations.
As simulations grow, generating new training datasets for traditional offline learning creates I/O and storage bottlenecks.
This work offers a solution by simplifying this coupling and enabling in situ training and inference on heterogeneous clusters.
arXiv Detail & Related papers (2023-06-22T14:07:54Z)
- Joint Superposition Coding and Training for Federated Learning over Multi-Width Neural Networks [52.93232352968347]
This paper aims to integrate two synergetic technologies: federated learning (FL) and width-adjustable slimmable neural networks (SNNs).
FL preserves data privacy by exchanging the locally trained models of mobile devices. Training SNNs is, however, non-trivial, particularly under wireless connections with time-varying channel conditions.
We propose a communication and energy-efficient SNN-based FL (named SlimFL) that jointly utilizes superposition coding (SC) for global model aggregation and superposition training (ST) for updating local models.
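A toy sketch of the superposition-coding-plus-successive-decoding idea, with sign-quantized updates standing in for the two submodel widths so that interference cancellation is well defined; the power split, noise level, and sign quantization are all assumptions, not SlimFL's actual encoding.

```python
import numpy as np

rng = np.random.default_rng(3)
d = 10_000
s_half = np.sign(rng.normal(size=d))   # half-width model update (quantized)
s_full = np.sign(rng.normal(size=d))   # full-width model update (quantized)

a, b = np.sqrt(0.8), np.sqrt(0.2)      # power allocation (assumed)
rx = a * s_half + b * s_full + 0.05 * rng.normal(size=d)  # one superposed Tx

s_half_hat = np.sign(rx)                    # decode the high-power layer first
s_full_hat = np.sign(rx - a * s_half_hat)   # cancel it, decode the weak layer

print((s_half_hat == s_half).mean(), (s_full_hat == s_full).mean())  # ~1.0 1.0
```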
arXiv Detail & Related papers (2021-12-05T11:17:17Z)
- Federated Learning for Physical Layer Design [38.46522285374866]
Federated learning (FL) has been proposed recently as a distributed learning scheme.
FL is more communication-efficient and privacy-preserving than centralized learning (CL).
This article discusses the recent advances in FL-based training for physical layer design problems.
arXiv Detail & Related papers (2021-02-23T16:22:53Z)
- Federated Dropout Learning for Hybrid Beamforming With Spatial Path Index Modulation In Multi-User mmWave-MIMO Systems [19.10321102094638]
We introduce model-based and model-free frameworks for beamformer design in SPIM-MIMO systems.
The proposed framework exhibits higher spectral efficiency than state-of-the-art SPIM-MIMO and mmWave-MIMO methods.
arXiv Detail & Related papers (2021-02-15T10:49:26Z)
- Bayesian Federated Learning over Wireless Networks [87.37301441859925]
Federated learning is a privacy-preserving and distributed training method using heterogeneous data sets stored at local devices.
This paper presents an efficient modified Bayesian federated learning (BFL) algorithm called scalableBFL (SBFL).
arXiv Detail & Related papers (2020-12-31T07:32:44Z)
- Federated Learning for Channel Estimation in Conventional and RIS-Assisted Massive MIMO [12.487990897680422]
Channel estimation via machine learning requires model training on a dataset, which usually includes the received pilot signals as input and channel data as output.
In previous works, model training is mostly done via centralized learning (CL), where the whole training dataset is collected from the users at the base station (BS).
We propose a federated learning (FL) framework for channel estimation. We design a convolutional neural network (CNN) trained on the local datasets of the users without sending them to the BS.
We evaluate the performance for noisy and quantized model transmission and show that the proposed approach provides approximately 16 times lower overhead than CL.
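A back-of-the-envelope sketch of where such an overhead ratio can come from: CL uplinks the raw training samples once, while FL uplinks model updates each round. All sizes and the round count below are illustrative assumptions, not the paper's figures.

```python
n_users           = 8            # participating users (assumed)
samples_per_user  = 12_000       # local training samples (assumed)
floats_per_sample = 2 * 64 * 64  # real/imag entries of a 64x64 channel (assumed)
model_params      = 300_000      # CNN parameter count (assumed)
rounds            = 20           # FL communication rounds (assumed)

cl_overhead = n_users * samples_per_user * floats_per_sample  # ship raw data
fl_overhead = n_users * rounds * model_params                 # ship updates
print(f"CL/FL uplink ratio: {cl_overhead / fl_overhead:.1f}x")  # ~16.4x here
```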
arXiv Detail & Related papers (2020-08-25T06:51:18Z)
- Distillation-Based Semi-Supervised Federated Learning for Communication-Efficient Collaborative Training with Non-IID Private Data [8.935169114460663]
This study develops a federated learning (FL) framework that overcomes the communication costs of typical frameworks, which grow with model size.
We propose a distillation-based semi-supervised FL algorithm (DS-FL) that exchanges the outputs of local models among mobile devices.
In DS-FL, the communication cost depends only on the output dimensions of the models and does not scale up according to the model size.
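A toy sketch of that idea: devices exchange logits computed on a shared unlabeled dataset rather than model weights, so the per-round uplink scales with the output dimension. The shapes and the plain logit averaging below are assumptions; the paper's aggregation differs in detail.

```python
import numpy as np

n_devices, n_open, n_classes = 5, 1000, 10  # all sizes assumed

# Each device's logits on a shared open (unlabeled) dataset.
local_logits = [np.random.randn(n_open, n_classes) for _ in range(n_devices)]

# Aggregate logits into global soft targets (here: averaging + softmax).
avg = np.mean(local_logits, axis=0)
soft_targets = np.exp(avg) / np.exp(avg).sum(axis=1, keepdims=True)

# Each device then distills soft_targets into its local model (loop omitted).
# Per-round uplink: n_open * n_classes floats, independent of model size.
print(soft_targets.shape)  # (1000, 10)
```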
arXiv Detail & Related papers (2020-08-14T03:47:27Z)
- UVeQFed: Universal Vector Quantization for Federated Learning [179.06583469293386]
Federated learning (FL) is an emerging approach to train learning models without requiring the users to share their possibly private labeled data.
In FL, each user trains its copy of the learning model locally. The server then collects the individual updates and aggregates them into a global model.
We show that combining universal vector quantization methods with FL yields a decentralized training system in which the compression of the trained models induces only a minimum distortion.
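A scalar toy version of the quantized-update pipeline, using subtractive dithered uniform quantization as a stand-in for the paper's universal vector quantizer; the step size and the shared-dither (shared randomness) assumption are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)
step = 0.05  # quantizer resolution (assumed)

def dithered_quantize(update):
    """Quantize with a dither known to both sides via shared randomness."""
    dither = rng.uniform(-step / 2, step / 2, size=update.shape)
    q = step * np.round((update + dither) / step)  # what is actually uplinked
    return q - dither                              # server subtracts the dither

updates = [rng.normal(size=1000) for _ in range(10)]
aggregated = np.mean([dithered_quantize(u) for u in updates], axis=0)
distortion = np.mean((aggregated - np.mean(updates, axis=0)) ** 2)
print(f"aggregation distortion: {distortion:.2e}")  # shrinks as users grow
```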
arXiv Detail & Related papers (2020-06-05T07:10:22Z)
- A Compressive Sensing Approach for Federated Learning over Massive MIMO Communication Systems [82.2513703281725]
Federated learning is a privacy-preserving approach to train a global model at a central server by collaborating with wireless devices.
We present a compressive sensing approach for federated learning over massive multiple-input multiple-output communication systems.
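A toy instance of the compressive-sensing idea: a device uplinks a short random projection of a sparse model update, and the server recovers it greedily. The dimensions, sparsity level, and use of orthogonal matching pursuit (OMP) instead of the paper's estimator are all assumptions.

```python
import numpy as np

rng = np.random.default_rng(2)
d, m, k = 400, 120, 10                 # update dim, measurements, sparsity

x = np.zeros(d)
x[rng.choice(d, k, replace=False)] = rng.normal(size=k)  # sparse update
A = rng.normal(size=(m, d)) / np.sqrt(m)                 # random projection
y = A @ x                                                # what is uplinked

def omp(A, y, k):
    """Greedy OMP recovery of a k-sparse vector from y = A @ x."""
    support, r = [], y.copy()
    for _ in range(k):
        support.append(int(np.argmax(np.abs(A.T @ r))))  # pick best atom
        coef, *_ = np.linalg.lstsq(A[:, support], y, rcond=None)
        r = y - A[:, support] @ coef                     # update residual
    x_hat = np.zeros(A.shape[1])
    x_hat[support] = coef
    return x_hat

x_hat = omp(A, y, k)
print("relative error:", np.linalg.norm(x_hat - x) / np.linalg.norm(x))
```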
arXiv Detail & Related papers (2020-03-18T05:56:27Z)
This list is automatically generated from the titles and abstracts of the papers on this site.