Rethinking Clustered Federated Learning in NOMA Enhanced Wireless
Networks
- URL: http://arxiv.org/abs/2403.03157v1
- Date: Tue, 5 Mar 2024 17:49:09 GMT
- Title: Rethinking Clustered Federated Learning in NOMA Enhanced Wireless
Networks
- Authors: Yushen Lin, Kaidi Wang and Zhiguo Ding
- Abstract summary: This study explores the benefits of integrating the novel clustered federated learning (CFL) approach with non-orthogonal multiple access (NOMA) under non-independent and identically distributed (non-IID) datasets.
A detailed theoretical analysis of the generalization gap, which measures the degree of non-IIDness of the data distribution, is presented.
Solutions to the challenges posed by non-IID conditions are then proposed, informed by an analysis of these properties.
- Score: 60.09912912343705
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: This study explores the benefits of integrating the novel clustered federated
learning (CFL) approach with non-orthogonal multiple access (NOMA) under
non-independent and identically distributed (non-IID) datasets, where multiple
devices participate in the aggregation with time limitations and a finite
number of sub-channels. A detailed theoretical analysis of the generalization
gap, which measures the degree of non-IIDness of the data distribution, is presented.
Following that, solutions to the challenges posed by non-IID conditions
are proposed, informed by an analysis of these properties. Specifically, users' data
distributions are parameterized as concentration parameters and grouped using
spectral clustering, with the Dirichlet distribution serving as the prior. The
investigation into the generalization gap and convergence rate guides the
design of the sub-channel assignment through a matching-based algorithm, while the
power allocation is obtained in closed form from the Karush-Kuhn-Tucker (KKT)
conditions. Extensive simulation results show that the
proposed cluster-based FL framework can outperform FL baselines in terms of
both test accuracy and convergence rate. Moreover, jointly optimizing
sub-channel assignment and power allocation in NOMA-enhanced networks leads to a
significant further improvement.
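To make the pipeline described in the abstract concrete, the two minimal sketches below illustrate the main steps under stated assumptions; they are not the authors' implementation. In the first sketch, empirical label proportions stand in for the Dirichlet concentration parameters, an RBF affinity and the Hungarian solver (`linear_sum_assignment`) stand in for the paper's similarity measure and matching-based algorithm, and all problem sizes are arbitrary.
```python
# Hypothetical sketch (not the authors' code): (i) group users by the similarity
# of their label distributions, with empirical label proportions as a stand-in
# for the Dirichlet concentration parameters the paper estimates, and
# (ii) assign clusters to sub-channels by a one-to-one matching, with the
# Hungarian solver replacing the paper's specific matching algorithm.
import numpy as np
from sklearn.cluster import SpectralClustering
from scipy.optimize import linear_sum_assignment

rng = np.random.default_rng(0)
num_users, num_classes, num_clusters, num_subchannels = 20, 10, 2, 6

# Simulated non-IID partitions: each user's label proportions are drawn from
# one of two Dirichlet priors (a common way to synthesize non-IID splits).
priors = [np.full(num_classes, 0.1), np.full(num_classes, 10.0)]
proportions = np.array([rng.dirichlet(priors[u % 2]) for u in range(num_users)])

# RBF affinity between users' distribution parameters, then spectral clustering.
dists = np.linalg.norm(proportions[:, None, :] - proportions[None, :, :], axis=2)
affinity = np.exp(-(dists ** 2) / (2 * dists.std() ** 2))
clusters = SpectralClustering(n_clusters=num_clusters, affinity="precomputed",
                              random_state=0).fit_predict(affinity)

# Matching-style sub-channel assignment on an assumed utility (e.g., rate).
utility = rng.uniform(size=(num_clusters, num_subchannels))
rows, cols = linear_sum_assignment(-utility)          # negate to maximize
print("user clusters:", clusters)
print("cluster -> sub-channel:", dict(zip(rows.tolist(), cols.tolist())))
```
The second sketch shows how KKT conditions yield a closed-form power allocation in the classical water-filling setting; the paper's NOMA problem (SIC decoding order, latency constraints) differs, so this only illustrates the mechanism by which a closed-form solution arises.
```python
# Generic KKT/water-filling sketch: maximizing sum_i log2(1 + p_i * g_i / n0)
# under a total power budget gives p_i = max(0, mu - n0 / g_i), with the water
# level mu found by bisection so that the budget is met with equality.
import numpy as np

def water_filling(gains, p_total, n0=1.0, iters=60):
    lo, hi = 0.0, p_total + n0 / gains.min()          # bracket the water level
    for _ in range(iters):
        mu = 0.5 * (lo + hi)
        p = np.maximum(0.0, mu - n0 / gains)
        lo, hi = (mu, hi) if p.sum() < p_total else (lo, mu)
    return np.maximum(0.0, 0.5 * (lo + hi) - n0 / gains)

gains = np.array([2.0, 1.0, 0.25, 0.05])              # example channel gains
print(water_filling(gains, p_total=4.0))
```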
Related papers
- Boosting the Performance of Decentralized Federated Learning via Catalyst Acceleration [66.43954501171292]
We introduce Catalyst Acceleration and propose an accelerated decentralized federated learning algorithm called DFedCata.
DFedCata consists of two main components: the Moreau envelope function, which addresses parameter inconsistencies, and Nesterov's extrapolation step, which accelerates the aggregation phase.
Empirically, we demonstrate the advantages of the proposed algorithm in both convergence speed and generalization performance on CIFAR-10/100 with various non-IID data distributions.
arXiv Detail & Related papers (2024-10-09T06:17:16Z) - Anomaly Detection in Time Series of EDFA Pump Currents to Monitor Degeneration Processes using Fuzzy Clustering [0.0]
This article proposes a novel fuzzy-clustering-based anomaly detection method for pump-current time series of EDFA systems.
The proposed change detection framework (CDF) strategically combines the advantages of entropy analysis (EA) and principal component analysis (PCA) with fuzzy clustering procedures.
arXiv Detail & Related papers (2024-08-12T14:23:42Z) - Dual-Segment Clustering Strategy for Hierarchical Federated Learning in Heterogeneous Wireless Environments [22.35256018841889]
Non-independent and identically distributed (non-IID) data adversely affects federated learning (FL).
This paper proposes a novel dual-segment clustering (DSC) strategy that jointly addresses communication and data heterogeneity in FL.
The convergence analysis and experimental results show that the DSC strategy can improve the convergence rate of wireless FL.
arXiv Detail & Related papers (2024-05-15T11:46:47Z) - Distributed Markov Chain Monte Carlo Sampling based on the Alternating
Direction Method of Multipliers [143.6249073384419]
In this paper, we propose a distributed sampling scheme based on the alternating direction method of multipliers.
We provide both theoretical guarantees of our algorithm's convergence and experimental evidence of its superiority to the state-of-the-art.
In simulation, we deploy our algorithm on linear and logistic regression tasks and illustrate its fast convergence compared to existing gradient-based methods.
arXiv Detail & Related papers (2024-01-29T02:08:40Z) - ClusterDDPM: An EM clustering framework with Denoising Diffusion
Probabilistic Models [9.91610928326645]
Denoising diffusion probabilistic models (DDPMs) represent a new and promising class of generative models.
In this study, we introduce an innovative expectation-maximization (EM) framework for clustering using DDPMs.
In the M-step, our focus lies in learning clustering-friendly latent representations for the data by employing the conditional DDPM and matching the distribution of latent representations to the mixture of Gaussian priors.
arXiv Detail & Related papers (2023-12-13T10:04:06Z) - Validation Diagnostics for SBI algorithms based on Normalizing Flows [55.41644538483948]
This work proposes easy-to-interpret validation diagnostics for multi-dimensional conditional (posterior) density estimators based on normalizing flows (NF).
It also offers theoretical guarantees based on results of local consistency.
This work should help in designing better-specified models or drive the development of novel SBI algorithms.
arXiv Detail & Related papers (2022-11-17T15:48:06Z) - Disentangled Federated Learning for Tackling Attributes Skew via
Invariant Aggregation and Diversity Transferring [104.19414150171472]
Attribute skew prevents current federated learning (FL) frameworks from maintaining consistent optimization directions among the clients.
We propose disentangled federated learning (DFL) to disentangle the domain-specific and cross-invariant attributes into two complementary branches.
Experiments verify that DFL facilitates FL with higher performance, better interpretability, and faster convergence rate, compared with SOTA FL methods.
arXiv Detail & Related papers (2022-06-14T13:12:12Z) - Decentralized Local Stochastic Extra-Gradient for Variational
Inequalities [125.62877849447729]
We consider distributed variational inequalities (VIs) on domains with the problem data that is heterogeneous (non-IID) and distributed across many devices.
We make a very general assumption on the computational network that covers fully decentralized computation settings.
We theoretically analyze its convergence rate in the strongly-monotone, monotone, and non-monotone settings.
arXiv Detail & Related papers (2021-06-15T17:45:51Z) - Real Elliptically Skewed Distributions and Their Application to Robust
Cluster Analysis [5.137336092866906]
This article proposes a new class of Real Elliptically Skewed (RESK) distributions and associated clustering algorithms.
Non-symmetrically distributed and heavy-tailed data clusters have been reported in a variety of real-world applications.
arXiv Detail & Related papers (2020-06-30T10:44:39Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed content (including all information) and is not responsible for any consequences of its use.