Clustered Federated Learning based on Nonconvex Pairwise Fusion
- URL: http://arxiv.org/abs/2211.04218v3
- Date: Sun, 24 Dec 2023 09:29:08 GMT
- Title: Clustered Federated Learning based on Nonconvex Pairwise Fusion
- Authors: Xue Yu, Ziyi Liu, Wu Wang and Yifan Sun
- Abstract summary: We introduce a novel clustered FL method called Fusion Penalized Federated Clustering (FPFC).
FPFC can perform partial updates at each communication round and allows parallel computation with variable workload.
We also propose a new warmup strategy for hyperparameter tuning and establish convergence guarantees for FPFC with general losses.
- Score: 22.82565500426576
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: This study investigates clustered federated learning (FL), one of the
formulations of FL with non-i.i.d. data, where the devices are partitioned into
clusters and each cluster optimally fits its data with a localized model. We
propose a clustered FL framework that incorporates a nonconvex penalty to
pairwise differences of parameters. Without a priori knowledge of the set of
devices in each cluster and the number of clusters, this framework can
autonomously estimate cluster structures. To implement the proposed framework,
we introduce a novel clustered FL method called Fusion Penalized Federated
Clustering (FPFC). Building upon the standard alternating direction method of
multipliers (ADMM), FPFC can perform partial updates at each communication
round and allows parallel computation with variable workload. These strategies
significantly reduce the communication cost while ensuring privacy, making it
practical for FL. We also propose a new warmup strategy for hyperparameter
tuning in FL settings and explore the asynchronous variant of FPFC (asyncFPFC).
Theoretical analysis provides convergence guarantees for FPFC with general
losses and establishes the statistical convergence rate under a linear model
with squared loss. Extensive experiments have demonstrated the superiority of
FPFC compared to current methods, including robustness and generalization
capability.
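To make the formulation concrete, the sketch below writes out the pairwise-fusion objective the abstract describes: each device i fits a local parameter vector w_i, and a nonconvex penalty on every pairwise difference ||w_i - w_j|| pulls devices in the same cluster toward a shared model. This is a minimal sketch only; the minimax concave penalty (MCP), the function names, and the hyperparameter values are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

def mcp(t, lam=0.1, gamma=3.0):
    """Minimax concave penalty (MCP), one common nonconvex fusion
    penalty (an assumed choice; the paper may use a different one).
    p(t) = lam*t - t^2/(2*gamma) for t <= gamma*lam, else gamma*lam^2/2."""
    return np.where(t <= gamma * lam,
                    lam * t - t ** 2 / (2.0 * gamma),
                    gamma * lam ** 2 / 2.0)

def fusion_objective(W, local_losses, lam=0.1, gamma=3.0):
    """Clustered-FL objective with nonconvex pairwise fusion:
        sum_i f_i(w_i) + sum_{i<j} p(||w_i - w_j||).
    W            : (m, d) array, one parameter row per device.
    local_losses : list of m callables, f_i(w_i) -> float."""
    fit = sum(f(W[i]) for i, f in enumerate(local_losses))
    pairs = [(i, j) for i in range(len(W)) for j in range(i + 1, len(W))]
    fusion = sum(mcp(np.linalg.norm(W[i] - W[j]), lam, gamma)
                 for i, j in pairs)
    return fit + float(fusion)
```

Because the penalty flattens out beyond gamma*lam, distant device pairs incur only a constant cost and are not pulled together, while nearby pairs are fused; cluster structure is then read off as groups of devices whose parameters coincide at the optimum. FPFC itself minimizes an objective of this form via an ADMM-style splitting so that devices exchange parameters rather than raw data.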
Related papers
- Interaction-Aware Gaussian Weighting for Clustered Federated Learning [58.92159838586751]
Federated Learning (FL) emerged as a decentralized paradigm to train models while preserving privacy.
We propose a novel clustered FL method, FedGWC (Federated Gaussian Weighting Clustering), which groups clients based on their data distribution.
Our experiments on benchmark datasets show that FedGWC outperforms existing FL algorithms in cluster quality and classification accuracy.
arXiv Detail & Related papers (2025-02-05T16:33:36Z)
- Over-the-Air Fair Federated Learning via Multi-Objective Optimization [52.295563400314094]
We propose an over-the-air fair federated learning algorithm (OTA-FFL) to train fair FL models.
Experiments demonstrate the superiority of OTA-FFL in achieving fairness and robust performance.
arXiv Detail & Related papers (2025-01-06T21:16:51Z)
- LCFed: An Efficient Clustered Federated Learning Framework for Heterogeneous Data [21.341280782748278]
Clustered federated learning (CFL) addresses the performance challenges posed by data heterogeneity in federated learning (FL).
Existing CFL approaches strictly limit knowledge sharing to within clusters, lacking the integration of global knowledge with intra-cluster training.
We propose LCFed, an efficient CFL framework to combat these challenges.
arXiv Detail & Related papers (2025-01-03T14:59:48Z)
- Rethinking Clustered Federated Learning in NOMA Enhanced Wireless Networks [60.09912912343705]
This study explores the benefits of integrating the novel clustered federated learning (CFL) approach with non-independent and identically distributed (non-IID) datasets.
A detailed theoretical analysis of the generalization gap, which measures the degree of non-IIDness in the data distribution, is presented.
Solutions to the challenges posed by non-IID conditions are proposed based on an analysis of these properties.
arXiv Detail & Related papers (2024-03-05T17:49:09Z)
- Hierarchical Federated Learning in Multi-hop Cluster-Based VANETs [12.023861154677205]
This paper introduces a novel framework for hierarchical federated learning (HFL) over multi-hop clustering-based VANET.
The proposed method utilizes a weighted combination of the average relative speed and cosine similarity of FL model parameters as a clustering metric.
Through extensive simulations, the proposed hierarchical federated learning over clustered VANET has been demonstrated to improve accuracy and convergence time significantly.
arXiv Detail & Related papers (2024-01-18T20:05:34Z)
- A Joint Gradient and Loss Based Clustered Federated Learning Design [26.54703150478879]
A novel clustered FL framework that enables distributed edge devices with non-IID data to independently form several clusters is proposed.
By delegating clustering decisions to edge devices, each device can fully leverage its private data to determine its own cluster identity (a minimal sketch of this selection rule appears after this list).
Simulation results demonstrate that our proposed clustered FL algorithm can reduce clustering iterations by up to 99% compared to the existing baseline.
arXiv Detail & Related papers (2023-11-22T19:39:37Z)
- Find Your Optimal Assignments On-the-fly: A Holistic Framework for Clustered Federated Learning [5.045379017315639]
Federated Learning (FL) is an emerging distributed machine learning approach that preserves client privacy by storing data on edge devices.
Recent studies have proposed clustering as a solution to tackle client heterogeneity in FL by grouping clients with distribution shifts into different clusters.
This paper presents a comprehensive investigation into current clustered FL methods and proposes a four-tier framework to encompass and extend existing approaches.
arXiv Detail & Related papers (2023-10-09T04:23:11Z)
- Disentangled Federated Learning for Tackling Attributes Skew via Invariant Aggregation and Diversity Transferring [104.19414150171472]
Attribute skew steers current federated learning (FL) frameworks away from consistent optimization directions among the clients.
We propose disentangled federated learning (DFL) to disentangle the domain-specific and cross-invariant attributes into two complementary branches.
Experiments verify that DFL facilitates FL with higher performance, better interpretability, and faster convergence rate, compared with SOTA FL methods.
arXiv Detail & Related papers (2022-06-14T13:12:12Z)
- Heterogeneous Federated Learning via Grouped Sequential-to-Parallel Training [60.892342868936865]
Federated learning (FL) is a rapidly growing privacy-preserving collaborative machine learning paradigm.
We propose a data heterogeneous-robust FL approach, FedGSP, to address this challenge.
We show that FedGSP improves the accuracy by 3.7% on average compared with seven state-of-the-art approaches.
arXiv Detail & Related papers (2022-01-31T03:15:28Z)
- Device Scheduling and Update Aggregation Policies for Asynchronous Federated Learning [72.78668894576515]
Federated Learning (FL) is a newly emerged decentralized machine learning (ML) framework.
We propose an asynchronous FL framework with periodic aggregation to eliminate the straggler issue in FL systems.
arXiv Detail & Related papers (2021-07-23T18:57:08Z)
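As referenced in the joint gradient- and loss-based clustering entry above, the sketch below illustrates device-side cluster selection, assuming the common rule of joining the cluster whose broadcast model yields the lowest local loss. The function names, the loss-based criterion, and the linear-model loss are illustrative assumptions, not that paper's exact design.

```python
import numpy as np

def local_loss(w, X, y):
    """Illustrative local objective: mean squared error of a linear
    model on the device's private data (an assumed choice)."""
    return float(np.mean((X @ w - y) ** 2))

def pick_cluster(cluster_models, X, y):
    """Device-side cluster identity: evaluate every broadcast cluster
    model on private data and join the one with the lowest loss.
    cluster_models : (k, d) array, one model row per cluster."""
    losses = [local_loss(w, X, y) for w in cluster_models]
    return int(np.argmin(losses))

# Usage: with k candidate models broadcast by the server, each device
# computes its identity locally and reports only the chosen index,
# so private data never leaves the device.
rng = np.random.default_rng(0)
X, y = rng.normal(size=(32, 4)), rng.normal(size=32)
models = rng.normal(size=(3, 4))
print(pick_cluster(models, X, y))  # prints 0, 1, or 2
```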