A Joint Gradient and Loss Based Clustered Federated Learning Design
- URL: http://arxiv.org/abs/2311.13665v1
- Date: Wed, 22 Nov 2023 19:39:37 GMT
- Title: A Joint Gradient and Loss Based Clustered Federated Learning Design
- Authors: Licheng Lin, Mingzhe Chen, Zhaohui Yang, Yusen Wu, Yuchen Liu
- Abstract summary: A novel clustered FL framework that enables distributed edge devices with non-IID data to independently form several clusters is proposed.
By delegating clustering decisions to edge devices, each device can fully leverage its private data information to determine its own cluster identity.
Simulation results demonstrate that our proposed clustered FL algorithm can reduce clustering iterations by up to 99% compared to the existing baseline.
- Score: 26.54703150478879
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: In this paper, a novel clustered FL framework is proposed that
enables distributed edge devices with non-IID data to independently form
several clusters and implement FL training within each cluster.
In particular, our designed clustered FL algorithm must overcome two challenges
associated with FL training. First, the server has limited FL training
information (i.e., the parameter server can only obtain the FL model
information of each device) and limited computational power for finding the
differences among a large number of devices. Second, each device does not have
the data information of other devices for device clustering and can only use
global FL model parameters received from the server and its data information to
determine its cluster identity, which will increase the difficulty of device
clustering. To overcome these two challenges, we propose a joint gradient and
loss based distributed clustering method in which each device determines its
cluster identity considering the gradient similarity and training loss. The
proposed clustering method considers not only how the local FL model of each
device contributes to each cluster but also the direction of gradient descent,
thus improving clustering speed. By delegating clustering decisions to edge
devices, each device can fully leverage its private data information to
determine its own cluster identity, thereby reducing clustering overhead and
improving overall clustering performance. Simulation results demonstrate that
our proposed clustered FL algorithm can reduce clustering iterations by up to
99% compared to the existing baseline.
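The device-side selection rule described in the abstract can be sketched as follows. This is a minimal illustrative sketch, not the paper's exact formulation: the linear model, the MSE loss, the weighting factor `alpha`, and the use of the difference between consecutive cluster models as a proxy for a cluster's gradient-descent direction are all assumptions made for the example.

```python
import numpy as np

def local_loss(w, X, y):
    # Mean squared error of a linear model on the device's private data
    return float(np.mean((X @ w - y) ** 2))

def local_gradient(w, X, y):
    # Gradient of the MSE loss with respect to the model parameters w
    return (2.0 / len(y)) * X.T @ (X @ w - y)

def cosine(a, b):
    # Cosine similarity; treated as 0 if either vector is (numerically) zero
    na, nb = np.linalg.norm(a), np.linalg.norm(b)
    return float(a @ b / (na * nb)) if na > 1e-12 and nb > 1e-12 else 0.0

def choose_cluster(models, prev_models, X, y, alpha=0.5):
    """Pick the cluster whose model both fits the device's local data
    (low training loss) and whose recent update direction (w - w_prev)
    agrees with the descent direction (-gradient) implied by the device's
    own data. Returns the index of the chosen cluster."""
    scores = []
    for w, w_prev in zip(models, prev_models):
        loss = local_loss(w, X, y)
        descent = -local_gradient(w, X, y)  # where this device wants to move
        drift = w - w_prev                  # where the cluster just moved
        scores.append(loss - alpha * cosine(descent, drift))
    return int(np.argmin(scores))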
Related papers
- Geographical Node Clustering and Grouping to Guarantee Data IIDness in Federated Learning [2.903020332386652]
A major challenge of Federated learning (FL) is the non-IID dataset problem.
This paper proposes a novel approach to ensure data IIDness by properly clustering and grouping mobile IoT nodes.
Our mechanism significantly outperforms benchmark grouping algorithms at least by 110 times in terms of the joint cost between the number of dropout devices and the evenness in per-group device count.
arXiv Detail & Related papers (2024-10-21T07:03:15Z)
- Rethinking Clustered Federated Learning in NOMA Enhanced Wireless Networks [60.09912912343705]
This study explores the benefits of integrating the novel clustered federated learning (CFL) approach with non-independent and identically distributed (non-IID) datasets.
A detailed theoretical analysis of the generalization gap that measures the degree of non-IID in the data distribution is presented.
Based on this analysis, solutions to the challenges posed by non-IID conditions are proposed.
arXiv Detail & Related papers (2024-03-05T17:49:09Z)
- Communication Efficient ConFederated Learning: An Event-Triggered SAGA Approach [67.27031215756121]
Federated learning (FL) is a machine learning paradigm that targets model training without gathering the local data over various data sources.
Standard FL, which employs a single server, can only support a limited number of users, leading to degraded learning capability.
In this work, we consider a multi-server FL framework, referred to as Confederated Learning (CFL), in order to accommodate a larger number of users.
arXiv Detail & Related papers (2024-02-28T03:27:10Z)
- Hierarchical Federated Learning in Multi-hop Cluster-Based VANETs [12.023861154677205]
This paper introduces a novel framework for hierarchical federated learning (HFL) over multi-hop clustering-based VANET.
The proposed method utilizes a weighted combination of the average relative speed and cosine similarity of FL model parameters as a clustering metric.
Through extensive simulations, the proposed hierarchical federated learning over clustered VANET has been demonstrated to improve accuracy and convergence time significantly.
arXiv Detail & Related papers (2024-01-18T20:05:34Z)
- Find Your Optimal Assignments On-the-fly: A Holistic Framework for Clustered Federated Learning [5.045379017315639]
Federated Learning (FL) is an emerging distributed machine learning approach that preserves client privacy by storing data on edge devices.
Recent studies have proposed clustering as a solution to tackle client heterogeneity in FL by grouping clients with distribution shifts into different clusters.
This paper presents a comprehensive investigation into current clustered FL methods and proposes a four-tier framework to encompass and extend existing approaches.
arXiv Detail & Related papers (2023-10-09T04:23:11Z)
- Adaptive Model Pruning and Personalization for Federated Learning over Wireless Networks [72.59891661768177]
Federated learning (FL) enables distributed learning across edge devices while protecting data privacy.
We consider an FL framework with partial model pruning and personalization to overcome these challenges.
This framework splits the learning model into a global part with model pruning shared with all devices to learn data representations and a personalized part to be fine-tuned for a specific device.
arXiv Detail & Related papers (2023-09-04T21:10:45Z)
- Clustered Federated Learning based on Nonconvex Pairwise Fusion [22.82565500426576]
We introduce a novel clustered FL setting called Fusion Clustering (FPFC).
FPFC can perform partial updates at each communication round and allows parallel computation with variable workload.
We also propose a new practical strategy for FPFC with general losses and robustness.
arXiv Detail & Related papers (2022-11-08T13:04:56Z)
- An Improved Algorithm for Clustered Federated Learning [29.166363192740768]
This paper addresses the dichotomy between heterogeneous models and simultaneous training in Federated Learning (FL).
We define a new clustering model for FL based on the (optimal) local models of the users.
SR-FCA uses a robust learning algorithm within each cluster to exploit simultaneous training and to correct clustering errors.
arXiv Detail & Related papers (2022-10-20T19:14:36Z)
- FedHiSyn: A Hierarchical Synchronous Federated Learning Framework for Resource and Data Heterogeneity [56.82825745165945]
Federated Learning (FL) enables training a global model without sharing the decentralized raw data stored on multiple devices to protect data privacy.
We propose a hierarchical synchronous FL framework, i.e., FedHiSyn, to tackle the problems of straggler effects and outdated models.
We evaluate the proposed framework based on MNIST, EMNIST, CIFAR10 and CIFAR100 datasets and diverse heterogeneous settings of devices.
arXiv Detail & Related papers (2022-06-21T17:23:06Z)
- Personalized Federated Learning via Convex Clustering [72.15857783681658]
We propose a family of algorithms for personalized federated learning with locally convex user costs.
The proposed framework is based on a generalization of convex clustering in which the differences between different users' models are penalized.
arXiv Detail & Related papers (2022-02-01T19:25:31Z)
- Device Scheduling and Update Aggregation Policies for Asynchronous Federated Learning [72.78668894576515]
Federated Learning (FL) is a newly emerged decentralized machine learning (ML) framework.
We propose an asynchronous FL framework with periodic aggregation to eliminate the straggler issue in FL systems.
arXiv Detail & Related papers (2021-07-23T18:57:08Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences arising from its use.