Sparse Federated Learning with Hierarchical Personalized Models
- URL: http://arxiv.org/abs/2203.13517v3
- Date: Mon, 25 Sep 2023 06:36:52 GMT
- Title: Sparse Federated Learning with Hierarchical Personalized Models
- Authors: Xiaofeng Liu, Qing Wang, Yunfeng Shao, Yinchuan Li
- Abstract summary: Federated learning (FL) can achieve privacy-safe and reliable collaborative training without collecting users' private data.
We propose a personalized FL algorithm using a hierarchical proximal mapping based on the Moreau envelope, named sparse federated learning with hierarchical personalized models (sFedHP).
A continuously differentiable approximation of the L1-norm is also used as the sparsity constraint to reduce the communication cost.
- Score: 24.763028713043468
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Federated learning (FL) can achieve privacy-safe and reliable collaborative training without collecting users' private data. Its strong privacy protection promotes a wide range of FL applications in the Internet-of-Things (IoT), wireless networks, mobile devices, autonomous vehicles, and cloud-based medical treatment. However, FL methods suffer from poor model performance on non-i.i.d. data and excessive communication traffic. To this end, we propose a personalized FL algorithm using a hierarchical proximal mapping based on the Moreau envelope, named sparse federated learning with hierarchical personalized models (sFedHP), which significantly improves global model performance in the face of diverse data. A continuously differentiable approximation of the L1-norm is also used as the sparsity constraint to reduce the communication cost. Convergence analysis shows that sFedHP achieves a state-of-the-art convergence rate with linear speedup, and that the sparsity constraint reduces the convergence rate only slightly while significantly lowering the communication cost. Experimentally, we demonstrate the benefits of sFedHP compared with FedAvg, HierFAVG (hierarchical FedAvg), and personalized FL methods based on local customization, including FedAMP, FedProx, Per-FedAvg, pFedMe, and pFedGP.
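As a rough illustration of the two ingredients named in the abstract, the Python sketch below pairs a smooth L1 surrogate with a Moreau-envelope-style proximal update. The specific surrogate sqrt(w^2 + eps), the function names, and all hyperparameters are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

def smoothed_l1(w, eps=1e-4):
    """Smooth surrogate for ||w||_1: sum_i sqrt(w_i**2 + eps).

    Continuously differentiable everywhere (unlike |w_i|), approaching
    the true L1 norm as eps -> 0; its gradient is w / sqrt(w**2 + eps).
    """
    return np.sum(np.sqrt(w ** 2 + eps))

def personalized_step(theta, w_global, grad_fn, lam=15.0, lr=0.01, steps=5):
    """Moreau-envelope-style personalization (as in pFedMe-like methods):

        min_theta  f_i(theta) + (lam / 2) * ||theta - w_global||^2

    The proximal term ties the personalized model theta to the global
    model w_global while grad_fn fits the local (possibly non-i.i.d.) data.
    """
    for _ in range(steps):
        theta = theta - lr * (grad_fn(theta) + lam * (theta - w_global))
    return theta
```

A hierarchical variant would, as the title suggests, apply such proximal mappings at more than one level (e.g., client and edge server); the exact hierarchy is defined in the paper, not in this sketch.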
Related papers
- User-Centric Federated Learning: Trading off Wireless Resources for Personalization [18.38078866145659]
In Federated Learning (FL) systems, statistical heterogeneity increases the algorithm's convergence time and reduces the generalization performance.
To tackle the above problems without violating the privacy constraints that FL imposes, personalized FL methods have to couple statistically similar clients without directly accessing their data.
In this work, we design user-centric aggregation rules that are based on readily available gradient information and are capable of producing personalized models for each FL client.
Our algorithm outperforms popular personalized FL baselines in terms of average accuracy, worst node performance, and training communication overhead.
arXiv Detail & Related papers (2023-04-25T15:45:37Z)
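The summary above does not spell out the aggregation rule, so the following is only a generic sketch of gradient-similarity-based personalized aggregation; the softmax weighting, the `temp` parameter, and the function names are hypothetical.

```python
import numpy as np

def user_centric_weights(grads, temp=1.0):
    """Row-stochastic mixing weights from pairwise gradient similarity.

    grads: list of flattened per-client gradient vectors. Row i of the
    returned matrix holds the weights client i uses to aggregate the
    clients' models into its own personalized model.
    """
    G = np.stack(grads)
    G = G / (np.linalg.norm(G, axis=1, keepdims=True) + 1e-12)
    sim = G @ G.T                           # cosine similarity between clients
    W = np.exp(sim / temp)                  # emphasize statistically similar clients
    return W / W.sum(axis=1, keepdims=True)

def personalized_models(client_params, W):
    """Personalized model for client i: sum_j W[i, j] * client_params[j]."""
    return W @ np.stack(client_params)
```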
- FedLAP-DP: Federated Learning by Sharing Differentially Private Loss Approximations [53.268801169075836]
We propose FedLAP-DP, a novel privacy-preserving approach for federated learning.
A formal privacy analysis demonstrates that FedLAP-DP incurs the same privacy costs as typical gradient-sharing schemes.
Our approach presents a faster convergence speed compared to typical gradient-sharing methods.
arXiv Detail & Related papers (2023-02-02T12:56:46Z)
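FedLAP-DP's loss-approximation mechanism is not detailed in the summary; the sketch below shows only the standard Gaussian mechanism (clip, then add calibrated noise) that the gradient-sharing DP schemes it is compared against are built on. Names and defaults are illustrative.

```python
import numpy as np

def gaussian_mechanism(v, clip=1.0, sigma=1.0, rng=None):
    """Generic DP release of a shared vector: clip to bound its L2
    sensitivity, then add Gaussian noise scaled to the clip norm."""
    rng = rng or np.random.default_rng()
    v = v * min(1.0, clip / (np.linalg.norm(v) + 1e-12))  # L2 clipping
    return v + rng.normal(0.0, sigma * clip, size=v.shape)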
- Acceleration of Federated Learning with Alleviated Forgetting in Local Training [61.231021417674235]
Federated learning (FL) enables distributed optimization of machine learning models while protecting privacy.
We propose FedReg, an algorithm to accelerate FL with alleviated knowledge forgetting in the local training stage.
Our experiments demonstrate that FedReg significantly improves the convergence rate of FL, especially when the neural network architecture is deep.
arXiv Detail & Related papers (2022-03-05T02:31:32Z)
- Achieving Personalized Federated Learning with Sparse Local Models [75.76854544460981]
Federated learning (FL) is vulnerable to heterogeneously distributed data.
To counter this issue, personalized FL (PFL) was proposed to produce dedicated local models for each individual user.
Existing PFL solutions either demonstrate unsatisfactory generalization towards different model architectures or cost enormous extra computation and memory.
We propose FedSpa, a novel PFL scheme that employs personalized sparse masks to customize sparse local models on the edge.
arXiv Detail & Related papers (2022-01-27T08:43:11Z)
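A minimal sketch of the sparse-mask idea, assuming a magnitude-based top-k mask; FedSpa's actual mask construction and training procedure are not specified in the summary, so everything below is illustrative.

```python
import numpy as np

def topk_mask(w, keep=0.2):
    """Hypothetical personalized mask: keep the top keep-fraction of
    weights by magnitude, zeroing out the rest."""
    k = max(1, int(keep * w.size))
    thresh = np.partition(np.abs(w).ravel(), -k)[-k]
    return (np.abs(w) >= thresh).astype(w.dtype)

def sparse_local_model(w_global, mask):
    """A client's sparse personalized model: global weights under its own mask."""
    return w_global * mask
```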
- Local Learning Matters: Rethinking Data Heterogeneity in Federated Learning [61.488646649045215]
Federated learning (FL) is a promising strategy for performing privacy-preserving, distributed learning with a network of clients (i.e., edge devices).
arXiv Detail & Related papers (2021-11-28T19:03:39Z)
- Privacy Assessment of Federated Learning using Private Personalized Layers [0.9023847175654603]
Federated Learning (FL) is a collaborative scheme to train a learning model across multiple participants without sharing data.
We quantify the utility and privacy trade-off of an FL scheme using private personalized layers.
arXiv Detail & Related papers (2021-06-15T11:40:16Z)
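As a sketch of the private-personalized-layers setup: each client uploads only the shared parameters and keeps designated layers local. The prefix-based split and the `classifier` default below are assumptions for illustration, not the paper's API.

```python
def split_state(state_dict, private_prefixes=("classifier",)):
    """Partition a model state dict into shared vs. private (kept-local) layers.

    Personalized-layer schemes typically keep the last layers (e.g., the
    classifier head) on-device and communicate only the remaining ones;
    `private_prefixes` is an illustrative way to mark them.
    """
    shared = {k: v for k, v in state_dict.items()
              if not k.startswith(private_prefixes)}
    private = {k: v for k, v in state_dict.items()
               if k.startswith(private_prefixes)}
    return shared, private
```

Only `shared` would be sent to the server each round; `private` never leaves the device, which is what the paper's privacy assessment examines.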
- Wireless Federated Learning with Limited Communication and Differential Privacy [21.328507360172203]
This paper investigates the role of dimensionality reduction in achieving communication efficiency and differential privacy (DP) for the local datasets of remote users in over-the-air computation (AirComp)-based federated learning (FL).
arXiv Detail & Related papers (2021-06-01T15:23:12Z)
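A hedged sketch of the pipeline the summary describes, combining a generic Johnson-Lindenstrauss random projection with Gaussian noise; the paper's actual projection design, AirComp channel model, and noise calibration may differ.

```python
import numpy as np

def compress_and_privatize(g, d_low, sigma=0.1, seed=0):
    """Project a gradient to d_low dimensions, then add Gaussian noise.

    A shared seed lets the server reconstruct the same projection matrix;
    the projection here is a generic JL sketch, not the paper's design.
    """
    rng = np.random.default_rng(seed)
    P = rng.normal(0.0, 1.0 / np.sqrt(d_low), size=(d_low, g.size))
    return P @ g + rng.normal(0.0, sigma, size=d_low)
```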
- Over-the-Air Federated Learning from Heterogeneous Data [107.05618009955094]
Federated learning (FL) is a framework for distributed learning of centralized models.
We develop a Convergent OTA FL (COTAF) algorithm, which enhances the common local stochastic gradient descent (SGD) FL algorithm.
We numerically show that the precoding induced by COTAF notably improves the convergence rate and the accuracy of models trained via OTA FL.
arXiv Detail & Related papers (2020-09-27T08:28:25Z)
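A simplified sketch of time-varying precoding for over-the-air aggregation in the spirit of COTAF: scale updates to meet a transmit-power budget, let the channel sum them with additive noise, and invert the scaling at the server. The precoder below is a simplification, not the paper's exact rule.

```python
import numpy as np

def ota_round(updates, power=1.0, noise_std=0.1, rng=None):
    """One over-the-air aggregation round with power-constrained precoding.

    updates: list of flattened per-client model updates. The scaling
    alpha is chosen so every client's transmitted signal meets the
    per-symbol power budget; the server divides it back out.
    """
    rng = rng or np.random.default_rng()
    U = np.stack(updates)
    alpha = power / max(np.max(np.sum(U ** 2, axis=1)) / U.shape[1], 1e-12)
    y = np.sqrt(alpha) * U.sum(axis=0) + rng.normal(0.0, noise_std, U.shape[1])
    return y / (np.sqrt(alpha) * len(updates))  # noisy estimate of the mean update
```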
- Differentially Private Federated Learning with Laplacian Smoothing [72.85272874099644]
Federated learning aims to protect data privacy by collaboratively learning a model without sharing private data among users.
An adversary may still be able to infer the private training data by attacking the released model.
Differential privacy provides statistical protection against such attacks, at the price of significantly degrading the accuracy or utility of the trained models.
arXiv Detail & Related papers (2020-05-01T04:28:38Z)
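The Laplacian smoothing step that this paper combines with DP noise can be sketched as solving (I - sigma * Delta) g_s = g for the smoothed gradient g_s; a common implementation uses the FFT with a periodic 1-D stencil, as below. The value of sigma and the stencil choice are illustrative.

```python
import numpy as np

def laplacian_smooth(g, sigma=1.0):
    """Smooth a (noisy) gradient vector by solving (I - sigma*Delta) g_s = g.

    Delta is the periodic 1-D Laplacian with stencil [1, -2, 1]; its
    eigenvalues are 2*cos(2*pi*k/d) - 2 <= 0, so the denominator below
    is always >= 1 and the solve is well conditioned.
    """
    d = g.size
    lap = np.zeros(d)
    lap[0], lap[1], lap[-1] = -2.0, 1.0, 1.0
    denom = 1.0 - sigma * np.fft.fft(lap)       # symbol of I - sigma*Delta
    return np.real(np.fft.ifft(np.fft.fft(g) / denom))
```

Smoothing the DP-noised gradient this way damps the injected high-frequency noise, which is how the method recovers utility without weakening the privacy guarantee.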