Federated Learning via Variational Bayesian Inference: Personalization,
Sparsity and Clustering
- URL: http://arxiv.org/abs/2303.04345v1
- Date: Wed, 8 Mar 2023 02:52:40 GMT
- Title: Federated Learning via Variational Bayesian Inference: Personalization,
Sparsity and Clustering
- Authors: Xu Zhang, Wenpeng Li, Yunfeng Shao, Yinchuan Li
- Abstract summary: Federated learning (FL) is a promising framework that models distributed machine learning.
FL suffers performance degradation from heterogeneous and limited data.
We present a novel personalized Bayesian FL approach named pFedBayes, a sparse variant named sFedBayes, and a clustered FL model named cFedbayes.
- Score: 6.829317124629158
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Federated learning (FL) is a promising framework that models distributed
machine learning while protecting the privacy of clients. However, FL suffers
performance degradation from heterogeneous and limited data. To alleviate the
degradation, we present a novel personalized Bayesian FL approach named
pFedBayes. By using the trained global distribution from the server as the
prior distribution of each client, each client adjusts its own distribution by
minimizing the sum of the reconstruction error over its personalized data and
the KL divergence with the downloaded global distribution. Then, we propose a
sparse personalized Bayesian FL approach named sFedBayes. To overcome the
extreme heterogeneity in non-i.i.d. data, we propose a clustered Bayesian FL
model named cFedbayes by learning different prior distributions for different
clients. Theoretical analysis gives the generalization error bounds of the
three approaches and shows that their generalization error convergence rates
achieve minimax optimality up to a logarithmic factor. Moreover, the analysis
shows that cFedbayes has a tighter generalization error rate than pFedBayes.
Numerous experiments demonstrate that the proposed approaches outperform other
advanced personalized methods on private models in the presence of
heterogeneous and limited data.
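The per-client pFedBayes objective described in the abstract (personalized reconstruction error plus KL divergence to the downloaded global distribution) can be sketched for diagonal-Gaussian variational posteriors, where the KL term has a closed form. This is a minimal illustration, not the paper's exact formulation; the function names and the trade-off weight `zeta` are assumptions.

```python
import numpy as np

def gaussian_kl(mu_q, sig_q, mu_p, sig_p):
    """Closed-form KL(q || p) between diagonal Gaussians with means mu_*
    and standard deviations sig_* (summed over dimensions)."""
    return np.sum(
        np.log(sig_p / sig_q)
        + (sig_q**2 + (mu_q - mu_p)**2) / (2.0 * sig_p**2)
        - 0.5
    )

def pfedbayes_objective(mu_q, sig_q, mu_glob, sig_glob, recon_error, zeta=1.0):
    """Per-client loss: reconstruction error on the client's personalized
    data plus the KL divergence to the downloaded global distribution,
    weighted by an illustrative trade-off factor zeta."""
    return recon_error + zeta * gaussian_kl(mu_q, sig_q, mu_glob, sig_glob)
```

When a client's posterior equals the global prior, the KL term vanishes and the loss reduces to the reconstruction error; the KL term otherwise penalizes drifting away from the shared global distribution.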
Related papers
- FedImpro: Measuring and Improving Client Update in Federated Learning [77.68805026788836]
Federated Learning (FL) models often experience client drift caused by heterogeneous data.
We present an alternative perspective on client drift and aim to mitigate it by generating improved local models.
arXiv Detail & Related papers (2024-02-10T18:14:57Z)
- Confidence-aware Personalized Federated Learning via Variational Expectation Maximization [34.354154518009956]
PFL is a distributed learning scheme to train a shared model across clients.
We present a novel framework for personalized Federated Learning (PFL) based on hierarchical modeling and variational inference.
arXiv Detail & Related papers (2023-05-21T20:12:27Z)
- FedHB: Hierarchical Bayesian Federated Learning [11.936836827864095]
We propose a novel hierarchical Bayesian approach to Federated Learning (FL).
Our model reasonably describes the generative process of clients' local data via hierarchical Bayesian modeling.
We show that our block-coordinate FL algorithm converges to an optimum of the objective at the rate of $O(1/\sqrt{t})$.
arXiv Detail & Related papers (2023-05-08T18:21:41Z)
- Personalized Federated Learning under Mixture of Distributions [98.25444470990107]
We propose a novel approach to Personalized Federated Learning (PFL), which utilizes Gaussian mixture models (GMM) to fit the input data distributions across diverse clients.
FedGMM possesses an additional advantage of adapting to new clients with minimal overhead, and it also enables uncertainty quantification.
Empirical evaluations on synthetic and benchmark datasets demonstrate the superior performance of our method in both PFL classification and novel sample detection.
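As a rough illustration of the mixture-model idea behind FedGMM above (not the paper's actual algorithm), the E-step of a one-dimensional Gaussian mixture computes per-component responsibilities; a new client's data can be softly matched to existing components this way, which is what makes low-overhead adaptation and uncertainty quantification possible. All names here are hypothetical.

```python
import numpy as np

def gmm_responsibilities(x, weights, means, variances):
    """E-step of a 1-D Gaussian mixture: posterior probability of each
    component for each sample in x, computed in log space for stability."""
    # log N(x | mean_k, var_k) for each sample (rows) and component (cols)
    log_lik = (
        -0.5 * np.log(2.0 * np.pi * variances)
        - (x[:, None] - means) ** 2 / (2.0 * variances)
    )
    log_post = np.log(weights) + log_lik
    log_post -= log_post.max(axis=1, keepdims=True)  # numerical stability
    post = np.exp(log_post)
    return post / post.sum(axis=1, keepdims=True)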
arXiv Detail & Related papers (2023-05-01T20:04:46Z)
- FedLAP-DP: Federated Learning by Sharing Differentially Private Loss Approximations [53.268801169075836]
We propose FedLAP-DP, a novel privacy-preserving approach for federated learning.
A formal privacy analysis demonstrates that FedLAP-DP incurs the same privacy costs as typical gradient-sharing schemes.
Our approach presents a faster convergence speed compared to typical gradient-sharing methods.
arXiv Detail & Related papers (2023-02-02T12:56:46Z)
- Beyond ADMM: A Unified Client-variance-reduced Adaptive Federated Learning Framework [82.36466358313025]
We propose a primal-dual FL algorithm, termed FedVRA, that allows one to adaptively control the variance-reduction level and biasness of the global model.
Experiments based on (semi-supervised) image classification tasks demonstrate superiority of FedVRA over the existing schemes.
arXiv Detail & Related papers (2022-12-03T03:27:51Z)
- Personalized Federated Learning via Variational Bayesian Inference [6.671486716769351]
Federated learning faces huge challenges from model overfitting due to the lack of data and statistical diversity among clients.
This paper proposes a novel personalized federated learning method via Bayesian variational inference named pFedBayes.
Experiments show that the proposed method outperforms other advanced personalized methods on personalized models.
arXiv Detail & Related papers (2022-06-16T07:37:02Z)
- Fine-tuning Global Model via Data-Free Knowledge Distillation for Non-IID Federated Learning [86.59588262014456]
Federated Learning (FL) is an emerging distributed learning paradigm under privacy constraint.
We propose a data-free knowledge distillation method to fine-tune the global model in the server (FedFTG).
Our FedFTG significantly outperforms the state-of-the-art (SOTA) FL algorithms and can serve as a strong plugin for enhancing FedAvg, FedProx, FedDyn, and SCAFFOLD.
arXiv Detail & Related papers (2022-03-17T11:18:17Z)
- A Bayesian Federated Learning Framework with Online Laplace Approximation [144.7345013348257]
Federated learning allows multiple clients to collaboratively learn a globally shared model.
We propose a novel FL framework that uses online Laplace approximation to approximate posteriors on both the client and server side.
We achieve state-of-the-art results on several benchmarks, clearly demonstrating the advantages of the proposed method.
arXiv Detail & Related papers (2021-02-03T08:36:58Z)
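One standard way to combine per-client Laplace (Gaussian) posteriors on the server, sketched here as an assumption rather than the exact aggregation rule of the paper above, is product-of-Gaussians fusion: precisions add, and means are precision-weighted averages.

```python
import numpy as np

def aggregate_gaussian_posteriors(means, precisions):
    """Fuse per-client diagonal Gaussian posteriors N(mean_i, precision_i^-1)
    into one Gaussian via the product rule: the global precision is the sum
    of client precisions, and the global mean is the precision-weighted
    average of client means."""
    means = np.asarray(means, dtype=float)
    precisions = np.asarray(precisions, dtype=float)
    prec_glob = precisions.sum(axis=0)
    mu_glob = (precisions * means).sum(axis=0) / prec_glob
    return mu_glob, prec_glob
```

For two equally confident clients with means 0 and 2, the fused posterior centers at 1 with doubled precision, i.e. the server's uncertainty shrinks as more clients contribute.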
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the summaries (including all information) and is not responsible for any consequences.