Personalized Federated Learning with Gaussian Processes
- URL: http://arxiv.org/abs/2106.15482v1
- Date: Tue, 29 Jun 2021 15:09:13 GMT
- Title: Personalized Federated Learning with Gaussian Processes
- Authors: Idan Achituve, Aviv Shamsian, Aviv Navon, Gal Chechik, Ethan Fetaya
- Abstract summary: Federated learning aims to learn a global model that performs well on client devices with limited cross-client communication.
We present pFedGP, a solution to PFL that is based on Gaussian processes (GPs) with deep kernel learning.
pFedGP achieves well-calibrated predictions while significantly outperforming baseline methods, with accuracy gains of up to 21%.
- Score: 24.102107455189454
- License: http://creativecommons.org/publicdomain/zero/1.0/
- Abstract: Federated learning aims to learn a global model that performs well on client
devices with limited cross-client communication. Personalized federated
learning (PFL) further extends this setup to handle data heterogeneity between
clients by learning personalized models. A key challenge in this setting is to
learn effectively across clients even though each client has unique data that
is often limited in size. Here we present pFedGP, a solution to PFL that is
based on Gaussian processes (GPs) with deep kernel learning. GPs are highly
expressive models that work well in the low data regime due to their Bayesian
nature. However, applying GPs to PFL raises multiple challenges. Mainly, GP
performance depends heavily on access to a good kernel function, and learning a
kernel requires a large training set. Therefore, we propose learning a shared
kernel function across all clients, parameterized by a neural network, with a
personal GP classifier for each client. We further extend pFedGP to include
inducing points using two novel methods, the first helps to improve
generalization in the low data regime and the second reduces the computational
cost. We derive a PAC-Bayes generalization bound on novel clients and
empirically show that it gives non-vacuous guarantees. Extensive experiments on
standard PFL benchmarks with CIFAR-10, CIFAR-100, and CINIC-10, and on a new
setup of learning under input noise show that pFedGP achieves well-calibrated
predictions while significantly outperforming baseline methods, with accuracy gains of up to 21%.
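A minimal sketch of this design may help: one neural feature extractor, shared by all clients, parameterizes the kernel, while each client keeps a personal GP fit only on its local data. This is an illustrative simplification rather than the authors' implementation; in particular, the GP classifier is approximated below by GP regression on one-hot labels, and names such as SharedFeatureNet are hypothetical.

```python
# Minimal sketch of the shared deep-kernel idea (not the authors'
# implementation): a shared network parameterizes the kernel, and each
# client owns a personal GP. The GP classifier is simplified here to GP
# regression on one-hot labels; SharedFeatureNet is a hypothetical name.
import torch
import torch.nn as nn

class SharedFeatureNet(nn.Module):
    def __init__(self, in_dim=32, feat_dim=16):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(in_dim, 64), nn.ReLU(),
                                 nn.Linear(64, feat_dim))

    def forward(self, x):
        return self.net(x)

def rbf_kernel(a, b, lengthscale=1.0):
    # k(a, b) = exp(-||a - b||^2 / (2 * lengthscale^2)) on learned features
    return torch.exp(-torch.cdist(a, b).pow(2) / (2 * lengthscale ** 2))

class ClientGP:
    """Personal GP per client; only the feature network is shared."""
    def __init__(self, shared_net, noise=1e-2):
        self.net, self.noise = shared_net, noise

    def fit(self, x, y_onehot):
        self.z = self.net(x)                       # features of local data
        k = rbf_kernel(self.z, self.z)
        self.alpha = torch.linalg.solve(k + self.noise * torch.eye(len(x)),
                                        y_onehot)

    def predict(self, x_new):
        k_star = rbf_kernel(self.net(x_new), self.z)
        return (k_star @ self.alpha).argmax(dim=-1)

# The server would train shared_net across clients; each client fits and
# predicts with its own ClientGP on private data.
shared_net = SharedFeatureNet()
x, y = torch.randn(20, 32), torch.randint(0, 10, (20,))
client = ClientGP(shared_net)
client.fit(x, torch.eye(10)[y])
print(client.predict(torch.randn(5, 32)))
```

The cubic-cost matrix solve in `fit` is the kind of expense the paper's inducing-point variants are designed to reduce.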
Related papers
- GPFL: A Gradient Projection-Based Client Selection Framework for Efficient Federated Learning [6.717563725609496]
Client selection is crucial in federated learning: it determines which clients participate in each training round.
We propose GPFL, which measures client value by comparing local and global descent directions.
GPFL exhibits shorter computation times through pre-selection and parameter reuse in federated learning.
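One plausible reading of this selection rule, sketched below as an assumption (the paper's exact scoring may differ): value each client by the projection of its local update onto the global descent direction, then keep the top-k.

```python
# Hypothetical gradient-projection-based client selection in the spirit of
# GPFL: rank clients by how strongly their local descent direction projects
# onto the global one.
import numpy as np

def select_clients(local_updates, k):
    """local_updates: dict of client_id -> flattened update vector."""
    ids = list(local_updates)
    global_dir = np.mean([local_updates[i] for i in ids], axis=0)

    def value(cid):
        u = local_updates[cid]
        # scalar projection of the local update onto the global direction
        return float(u @ global_dir) / (np.linalg.norm(global_dir) + 1e-12)

    return sorted(ids, key=value, reverse=True)[:k]

updates = {i: np.random.randn(100) for i in range(10)}
print(select_clients(updates, k=3))
```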
arXiv Detail & Related papers (2024-03-26T16:14:43Z)
- Personalized Federated Learning of Probabilistic Models: A PAC-Bayesian Approach [42.59649764999974]
Federated learning aims to infer a shared model from private and decentralized data stored locally by multiple clients.
We propose a PFL algorithm named PAC-PFL for learning probabilistic models within a PAC-Bayesian framework.
Our algorithm collaboratively learns a shared hyper-posterior and treats each client's posterior inference as the personalization step.
arXiv Detail & Related papers (2024-01-16T13:30:37Z)
- FedTGP: Trainable Global Prototypes with Adaptive-Margin-Enhanced Contrastive Learning for Data and Model Heterogeneity in Federated Learning [18.916282151435727]
Heterogeneous Federated Learning (HtFL) has attracted attention due to its ability to support heterogeneous models and data.
We introduce a novel HtFL approach called FedTGP, which leverages our Adaptive-margin-enhanced Contrastive Learning (ACL) to learn Trainable Global Prototypes (TGP) on the server.
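A rough sketch of the server-side objective follows; FedTGP's adaptive-margin schedule is not reproduced, so the fixed margin and loss form below are illustrative assumptions.

```python
# Server learns trainable global prototypes from client-uploaded class
# prototypes with a margin-based contrastive loss (sketch; FedTGP's
# adaptive margin and exact loss may differ).
import torch
import torch.nn as nn
import torch.nn.functional as F

num_classes, dim = 10, 16
global_protos = nn.Parameter(torch.randn(num_classes, dim))
opt = torch.optim.SGD([global_protos], lr=0.1)

def server_step(client_protos, client_labels, margin=1.0):
    d = torch.cdist(client_protos, global_protos)          # (n, C) distances
    pos = d[torch.arange(len(client_labels)), client_labels]
    neg = d + 1e9 * F.one_hot(client_labels, num_classes)  # mask positives
    # pull matching prototypes together, push others beyond the margin
    loss = pos.mean() + F.relu(margin - neg.min(dim=1).values).mean()
    opt.zero_grad()
    loss.backward()
    opt.step()
    return loss.item()

client_protos = torch.randn(32, dim)
client_labels = torch.randint(0, num_classes, (32,))
print(server_step(client_protos, client_labels))
```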
arXiv Detail & Related papers (2024-01-06T14:43:47Z)
- FedSampling: A Better Sampling Strategy for Federated Learning [81.85411484302952]
Federated learning (FL) is an important technique for learning models from decentralized data in a privacy-preserving way.
Existing FL methods usually uniformly sample clients for local model learning in each round.
We propose FedSampling, a novel data-uniform sampling strategy for federated learning.
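In spirit, data-uniform sampling gives every training example, rather than every client, an equal chance of being used in a round; a toy version samples clients with probability proportional to local data size. FedSampling also estimates those sizes in a privacy-preserving way, which this sketch omits.

```python
# Toy data-uniform client sampling: clients are drawn (with replacement)
# proportionally to their local dataset sizes, so each example is equally
# likely to participate. Privacy-preserving size estimation is omitted.
import random

def sample_clients(data_sizes, k):
    """data_sizes: dict of client_id -> number of local examples."""
    ids, weights = zip(*data_sizes.items())
    return random.choices(ids, weights=weights, k=k)

sizes = {f"client{i}": random.randint(10, 1000) for i in range(20)}
print(sample_clients(sizes, k=5))
```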
arXiv Detail & Related papers (2023-06-25T13:38:51Z)
- Federated Learning for Semantic Parsing: Task Formulation, Evaluation Setup, New Algorithms [29.636944156801327]
Multiple clients collaboratively train one global model without sharing their semantic parsing data.
Lorar adjusts each client's contribution to the global model update based on its training loss reduction during each round.
Clients with smaller datasets enjoy larger performance gains.
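A minimal sketch of this reweighting, assuming the weights are simply the normalized per-round loss drops (the paper's exact normalization may differ):

```python
# Weight each client's model update by its training-loss reduction this
# round, normalized over clients (illustrative version of the idea).
import numpy as np

def aggregate(updates, loss_before, loss_after):
    """updates: list of flat update vectors; losses: per-client scalars."""
    drops = np.maximum(np.asarray(loss_before) - np.asarray(loss_after), 0)
    w = drops / (drops.sum() + 1e-12)
    return sum(wi * ui for wi, ui in zip(w, updates))

updates = [np.random.randn(50) for _ in range(4)]
agg = aggregate(updates, [1.0, 0.9, 1.2, 0.8], [0.7, 0.85, 0.6, 0.75])
print(agg.shape)
```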
arXiv Detail & Related papers (2023-05-26T19:25:49Z)
- Personalized Federated Learning under Mixture of Distributions [98.25444470990107]
We propose a novel approach to Personalized Federated Learning (PFL), which utilizes Gaussian mixture models (GMM) to fit the input data distributions across diverse clients.
FedGMM possesses an additional advantage of adapting to new clients with minimal overhead, and it also enables uncertainty quantification.
Empirical evaluations on synthetic and benchmark datasets demonstrate the superior performance of our method in both PFL classification and novel sample detection.
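The mixture idea fits in a few lines: model inputs with a GMM, then characterize a new client by its posterior weights over components. The sketch below is centralized (scikit-learn) for brevity; FedGMM itself fits the mixture with a federated EM procedure.

```python
# Fit a Gaussian mixture over inputs and describe a new client by its
# average posterior responsibilities (centralized stand-in for FedGMM).
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)
pooled_inputs = np.vstack([rng.normal(loc=m, size=(100, 5))
                           for m in (0.0, 3.0, -3.0)])   # three "clients"
gmm = GaussianMixture(n_components=3, random_state=0).fit(pooled_inputs)

new_client = rng.normal(loc=3.0, size=(20, 5))
weights = gmm.predict_proba(new_client).mean(axis=0)     # component weights
print("new client's mixture weights:", weights.round(3))
```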
arXiv Detail & Related papers (2023-05-01T20:04:46Z)
- PerAda: Parameter-Efficient Federated Learning Personalization with Generalization Guarantees [95.87604231887353]
Existing pFL methods introduce high communication and computation costs or are vulnerable to test-time distribution shifts.
PerAda personalizes models with parameter-efficient adapters and knowledge distillation, and it achieves superior performance, especially under test-time distribution shifts.
Our code is available at https://github.com/NV/PerAda.
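The "parameter-efficient" part points to adapter-style personalization: freeze a shared backbone and train only a small per-client module. The bottleneck adapter below is a generic sketch, not PerAda's exact architecture.

```python
# Generic bottleneck adapter: the backbone is shared and frozen, while each
# client trains only the small adapter and head (architecture is an
# illustrative assumption, not PerAda's exact design).
import torch
import torch.nn as nn

class Adapter(nn.Module):
    def __init__(self, dim, bottleneck=8):
        super().__init__()
        self.down = nn.Linear(dim, bottleneck)
        self.up = nn.Linear(bottleneck, dim)

    def forward(self, h):
        return h + self.up(torch.relu(self.down(h)))   # residual bottleneck

backbone = nn.Sequential(nn.Linear(32, 64), nn.ReLU())  # shared, frozen
for p in backbone.parameters():
    p.requires_grad = False
adapter, head = Adapter(64), nn.Linear(64, 10)          # per-client parts

x = torch.randn(4, 32)
print(head(adapter(backbone(x))).shape)                 # torch.Size([4, 10])
```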
arXiv Detail & Related papers (2023-02-13T19:00:37Z)
- Federated Bayesian Neural Regression: A Scalable Global Federated Gaussian Process [21.872163101238705]
Federated Bayesian Neural Regression (FedBNR) is an algorithm that learns a scalable stand-alone global GP that respects clients' privacy.
We derive a principled approach to learning a global predictive model as if all client data were centralized.
Experiments are conducted on real-world regression datasets and show statistically significant improvements compared to other federated GP models.
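One standard route to a centralized-equivalent global GP, shown below, is a random-feature approximation in which clients share only the sufficient statistics Phi^T Phi and Phi^T y; whether FedBNR uses exactly this construction is an assumption.

```python
# Random-feature GP regression where the server aggregates per-client
# sufficient statistics; the resulting posterior mean matches training on
# the pooled data (illustrative, not necessarily FedBNR's construction).
import numpy as np

rng = np.random.default_rng(0)
D, d, noise = 100, 4, 0.1
W = rng.normal(size=(d, D))
b = rng.uniform(0, 2 * np.pi, D)
phi = lambda X: np.sqrt(2.0 / D) * np.cos(X @ W + b)   # random features

def client_stats(X, y):
    P = phi(X)
    return P.T @ P, P.T @ y            # all a client needs to share

clients = [(rng.normal(size=(30, d)), rng.normal(size=30)) for _ in range(5)]
A, c = noise * np.eye(D), np.zeros(D)
for X, y in clients:
    PtP, Pty = client_stats(X, y)
    A += PtP
    c += Pty
w = np.linalg.solve(A, c)              # global posterior mean weights

X_test = rng.normal(size=(3, d))
print(phi(X_test) @ w)                 # global predictive mean
```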
arXiv Detail & Related papers (2022-06-13T17:52:58Z)
- No One Left Behind: Inclusive Federated Learning over Heterogeneous Devices [79.16481453598266]
We propose InclusiveFL, a client-inclusive federated learning method for devices with heterogeneous computing capabilities.
The core idea of InclusiveFL is to assign models of different sizes to clients with different computing capabilities.
We also propose an effective method to share the knowledge among multiple local models with different sizes.
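At its simplest, the assignment step is a capability-based lookup; the tiers and thresholds below are illustrative.

```python
# Assign each client the largest model variant its hardware can handle
# (toy tiers; InclusiveFL additionally shares knowledge across variants).
model_variants = {"small": 1, "medium": 4, "large": 12}   # e.g. layer counts

def assign_variant(client_flops):
    if client_flops < 1e9:
        return "small"
    return "medium" if client_flops < 1e10 else "large"

clients = {"phone": 5e8, "laptop": 5e9, "server": 1e11}
print({name: assign_variant(f) for name, f in clients.items()})
```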
arXiv Detail & Related papers (2022-02-16T13:03:27Z)
- Incremental Ensemble Gaussian Processes [53.3291389385672]
We propose an incremental ensemble (IE-) GP framework, where an EGP meta-learner employs an ensemble of GP learners, each having a unique kernel belonging to a prescribed kernel dictionary.
With each GP expert leveraging the random feature-based approximation to perform online prediction and model update with scalability, the EGP meta-learner capitalizes on data-adaptive weights to synthesize the per-expert predictions.
The novel IE-GP is generalized to accommodate time-varying functions by modeling structured dynamics at the EGP meta-learner and within each GP learner.
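A toy version of this ensemble follows, with each GP expert simplified to online ridge regression on random Fourier features and the meta-learner's weights updated multiplicatively from per-step errors; the paper's likelihood-based weighting is approximated here.

```python
# Ensemble of random-feature "GP" experts with different lengthscales; a
# meta-learner reweights them online from their prediction errors.
import numpy as np

rng = np.random.default_rng(1)
D, d = 50, 3                            # random features, input dim

class RFFExpert:
    def __init__(self, lengthscale):
        self.W = rng.normal(scale=1.0 / lengthscale, size=(d, D))
        self.b = rng.uniform(0, 2 * np.pi, D)
        self.A = np.eye(D)              # running ridge normal equations
        self.c = np.zeros(D)

    def phi(self, x):
        return np.sqrt(2.0 / D) * np.cos(x @ self.W + self.b)

    def predict(self, x):
        return self.phi(x) @ np.linalg.solve(self.A, self.c)

    def update(self, x, y):
        p = self.phi(x)
        self.A += np.outer(p, p)
        self.c += y * p

experts = [RFFExpert(0.5), RFFExpert(2.0)]
weights = np.ones(len(experts)) / len(experts)

for _ in range(200):                    # streaming data
    x = rng.normal(size=d)
    y = np.sin(x.sum())
    preds = np.array([e.predict(x) for e in experts])
    combined = weights @ preds          # ensemble prediction (per step)
    weights *= np.exp(-(preds - y) ** 2)   # data-adaptive reweighting
    weights /= weights.sum()
    for e in experts:
        e.update(x, y)

print("final expert weights:", weights.round(3))
```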
arXiv Detail & Related papers (2021-10-13T15:11:25Z)
- Towards Fair Federated Learning with Zero-Shot Data Augmentation [123.37082242750866]
Federated learning has emerged as an important distributed learning paradigm, where a server aggregates a global model from many client-trained models while having no access to the client data.
We propose a novel federated learning system that employs zero-shot data augmentation on under-represented data to mitigate statistical heterogeneity and encourage more uniform accuracy performance across clients in federated networks.
We study two variants of this scheme, Fed-ZDAC (federated learning with zero-shot data augmentation at the clients) and Fed-ZDAS (federated learning with zero-shot data augmentation at the server).
arXiv Detail & Related papers (2021-04-27T18:23:54Z)