An Optimal Transport Approach to Personalized Federated Learning
- URL: http://arxiv.org/abs/2206.02468v1
- Date: Mon, 6 Jun 2022 10:05:49 GMT
- Title: An Optimal Transport Approach to Personalized Federated Learning
- Authors: Farzan Farnia, Amirhossein Reisizadeh, Ramtin Pedarsani, Ali Jadbabaie
- Abstract summary: Federated learning aims to train a model using the local data of many distributed clients.
A key challenge in federated learning is that the data samples across the clients may not be identically distributed.
We propose a novel personalized Federated Learning scheme based on Optimal Transport (FedOT).
- Score: 41.27887358989414
- License: http://creativecommons.org/licenses/by-sa/4.0/
- Abstract: Federated learning is a distributed machine learning paradigm, which aims to
train a model using the local data of many distributed clients. A key challenge
in federated learning is that the data samples across the clients may not be
identically distributed. To address this challenge, personalized federated
learning with the goal of tailoring the learned model to the data distribution
of every individual client has been proposed. In this paper, we focus on this
problem and propose FedOT, a novel personalized federated learning scheme based
on optimal transport, which jointly learns the optimal transport maps for
transferring data points to a common distribution and the prediction model
applied to the transported data. To formulate the FedOT
problem, we extend the standard optimal transport task between two probability
distributions to multi-marginal optimal transport problems with the goal of
transporting samples from multiple distributions to a common probability
domain. We then leverage the results on multi-marginal optimal transport
problems to formulate FedOT as a min-max optimization problem and analyze its
generalization and optimization properties. We discuss the results of several
numerical experiments to evaluate the performance of FedOT under heterogeneous
data distributions in federated learning problems.
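As a reference point for the formulation described above, the standard two-marginal optimal transport problem and its multi-marginal extension can be written as follows. The cost c and the coupling sets Π are the usual textbook objects, not notation taken from the paper itself.

```latex
% Two-marginal OT between distributions P and Q with ground cost c:
\min_{\pi \in \Pi(P, Q)} \; \mathbb{E}_{(x, x') \sim \pi} \big[ c(x, x') \big]

% Multi-marginal extension: a single coupling \pi over n client
% distributions P_1, \dots, P_n whose i-th marginal equals P_i:
\min_{\pi \in \Pi(P_1, \dots, P_n)} \;
  \mathbb{E}_{(x_1, \dots, x_n) \sim \pi} \big[ c(x_1, \dots, x_n) \big]
```

To make the joint learning of transport maps and a shared predictor concrete, here is a minimal toy sketch in PyTorch. It is an illustration under stated assumptions, not the authors' algorithm: per-client affine maps play the role of the transport maps, a quadratic displacement penalty stands in for the multi-marginal OT cost, and the paper's min-max formulation is replaced by a plain joint minimization.

```python
# Minimal FedOT-style toy sketch (hypothetical, NOT the authors' code):
# each client i learns an affine map T_i pushing its data toward a shared
# domain, and all clients share one classifier f trained on the transported
# samples.
import torch
import torch.nn as nn

n_clients, dim, n_classes = 4, 10, 3

# Synthetic heterogeneous clients: each holds a shifted/scaled Gaussian.
client_data = []
for i in range(n_clients):
    x = torch.randn(200, dim) * (1 + 0.5 * i) + i
    y = torch.randint(0, n_classes, (200,))
    client_data.append((x, y))

transport_maps = [nn.Linear(dim, dim) for _ in range(n_clients)]  # T_i
classifier = nn.Linear(dim, n_classes)                            # shared f
params = [p for T in transport_maps for p in T.parameters()]
params += list(classifier.parameters())
opt = torch.optim.Adam(params, lr=1e-2)
ce = nn.CrossEntropyLoss()

for step in range(200):
    opt.zero_grad()
    loss = torch.tensor(0.0)
    for (x, y), T in zip(client_data, transport_maps):
        z = T(x)                              # transported samples T_i(x)
        loss = loss + ce(classifier(z), y)    # prediction loss on common domain
        # Displacement penalty: discourage moving samples more than needed
        # (a crude stand-in for the multi-marginal transport cost term).
        loss = loss + 0.1 * ((z - x) ** 2).mean()
    loss.backward()
    opt.step()
```

In such a scheme, each client would keep its own map T_i locally (the personalized component) while the classifier is the shared, federated component; the penalty weight 0.1 is an arbitrary illustration value.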
Related papers
- Federated Learning with Projected Trajectory Regularization [65.6266768678291]
Federated learning enables joint training of machine learning models from distributed clients without sharing their local data.
One key challenge in federated learning is to handle non-identically distributed data across the clients.
We propose a novel federated learning framework with projected trajectory regularization (FedPTR) for tackling the data heterogeneity issue.
arXiv Detail & Related papers (2023-12-22T02:12:08Z)
- FedDAT: An Approach for Foundation Model Finetuning in Multi-Modal Heterogeneous Federated Learning [37.96957782129352]
We propose a finetuning framework tailored to heterogeneous multi-modal foundation models, called Federated Dual-Adapter Teacher (FedDAT).
FedDAT addresses data heterogeneity by regularizing the client local updates and applying Mutual Knowledge Distillation (MKD) for efficient knowledge transfer.
To demonstrate its effectiveness, we conduct extensive experiments on four multi-modality FL benchmarks with different types of data heterogeneity.
arXiv Detail & Related papers (2023-08-21T21:57:01Z)
- Tackling Computational Heterogeneity in FL: A Few Theoretical Insights [68.8204255655161]
We introduce and analyse a novel aggregation framework that allows for formalizing and tackling computational heterogeneity across clients.
The proposed aggregation algorithms are extensively analyzed from both a theoretical and an experimental perspective.
arXiv Detail & Related papers (2023-07-12T16:28:21Z)
- Performative Federated Learning: A Solution to Model-Dependent and Heterogeneous Distribution Shifts [24.196279060605402]
We consider a federated learning (FL) system consisting of multiple clients and a server.
Unlike the conventional FL framework, which assumes the clients' data are static, we consider scenarios where the clients' data distributions may be reshaped by the deployed decision model.
arXiv Detail & Related papers (2023-05-08T23:29:24Z)
- FedDM: Iterative Distribution Matching for Communication-Efficient Federated Learning [87.08902493524556]
Federated learning (FL) has recently attracted increasing attention from academia and industry.
We propose FedDM to build the global training objective from multiple local surrogate functions.
In detail, we construct synthetic sets of data on each client to locally match the loss landscape of the original data (a toy distribution-matching sketch follows this list).
arXiv Detail & Related papers (2022-07-20T04:55:18Z)
- Straggler-Resilient Personalized Federated Learning [55.54344312542944]
Federated learning allows training models from samples distributed across a large network of clients while respecting privacy and communication restrictions.
We develop a novel algorithmic procedure with theoretical speedup guarantees that simultaneously handles straggling clients and personalization.
Our method relies on ideas from representation learning theory to find a global common representation using all clients' data and learn a user-specific set of parameters leading to a personalized solution for each client.
arXiv Detail & Related papers (2022-06-05T01:14:46Z)
- DRFLM: Distributionally Robust Federated Learning with Inter-client Noise via Local Mixup [58.894901088797376]
Federated learning has emerged as a promising approach for training a global model using data from multiple organizations without leaking their raw data.
We propose a general framework to solve the above two challenges simultaneously.
We provide comprehensive theoretical analysis including robustness analysis, convergence analysis, and generalization ability.
arXiv Detail & Related papers (2022-04-16T08:08:29Z)
- Federated Multi-Task Learning under a Mixture of Distributions [10.00087964926414]
Federated Learning (FL) is a framework for on-device collaborative training of machine learning models.
First efforts in FL focused on learning a single global model with good average performance across clients, but the global model may be arbitrarily bad for a given client.
We study federated MTL under the flexible assumption that each local data distribution is a mixture of unknown underlying distributions.
arXiv Detail & Related papers (2021-08-23T15:47:53Z)
- Communication-Efficient Hierarchical Federated Learning for IoT Heterogeneous Systems with Imbalanced Data [42.26599494940002]
Federated learning (FL) is a distributed learning methodology that allows multiple nodes to cooperatively train a deep learning model.
This paper studies the potential of hierarchical FL in IoT heterogeneous systems.
It proposes an optimized solution for user assignment and resource allocation on multiple edge nodes.
arXiv Detail & Related papers (2021-07-14T08:32:39Z)
- Optimal transport framework for efficient prototype selection [21.620708125860066]
We develop an optimal transport (OT) based framework to select informative examples that best represent a given target dataset.
We show that our objective function enjoys a key property of submodularity and propose a parallelizable greedy method that is both computationally fast and possesses deterministic approximation guarantees (a toy greedy sketch follows this list).
arXiv Detail & Related papers (2021-03-18T10:50:14Z)
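As flagged in the FedDM and prototype-selection entries above, two toy sketches follow. Both are hypothetical illustrations under stated assumptions, not code released with those papers.

First, a FedDM-style local step. Gradient matching is one common way to "locally match the loss landscape"; the names distill_client and syn_per_class are made up for this sketch.

```python
# Toy sketch of a FedDM-style local step (hypothetical): each client
# distills a small synthetic set whose gradient on the current global model
# matches the gradient of its real data, then shares only the synthetic set.
import torch
import torch.nn as nn

dim, n_classes, syn_per_class = 10, 3, 5
model = nn.Linear(dim, n_classes)          # current global model
params = list(model.parameters())
ce = nn.CrossEntropyLoss()

def distill_client(x_real, y_real, steps=100, lr=0.1):
    y_syn = torch.arange(n_classes).repeat_interleave(syn_per_class)
    x_syn = torch.randn(len(y_syn), dim, requires_grad=True)
    # Real-data gradient of the loss w.r.t. the model parameters.
    g_real = torch.autograd.grad(ce(model(x_real), y_real), params)
    for _ in range(steps):
        g_syn = torch.autograd.grad(ce(model(x_syn), y_syn), params,
                                    create_graph=True)
        match = sum(((a - b) ** 2).sum() for a, b in zip(g_syn, g_real))
        g_x = torch.autograd.grad(match, x_syn)[0]
        with torch.no_grad():
            x_syn -= lr * g_x              # gradient step on the synthetic set
    return x_syn.detach(), y_syn

# Example client; a server would aggregate such sets and train on the union.
x_syn, y_syn = distill_client(torch.randn(200, dim),
                              torch.randint(0, n_classes, (200,)))
```

Second, greedy prototype selection. The paper's objective is an optimal-transport-based submodular set function; here a facility-location objective (also monotone submodular, so the same greedy loop enjoys the classic 1 - 1/e guarantee) stands in for it.

```python
# Toy greedy prototype selection with a facility-location objective
# F(S) = sum_i max_{j in S} sim[i, j] (a stand-in for the paper's OT-based
# submodular objective; all names here are illustrative).
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(300, 5))              # target dataset
candidates = rng.normal(size=(50, 5))      # candidate prototypes
k = 5                                      # selection budget

# Similarity of each data point to each candidate (negative sq. distance).
sim = -((X[:, None, :] - candidates[None, :, :]) ** 2).sum(-1)

best = np.full(len(X), sim.min())          # coverage floor for uncovered points
selected = []
for _ in range(k):
    # Marginal gain of adding candidate j to the selected set.
    gains = np.maximum(sim - best[:, None], 0.0).sum(axis=0)
    gains[selected] = -1.0                 # exclude already-selected candidates
    j = int(np.argmax(gains))
    selected.append(j)
    best = np.maximum(best, sim[:, j])

print("selected prototypes:", selected)
```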