FedDAR: Federated Domain-Aware Representation Learning
- URL: http://arxiv.org/abs/2209.04007v1
- Date: Thu, 8 Sep 2022 19:18:59 GMT
- Title: FedDAR: Federated Domain-Aware Representation Learning
- Authors: Aoxiao Zhong, Hao He, Zhaolin Ren, Na Li, Quanzheng Li
- Abstract summary: Cross-silo Federated learning (FL) has become a promising tool in machine learning applications for healthcare.
We propose a novel method, FedDAR, which learns a domain shared representation and domain-wise personalized prediction heads.
- Score: 14.174833360938806
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Cross-silo Federated learning (FL) has become a promising tool in machine
learning applications for healthcare. It allows hospitals/institutions to train
models with sufficient data while keeping the data private. To ensure the FL
model is robust to heterogeneous data across FL clients, most efforts
focus on personalizing models for individual clients. However, these approaches ignore the latent
relationships between clients' data. In this work, we focus on a special non-IID
FL problem, called Domain-mixed FL, where each client's data distribution is
assumed to be a mixture of several predefined domains. Recognizing the
diversity of domains and the similarity within domains, we propose a novel
method, FedDAR, which learns a domain shared representation and domain-wise
personalized prediction heads in a decoupled manner. For a simplified linear
regression setting, we theoretically prove that FedDAR enjoys a linear
convergence rate. For general settings, we perform extensive empirical
studies on both synthetic and real-world medical datasets, which demonstrate its
superiority over prior FL methods.
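The decoupled shared-representation / domain-wise-head idea can be sketched in the simplified linear regression setting. The alternating-least-squares updates, dimensions, and one-client-per-domain setup below are illustrative assumptions, not the paper's exact algorithm:

```python
import numpy as np

rng = np.random.default_rng(0)
d, k, n_domains, n_per = 10, 3, 2, 200  # illustrative sizes

# Ground truth: a shared representation B_true and domain-wise heads w_true.
B_true = np.linalg.qr(rng.standard_normal((d, k)))[0]
w_true = rng.standard_normal((n_domains, k))

# One client per domain for brevity; predictions are y = X @ B @ w_domain.
X = [rng.standard_normal((n_per, d)) for _ in range(n_domains)]
y = [X[j] @ B_true @ w_true[j] for j in range(n_domains)]

# Decoupled alternating updates: heads with B fixed, then B with heads fixed.
B = np.linalg.qr(rng.standard_normal((d, k)))[0]
w = np.zeros((n_domains, k))
for _ in range(50):
    for j in range(n_domains):  # domain-wise personalized heads
        w[j] = np.linalg.lstsq(X[j] @ B, y[j], rcond=None)[0]
    # Shared representation: least squares over all domains via vec(B),
    # using y_j = X_j B w_j  <=>  y_j = (w_j^T kron X_j) vec(B).
    A = np.vstack([np.kron(w[j], X[j]) for j in range(n_domains)])
    vecB = np.linalg.lstsq(A, np.concatenate(y), rcond=None)[0]
    B = np.linalg.qr(vecB.reshape(d, k, order="F"))[0]
for j in range(n_domains):  # refit heads to the re-orthonormalized B
    w[j] = np.linalg.lstsq(X[j] @ B, y[j], rcond=None)[0]

mse = np.mean([np.mean((X[j] @ B @ w[j] - y[j]) ** 2) for j in range(n_domains)])
```

Because the heads and the representation are each updated by an exact least-squares solve, the training error drops to numerical precision on this noiseless, realizable toy problem.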
Related papers
- An Aggregation-Free Federated Learning for Tackling Data Heterogeneity [50.44021981013037]
Federated Learning (FL) relies on effectively utilizing knowledge from distributed datasets.
Traditional FL methods adopt an aggregate-then-adapt framework, where clients update local models based on a global model aggregated by the server from the previous training round.
We introduce FedAF, a novel aggregation-free FL algorithm.
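The aggregate-then-adapt framework that FedAF departs from can be sketched as a FedAvg-style round. The least-squares clients and all sizes below are illustrative assumptions, not any specific paper's setup:

```python
import numpy as np

rng = np.random.default_rng(0)

def fedavg_round(global_w, clients, lr=0.1, local_steps=5):
    """One aggregate-then-adapt round: each client adapts the current
    global model with local gradient steps; the server averages the results."""
    local_models = []
    for X, y in clients:
        w = global_w.copy()
        for _ in range(local_steps):
            w -= lr * 2 * X.T @ (X @ w - y) / len(y)  # least-squares gradient
        local_models.append(w)
    return np.mean(local_models, axis=0)

# Two clients sharing one underlying linear model (noiseless, for illustration).
w_true = np.array([1.0, -2.0, 0.5])
clients = []
for _ in range(2):
    X = rng.standard_normal((50, 3))
    clients.append((X, X @ w_true))

w = np.zeros(3)
for _ in range(50):
    w = fedavg_round(w, clients)
```

With homogeneous client data this converges to the shared model; data heterogeneity is precisely what makes the averaged model drift, motivating alternatives like FedAF.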
arXiv Detail & Related papers (2024-04-29T05:55:23Z) - FLASH: Federated Learning Across Simultaneous Heterogeneities [54.80435317208111]
FLASH (Federated Learning Across Simultaneous Heterogeneities) is a lightweight and flexible client selection algorithm.
It outperforms state-of-the-art FL frameworks under extensive sources of heterogeneity, achieving substantial and consistent improvements over state-of-the-art baselines.
arXiv Detail & Related papers (2024-02-13T20:04:39Z) - Eliminating Domain Bias for Federated Learning in Representation Space [31.52707182599217]
We propose a general framework Domain Bias Eliminator (DBE) for federated learning.
Our theoretical analysis reveals that DBE can promote bi-directional knowledge transfer between server and client.
FL methods equipped with DBE outperform ten state-of-the-art personalized FL methods by a large margin.
arXiv Detail & Related papers (2023-11-25T09:22:34Z) - FedDAT: An Approach for Foundation Model Finetuning in Multi-Modal
Heterogeneous Federated Learning [37.96957782129352]
We propose a finetuning framework tailored to heterogeneous multi-modal foundation models, called Federated Dual-Adapter Teacher (FedDAT).
FedDAT addresses data heterogeneity by regularizing the client local updates and applying Mutual Knowledge Distillation (MKD) for efficient knowledge transfer.
To demonstrate its effectiveness, we conduct extensive experiments on four multi-modality FL benchmarks with different types of data heterogeneity.
arXiv Detail & Related papers (2023-08-21T21:57:01Z) - FL Games: A Federated Learning Framework for Distribution Shifts [71.98708418753786]
Federated learning aims to train predictive models for data that is distributed across clients, under the orchestration of a server.
We propose FL GAMES, a game-theoretic framework for federated learning which learns causal features that are invariant across clients.
arXiv Detail & Related papers (2022-10-31T22:59:03Z) - FedDM: Iterative Distribution Matching for Communication-Efficient
Federated Learning [87.08902493524556]
Federated learning (FL) has recently attracted increasing attention from academia and industry.
We propose FedDM to build the global training objective from multiple local surrogate functions.
In detail, we construct synthetic sets of data on each client to locally match the loss landscape from original data.
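The local surrogate idea can be illustrated with a drastically simplified distribution-matching step: a client fits a small synthetic set whose mean embedding under a random linear map matches its real data. The embedding, the sizes, and the plain gradient descent below are illustrative assumptions, not FedDM's actual procedure:

```python
import numpy as np

rng = np.random.default_rng(1)
d, n_real, n_syn = 8, 200, 10  # illustrative sizes

X_real = rng.standard_normal((n_real, d)) + 0.5  # a client's real data
W = rng.standard_normal((d, 32)) / np.sqrt(d)    # random linear embedding
target = (X_real @ W).mean(axis=0)               # real mean embedding

X_syn = rng.standard_normal((n_syn, d))          # synthetic set to optimize
gap_init = np.linalg.norm((X_syn @ W).mean(axis=0) - target)
for _ in range(300):
    diff = (X_syn @ W).mean(axis=0) - target     # embedding-mean gap
    X_syn -= 0.5 * (diff @ W.T) / n_syn          # gradient of 0.5 * ||diff||^2
gap_final = np.linalg.norm((X_syn @ W).mean(axis=0) - target)
```

The synthetic set (ten points standing in for two hundred) is what the client would ship to the server instead of raw data or gradients, which is the source of the communication savings.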
arXiv Detail & Related papers (2022-07-20T04:55:18Z) - FedILC: Weighted Geometric Mean and Invariant Gradient Covariance for
Federated Learning on Non-IID Data [69.0785021613868]
Federated learning is a distributed machine learning approach in which a shared server model learns by aggregating parameter updates computed locally on the training data of spatially distributed client silos.
We propose the Federated Invariant Learning Consistency (FedILC) approach, which leverages the gradient covariance and the geometric mean of Hessians to capture both inter-silo and intra-silo consistencies.
This is relevant to various fields such as medical healthcare, computer vision, and the Internet of Things (IoT).
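The flavor of consistency-seeking aggregation can be sketched with an elementwise signed geometric mean of per-silo gradients. This sign-agreement rule is an illustrative stand-in, not FedILC's exact update:

```python
import numpy as np

def signed_geometric_mean(grads, eps=1e-12):
    """Elementwise geometric mean of per-silo gradient magnitudes, kept
    only where all silos agree on the sign (zeroed where they conflict)."""
    grads = np.asarray(grads, dtype=float)
    signs = np.sign(grads)
    agree = np.abs(signs.sum(axis=0)) == len(grads)      # unanimous sign
    gm = np.exp(np.mean(np.log(np.abs(grads) + eps), axis=0))
    return np.where(agree, signs[0] * gm, 0.0)

# Two silos: they agree on the first two coordinates, conflict on the third.
agg = signed_geometric_mean([[1.0, 4.0, -2.0], [4.0, 1.0, 2.0]])
```

Unlike an arithmetic mean, the geometric mean is small whenever any single silo's gradient is small, so it favors directions that are consistently useful across silos.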
arXiv Detail & Related papers (2022-05-19T03:32:03Z) - Local Learning Matters: Rethinking Data Heterogeneity in Federated
Learning [61.488646649045215]
Federated learning (FL) is a promising strategy for performing privacy-preserving, distributed learning with a network of clients (i.e., edge devices).
arXiv Detail & Related papers (2021-11-28T19:03:39Z) - Federated learning with hierarchical clustering of local updates to
improve training on non-IID data [3.3517146652431378]
We show that learning a single joint model is often suboptimal in the presence of certain types of non-IID data.
We present a modification to FL that introduces a hierarchical clustering step (FL+HC).
We show how FL+HC allows model training to converge in fewer communication rounds compared to FL without clustering.
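A minimal sketch of the clustering step, assuming cosine distance between client update vectors and a greedy agglomerative merge with a fixed threshold (these are illustrative choices, not necessarily the paper's):

```python
import numpy as np

def cluster_client_updates(updates, threshold=0.5):
    """Greedily merge clients whose update centroids are within `threshold`
    cosine distance; each surviving cluster then trains its own model."""
    updates = [np.asarray(u, dtype=float) for u in updates]
    clusters = [[i] for i in range(len(updates))]

    def cos_dist(a, b):
        return 1.0 - (a @ b) / (np.linalg.norm(a) * np.linalg.norm(b))

    def centroid(c):
        return np.mean([updates[i] for i in c], axis=0)

    while len(clusters) > 1:
        pairs = [(cos_dist(centroid(clusters[a]), centroid(clusters[b])), a, b)
                 for a in range(len(clusters)) for b in range(a + 1, len(clusters))]
        d, a, b = min(pairs)      # closest pair of cluster centroids
        if d > threshold:
            break                 # remaining clusters are too dissimilar
        clusters[a].extend(clusters[b])
        del clusters[b]
    return clusters

# Two groups of clients with opposing update directions (simulated non-IID data).
rng = np.random.default_rng(0)
base = rng.standard_normal(20)
updates = [base + 0.1 * rng.standard_normal(20) for _ in range(3)]
updates += [-base + 0.1 * rng.standard_normal(20) for _ in range(3)]
groups = cluster_client_updates(updates)
```

Clients whose updates point the same way end up in one cluster, so each cluster's model is trained on roughly congruent data, which is why convergence can take fewer rounds.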
arXiv Detail & Related papers (2020-04-24T15:16:01Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of this list (including all information) and is not responsible for any consequences of its use.