FedFA: Federated Feature Augmentation
- URL: http://arxiv.org/abs/2301.12995v1
- Date: Mon, 30 Jan 2023 15:39:55 GMT
- Title: FedFA: Federated Feature Augmentation
- Authors: Tianfei Zhou, Ender Konukoglu
- Abstract summary: Federated learning allows multiple parties to collaboratively train deep models without exchanging raw data.
The primary goal of this paper is to develop a robust federated learning algorithm to address feature shift in clients' samples.
We propose FedFA to tackle federated learning from a distinct perspective of federated feature augmentation.
- Score: 25.130087374092383
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Federated learning is a distributed paradigm that allows multiple parties to
collaboratively train deep models without exchanging the raw data. However, the
data distribution among clients is naturally non-i.i.d., which leads to severe
degradation of the learnt model. The primary goal of this paper is to develop a
robust federated learning algorithm to address feature shift in clients'
samples, which can be caused by various factors, e.g., acquisition differences
in medical imaging. To reach this goal, we propose FedFA to tackle federated
learning from a distinct perspective of federated feature augmentation. FedFA
is based on a key insight: each client's data distribution can be
characterized by the statistics (i.e., mean and standard deviation) of its
latent features, and these local statistics can be manipulated globally, i.e.,
using information from the entire federation, to give clients a better sense
of the underlying distribution and thereby alleviate local data bias.
Based on this insight, we propose to augment each local feature statistic
probabilistically, drawing from a normal distribution whose mean is the
original statistic and whose variance quantifies the augmentation scope. Key
to our approach is the determination of a meaningful Gaussian variance, which
is accomplished by taking into account not only the biased data of each
individual client, but also the underlying feature statistics characterized by
all participating clients. We
offer both theoretical and empirical justifications to verify the effectiveness
of FedFA. Our code is available at https://github.com/tfzhou/FedFA.
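For concreteness, below is a minimal sketch of this augmentation, not the authors' implementation (the linked repository contains that): channel-wise feature statistics are resampled from Gaussians centred on the original values, with variances that mix a client-level estimate and a federation-level one. The `global_var_mu`/`global_var_sigma` buffers, the equal mixing weights, and the application probability `prob` are illustrative assumptions.
```python
import torch
import torch.nn as nn

class FeatureStatAugment(nn.Module):
    """Sketch of probabilistic feature-statistic augmentation (FedFA-style).

    Channel-wise mean and std of a feature map are resampled from Gaussians
    centred on the original statistics. The Gaussian variance mixes a
    client-level estimate with a federation-level one; the exact mixing rule
    in FedFA may differ from the equal weighting used here.
    """

    def __init__(self, num_channels: int, prob: float = 0.5):
        super().__init__()
        self.prob = prob
        # Federation-level variance estimates, e.g. aggregated by the server
        # across clients (placeholder zeros; assumed to be filled in by the
        # federated training loop).
        self.register_buffer("global_var_mu", torch.zeros(num_channels))
        self.register_buffer("global_var_sigma", torch.zeros(num_channels))

    def forward(self, x: torch.Tensor) -> torch.Tensor:  # x: (B, C, H, W)
        if not self.training or torch.rand(1).item() > self.prob:
            return x
        mu = x.mean(dim=(2, 3), keepdim=True)             # (B, C, 1, 1)
        sigma = x.std(dim=(2, 3), keepdim=True) + 1e-6
        # Client-level uncertainty: variance of the statistics over the batch.
        var_mu = mu.var(dim=0, keepdim=True, unbiased=False)
        var_sigma = sigma.var(dim=0, keepdim=True, unbiased=False)
        # Mix client- and federation-level estimates (equal weights assumed).
        var_mu = 0.5 * var_mu + 0.5 * self.global_var_mu.view(1, -1, 1, 1)
        var_sigma = 0.5 * var_sigma + 0.5 * self.global_var_sigma.view(1, -1, 1, 1)
        # Resample statistics: mu' ~ N(mu, var_mu), sigma' ~ N(sigma, var_sigma).
        new_mu = mu + var_mu.sqrt() * torch.randn_like(mu)
        new_sigma = sigma + var_sigma.sqrt() * torch.randn_like(sigma)
        # Re-normalize the feature map with the augmented statistics.
        return (x - mu) / sigma * new_sigma + new_mu
```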
Related papers
- FOOGD: Federated Collaboration for Both Out-of-distribution Generalization and Detection [24.969694113366216]
Federated learning (FL) is a promising machine learning paradigm in which client models collaborate to capture global knowledge.
However, deploying FL models in real-world scenarios remains unreliable due to the coexistence of in-distribution data and unexpected out-of-distribution data.
We propose FOOGD, a method that estimates the probability density of each client and obtains reliable global distribution.
arXiv Detail & Related papers (2024-10-15T08:39:31Z)
- FedImpro: Measuring and Improving Client Update in Federated Learning [77.68805026788836]
Federated Learning (FL) models often experience client drift caused by heterogeneous data.
We present an alternative perspective on client drift and aim to mitigate it by generating improved local models.
arXiv Detail & Related papers (2024-02-10T18:14:57Z)
- Federated Learning for distribution skewed data using sample weights [3.6039117546761155]
This work focuses on improving federated learning performance for skewed data distribution across clients.
The main idea is to bring each client's data distribution closer to the global distribution using sample weights.
We show that the proposed method not only improves federated learning accuracy but also significantly reduces communication costs.
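A rough sketch of the sample-weight idea follows; the paper's exact weighting scheme is not given here, so the class-frequency ratio below is an assumption.
```python
import numpy as np

def sample_weights(client_labels: np.ndarray, global_freq: np.ndarray) -> np.ndarray:
    """Per-sample weights that pull a client's label distribution toward the
    global one: a sample of class c gets weight p_global(c) / p_client(c),
    so over-represented classes are down-weighted and under-represented ones
    are up-weighted. This ratio rule is an illustrative assumption."""
    num_classes = len(global_freq)
    counts = np.bincount(client_labels, minlength=num_classes).astype(float)
    client_freq = counts / counts.sum()
    ratio = global_freq / np.maximum(client_freq, 1e-12)
    return ratio[client_labels]

# Example: a client dominated by class 0 under a uniform global distribution.
labels = np.array([0, 0, 0, 1, 2])
weights = sample_weights(labels, global_freq=np.ones(3) / 3)
# Class-0 samples get weight ~0.56; classes 1 and 2 get ~1.67. Multiplying
# each sample's loss by its weight skews training toward the global mix.
```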
arXiv Detail & Related papers (2024-01-05T00:46:11Z)
- A Simple Data Augmentation for Feature Distribution Skewed Federated Learning [12.636154758643757]
Federated learning (FL) facilitates collaborative learning among multiple clients in a distributed manner, while ensuring privacy protection.
In this paper, we focus on the feature distribution skewed FL scenario, which is widespread in real-world applications.
We propose FedRDN, a simple yet remarkably effective data augmentation method for feature distribution skewed FL.
arXiv Detail & Related papers (2023-06-14T05:46:52Z)
- FedSkip: Combatting Statistical Heterogeneity with Federated Skip Aggregation [95.85026305874824]
We introduce a data-driven approach called FedSkip that improves the client optima by periodically skipping federated averaging and scattering local models across devices.
We conduct extensive experiments on a range of datasets to demonstrate that FedSkip achieves much higher accuracy, better aggregation efficiency and competitive communication efficiency.
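A minimal sketch of this scheduling idea, under assumed details (the skip period and a random permutation as the scattering rule):
```python
import numpy as np

def fedavg(models):
    """Element-wise average of client parameter vectors (standard FedAvg)."""
    return np.mean(models, axis=0)

def fedskip_aggregate(local_models, round_idx, skip_period=4, rng=None):
    """One server-side aggregation step with periodic skip aggregation.

    On ordinary rounds every client receives the FedAvg average; on every
    `skip_period`-th round, averaging is skipped and the local models are
    scattered (permuted) across clients instead, so each client continues
    training from another client's model. The period and the random
    permutation are illustrative assumptions.
    """
    if rng is None:
        rng = np.random.default_rng()
    if round_idx % skip_period == skip_period - 1:
        order = rng.permutation(len(local_models))
        return [local_models[i] for i in order]     # scatter, no averaging
    global_model = fedavg(local_models)
    return [global_model for _ in local_models]     # broadcast the average
```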
arXiv Detail & Related papers (2022-12-14T13:57:01Z)
- Beyond ADMM: A Unified Client-variance-reduced Adaptive Federated Learning Framework [82.36466358313025]
We propose a primal-dual FL algorithm, termed FedVRA, that allows one to adaptively control the variance-reduction level and bias of the global model.
Experiments on (semi-supervised) image classification tasks demonstrate the superiority of FedVRA over existing schemes.
arXiv Detail & Related papers (2022-12-03T03:27:51Z)
- Personalized Federated Learning via Variational Bayesian Inference [6.671486716769351]
Federated learning faces significant challenges from model overfitting due to the lack of data and statistical diversity among clients.
This paper proposes pFedBayes, a novel personalized federated learning method based on Bayesian variational inference.
Experiments show that the proposed method outperforms other advanced personalization methods in terms of personalized model performance.
arXiv Detail & Related papers (2022-06-16T07:37:02Z)
- FedDC: Federated Learning with Non-IID Data via Local Drift Decoupling and Correction [48.85303253333453]
Federated learning (FL) allows multiple clients to collectively train a high-performance global model without sharing their private data.
We propose a novel federated learning algorithm with local drift decoupling and correction (FedDC).
FedDC introduces only lightweight modifications in the local training phase: each client utilizes an auxiliary local drift variable to track the gap between its local model parameters and the global model parameters.
Experimental results and analysis demonstrate that FedDC yields faster convergence and better performance on various image classification tasks.
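A simplified sketch of drift tracking in this spirit; the exact FedDC objective and correction terms differ, and `grad_fn`, `alpha`, and the quadratic penalty below are illustrative assumptions.
```python
import numpy as np

def feddc_client_update(w_global, h, grad_fn, lr=0.1, steps=10, alpha=0.01):
    """Simplified local update with a drift-tracking variable.

    `h` accumulates the gap between the client's parameters and the global
    model across rounds, and a quadratic penalty
    alpha/2 * ||theta + h - w_global||^2 keeps local training aware of that
    drift. The client uploads the drift-corrected parameters theta + h.
    """
    theta = w_global.copy()
    for _ in range(steps):
        # Task gradient plus the gradient of the drift penalty.
        g = grad_fn(theta) + alpha * (theta + h - w_global)
        theta -= lr * g
    h = h + (theta - w_global)    # accumulate this round's drift
    return theta + h, h           # drift-corrected upload, updated drift

# Toy usage with the quadratic loss 0.5 * ||theta - 1||^2:
w, h = np.zeros(3), np.zeros(3)
upload, h = feddc_client_update(w, h, grad_fn=lambda t: t - 1.0)
```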
arXiv Detail & Related papers (2022-03-22T14:06:26Z)
- Towards Fair Federated Learning with Zero-Shot Data Augmentation [123.37082242750866]
Federated learning has emerged as an important distributed learning paradigm, where a server aggregates a global model from many client-trained models while having no access to the client data.
We propose a novel federated learning system that employs zero-shot data augmentation on under-represented data to mitigate statistical heterogeneity and encourage more uniform accuracy performance across clients in federated networks.
We study two variants of this scheme, Fed-ZDAC (federated learning with zero-shot data augmentation at the clients) and Fed-ZDAS (federated learning with zero-shot data augmentation at the server).
arXiv Detail & Related papers (2021-04-27T18:23:54Z)
- FedBN: Federated Learning on Non-IID Features via Local Batch Normalization [23.519212374186232]
The emerging paradigm of federated learning (FL) strives to enable collaborative training of deep models on the network edge without centrally aggregating raw data.
We propose an effective method that uses local batch normalization to alleviate the feature shift before averaging models.
The resulting scheme, called FedBN, outperforms both classical FedAvg and the state-of-the-art for non-iid data.
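The aggregation rule is easy to sketch: average everything except batch-normalization parameters, which stay local. The name-based BN filter below is an assumption about parameter naming, not part of FedBN itself.
```python
import torch

def fedbn_aggregate(client_states, is_bn_param=lambda name: "bn" in name):
    """FedAvg that leaves batch-normalization parameters local (FedBN-style).

    Returns one state dict per client: non-BN parameters are replaced by the
    federation-wide average, while BN parameters (and buffers) are kept
    as-is. Match the name-based filter to your architecture's naming.
    """
    # Average every parameter that is shared across the federation.
    avg = {
        name: torch.stack([s[name] for s in client_states]).mean(dim=0)
        for name in client_states[0]
        if not is_bn_param(name)
    }
    # Each client keeps its own BN statistics and affine parameters.
    return [
        {name: avg.get(name, state[name]) for name in state}
        for state in client_states
    ]
```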
arXiv Detail & Related papers (2021-02-15T16:04:10Z)
- WAFFLe: Weight Anonymized Factorization for Federated Learning [88.44939168851721]
In domains where data are sensitive or private, there is great value in methods that can learn in a distributed manner without the data ever leaving the local devices.
We propose Weight Anonymized Factorization for Federated Learning (WAFFLe), an approach that combines the Indian Buffet Process with a shared dictionary of weight factors for neural networks.
arXiv Detail & Related papers (2020-08-13T04:26:31Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented (including all listed content) and is not responsible for any consequences arising from its use.