Think Locally, Act Globally: Federated Learning with Local and Global Representations
- URL: http://arxiv.org/abs/2001.01523v3
- Date: Tue, 14 Jul 2020 08:12:35 GMT
- Authors: Paul Pu Liang, Terrance Liu, Liu Ziyin, Nicholas B. Allen, Randy P.
Auerbach, David Brent, Ruslan Salakhutdinov, Louis-Philippe Morency
- Abstract summary: Federated learning is a method of training models on private data distributed over multiple devices.
We propose a new federated learning algorithm that jointly learns compact local representations on each device and a global model across all devices.
We also evaluate on the task of personalized mood prediction from real-world mobile data where privacy is key.
- Score: 92.68484710504666
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Federated learning is a method of training models on private data distributed
over multiple devices. To keep device data private, the global model is trained
by communicating only parameters and updates, which poses scalability challenges
for large models. To this end, we propose a new federated learning algorithm
that jointly learns compact local representations on each device and a global
model across all devices. As a result, the global model can be smaller since it
only operates on local representations, reducing the number of communicated
parameters. Theoretically, we provide a generalization analysis showing
that a combination of local and global models reduces both the variance in the
data and the variance across device distributions. Empirically, we demonstrate
that local models enable communication-efficient training while retaining
performance. We also evaluate on the task of personalized mood prediction from
real-world mobile data where privacy is key. Finally, local models handle
heterogeneous data from new devices, and learn fair representations that
obfuscate protected attributes such as race, age, and gender.
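The local/global split described in the abstract can be sketched in a few lines. The following is a minimal pure-Python illustration, not the paper's implementation: the dimensions, the linear encoders and head, and the least-squares objective are all illustrative assumptions. The point it demonstrates is the communication pattern: each device's encoder stays private, and only the small global head is averaged FedAvg-style.

```python
import random

random.seed(0)

# Hypothetical sizes: raw inputs are 20-d; each device's private local
# encoder compresses them to a 4-d representation; the shared global
# head is a 4-d linear model, so only 4 parameters cross the network.
N_DEVICES, RAW_DIM, REP_DIM = 3, 20, 4

def rand_matrix(rows, cols):
    # 1/sqrt(rows) scaling keeps encoded features at unit variance.
    return [[random.gauss(0, 1 / rows ** 0.5) for _ in range(cols)]
            for _ in range(rows)]

def encode(encoder, x):
    """Map one raw input to its compact local representation."""
    return [sum(x[i] * encoder[i][j] for i in range(len(x)))
            for j in range(len(encoder[0]))]

def local_update(head, encoder, xs, ys, lr=0.1, steps=20):
    """Least-squares gradient steps on the global head, computed on
    locally encoded private data (the encoder never leaves the device)."""
    h = list(head)
    zs = [encode(encoder, x) for x in xs]
    for _ in range(steps):
        grad = [0.0] * REP_DIM
        for z, y in zip(zs, ys):
            err = sum(zj * hj for zj, hj in zip(z, h)) - y
            for j in range(REP_DIM):
                grad[j] += err * z[j] / len(ys)
        h = [hj - lr * g for hj, g in zip(h, grad)]
    return h

# Local encoders stay on-device; the global head is what gets averaged.
local_encoders = [rand_matrix(RAW_DIM, REP_DIM) for _ in range(N_DEVICES)]
global_head = [0.0] * REP_DIM

# One communication round: each device trains the head on its private
# data, then the server averages the REP_DIM-sized updates.
updates = []
for enc in local_encoders:
    xs = [[random.gauss(0, 1) for _ in range(RAW_DIM)] for _ in range(30)]
    ys = [random.gauss(0, 1) for _ in xs]
    updates.append(local_update(global_head, enc, xs, ys))
global_head = [sum(u[j] for u in updates) / N_DEVICES for j in range(REP_DIM)]

print(len(global_head))  # 4 parameters communicated instead of 20
```

Because the server only ever sees REP_DIM-dimensional head updates, communication cost scales with the representation size rather than the raw input size, which is the scalability argument the abstract makes.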
Related papers
- FedDistill: Global Model Distillation for Local Model De-Biasing in Non-IID Federated Learning [10.641875933652647]
Federated Learning (FL) is a novel approach that allows for collaborative machine learning.
FL faces challenges due to non-uniformly distributed (non-iid) data across clients.
This paper introduces FedDistill, a framework enhancing the knowledge transfer from the global model to local models.
arXiv Detail & Related papers (2024-04-14T10:23:30Z)
- Tunable Soft Prompts are Messengers in Federated Learning [55.924749085481544]
Federated learning (FL) enables multiple participants to collaboratively train machine learning models using decentralized data sources.
The lack of model privacy protection in FL becomes an unneglectable challenge.
We propose a novel FL training approach that accomplishes information exchange among participants via tunable soft prompts.
arXiv Detail & Related papers (2023-11-12T11:01:10Z)
- FedSoup: Improving Generalization and Personalization in Federated Learning via Selective Model Interpolation [32.36334319329364]
Cross-silo federated learning (FL) enables the development of machine learning models on datasets distributed across data centers.
Recent research has found that current FL algorithms face a trade-off between local and global performance when confronted with distribution shifts.
We propose a novel federated model soup method to optimize the trade-off between local and global performance.
arXiv Detail & Related papers (2023-07-20T00:07:29Z)
- Federated Learning of Models Pre-Trained on Different Features with Consensus Graphs [19.130197923214123]
Learning an effective global model on private and decentralized datasets has become an increasingly important challenge of machine learning.
We propose a feature fusion approach that extracts local representations from local models and incorporates them into a global representation that improves the prediction performance.
This paper presents solutions to these problems and demonstrates them in real-world applications on time series data such as power grids and traffic networks.
arXiv Detail & Related papers (2023-06-02T02:24:27Z)
- Integrating Local Real Data with Global Gradient Prototypes for Classifier Re-Balancing in Federated Long-Tailed Learning [60.41501515192088]
Federated Learning (FL) has become a popular distributed learning paradigm that involves multiple clients training a global model collaboratively.
The data samples usually follow a long-tailed distribution in the real world, and FL on the decentralized and long-tailed data yields a poorly-behaved global model.
In this work, we integrate the local real data with the global gradient prototypes to form the local balanced datasets.
arXiv Detail & Related papers (2023-01-25T03:18:10Z)
- Tackling Data Heterogeneity in Federated Learning with Class Prototypes [44.746340839025194]
We propose FedNH, a novel method that improves the local models' performance for both personalization and generalization.
We show that imposing uniformity helps to combat prototype collapse while infusing class semantics improves local models.
arXiv Detail & Related papers (2022-12-06T05:15:38Z)
- Federated Learning from Small Datasets [48.879172201462445]
Federated learning allows multiple parties to collaboratively train a joint model without sharing local data.
We propose a novel approach that intertwines model aggregations with permutations of local models.
The permutations expose each local model to a daisy chain of local datasets resulting in more efficient training in data-sparse domains.
arXiv Detail & Related papers (2021-10-07T13:49:23Z)
- Federated Learning with Downlink Device Selection [92.14944020945846]
We study federated edge learning, where a global model is trained collaboratively using privacy-sensitive data at the edge of a wireless network.
A parameter server (PS) keeps track of the global model and shares it with the wireless edge devices for training using their private local data.
We consider device selection based on downlink channels over which the PS shares the global model with the devices.
arXiv Detail & Related papers (2021-07-07T22:42:39Z)
- Federated Learning With Quantized Global Model Updates [84.55126371346452]
We study federated learning, which enables mobile devices to utilize their local datasets to train a global model.
We introduce a lossy FL (LFL) algorithm, in which both the global model and the local model updates are quantized before being transmitted.
arXiv Detail & Related papers (2020-06-18T16:55:20Z)
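The last entry above describes quantizing model updates before transmission. The LFL paper's exact scheme is not given here; the following is a generic uniform scalar quantizer sketch showing the basic idea: integer codes replace 32-bit floats, with reconstruction error bounded by half a quantization step. All names and sizes are illustrative.

```python
import random

random.seed(1)

def quantize(vec, bits=4):
    """Uniform scalar quantizer: map each value to one of 2**bits integer
    levels spanning [min(vec), max(vec)]. Returns the codes plus the two
    floats (offset, step) needed to dequantize on the receiving side."""
    lo, hi = min(vec), max(vec)
    levels = (1 << bits) - 1
    scale = (hi - lo) / levels if hi > lo else 1.0
    codes = [round((v - lo) / scale) for v in vec]
    return codes, lo, scale

def dequantize(codes, lo, scale):
    return [lo + c * scale for c in codes]

# A mock model update: quantizing before transmission shrinks each
# 32-bit float to `bits` bits, plus two floats of side information.
update = [random.gauss(0, 1) for _ in range(1000)]
codes, lo, scale = quantize(update, bits=4)
recovered = dequantize(codes, lo, scale)

# Rounding to the nearest level bounds the per-entry error by scale/2.
max_err = max(abs(a - b) for a, b in zip(update, recovered))
print(max_err <= scale / 2 + 1e-12)  # True
```

In a federated round, the devices would send `codes` (and the tiny `lo`/`scale` pair) instead of the raw float updates, trading a bounded amount of reconstruction error for a large reduction in uplink traffic.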
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the information provided and is not responsible for any consequences of its use.