Federated Select: A Primitive for Communication- and Memory-Efficient
Federated Learning
- URL: http://arxiv.org/abs/2208.09432v1
- Date: Fri, 19 Aug 2022 16:26:03 GMT
- Title: Federated Select: A Primitive for Communication- and Memory-Efficient
Federated Learning
- Authors: Zachary Charles, Kallista Bonawitz, Stanislav Chiknavaryan, Brendan
McMahan, Blaise Agüera y Arcas
- Abstract summary: Federated learning (FL) is a framework for machine learning across heterogeneous client devices.
We propose a more general procedure in which clients "select" what values are sent to them.
This allows clients to operate on smaller, data-dependent slices.
- Score: 4.873569522869751
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Federated learning (FL) is a framework for machine learning across
heterogeneous client devices in a privacy-preserving fashion. To date, most FL
algorithms learn a "global" server model across multiple rounds. At each round,
the same server model is broadcast to all participating clients, updated
locally, and then aggregated across clients. In this work, we propose a more
general procedure in which clients "select" what values are sent to them.
Notably, this allows clients to operate on smaller, data-dependent slices. In
order to make this practical, we outline a primitive, federated select, which
enables client-specific selection in realistic FL systems. We discuss how to
use federated select for model training and show that it can lead to drastic
reductions in communication and client memory usage, potentially enabling the
training of models too large to fit on-device. We also discuss the implications
of federated select on privacy and trust, which in turn affect possible system
constraints and design. Finally, we discuss open questions concerning model
architectures, privacy-preserving technologies, and practical FL systems.
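To make the selection pattern concrete, here is a minimal NumPy sketch of the idea described in the abstract: each client requests only the slices of a large keyed server table that its local data touches, trains on that slice, and returns a sparse update. The function names and the placeholder gradient are illustrative assumptions, not the paper's TensorFlow Federated implementation.

```python
# A minimal sketch of the federated-select idea: clients pull only the
# data-dependent slices of a large keyed server table. All names here are
# illustrative, not the paper's API.
import numpy as np

rng = np.random.default_rng(0)
VOCAB, DIM = 10_000, 16
server_table = rng.normal(0, 0.01, size=(VOCAB, DIM))  # large "global" state

def client_keys(local_data):
    """Keys a client needs, derived from its own data (data-dependent slice)."""
    return np.unique(local_data)

def client_update(slice_values, lr=0.1):
    """Stand-in for local training: one gradient-like step on the slice."""
    grad = rng.normal(0, 1.0, size=slice_values.shape)  # placeholder gradient
    return -lr * grad  # delta sent back, same shape as the requested slice

def federated_select_round(clients_data):
    delta_sum = np.zeros_like(server_table)
    counts = np.zeros(VOCAB)
    for local_data in clients_data:
        keys = client_keys(local_data)        # client "selects" values
        slice_values = server_table[keys]     # server sends only the slice
        delta = client_update(slice_values)   # local work on the small slice
        delta_sum[keys] += delta              # sparse, key-aligned update
        counts[keys] += 1
    touched = counts > 0
    server_table[touched] += delta_sum[touched] / counts[touched, None]

# Three clients whose data covers only tiny parts of the table.
clients = [rng.integers(0, VOCAB, size=50) for _ in range(3)]
federated_select_round(clients)
```

Per-client communication then scales with the slice size rather than the full table size, which is the source of the communication and memory savings the abstract claims.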
Related papers
- Knowledge-Enhanced Semi-Supervised Federated Learning for Aggregating
Heterogeneous Lightweight Clients in IoT [34.128674870180596]
Federated learning (FL) enables multiple clients to train models collaboratively without sharing local data.
We propose pFedKnow, which generates lightweight personalized client models via neural network pruning techniques to reduce communication cost.
Experiment results on both image and text datasets show that the proposed pFedKnow outperforms state-of-the-art baselines.
arXiv Detail & Related papers (2023-03-05T13:19:10Z)
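The summary above names pruning as the mechanism for shrinking client models. As a point of reference, here is a generic magnitude-pruning sketch in NumPy; pFedKnow's actual pruning and personalization procedure is more involved, so treat this only as an illustration of the basic operation.

```python
# A generic magnitude-pruning sketch (not pFedKnow's actual algorithm):
# zero out the smallest-magnitude weights so the model is cheaper to send.
import numpy as np

def magnitude_prune(weights: np.ndarray, sparsity: float) -> np.ndarray:
    """Zero out the smallest-magnitude fraction `sparsity` of weights."""
    k = int(weights.size * sparsity)
    if k == 0:
        return weights.copy()
    threshold = np.partition(np.abs(weights).ravel(), k - 1)[k - 1]
    pruned = weights.copy()
    pruned[np.abs(pruned) <= threshold] = 0.0
    return pruned

w = np.random.default_rng(1).normal(size=(256, 256))
w_light = magnitude_prune(w, sparsity=0.9)  # ~90% zeros -> lower comm cost
print(f"nonzero fraction: {np.count_nonzero(w_light) / w_light.size:.3f}")
```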
- Scalable Collaborative Learning via Representation Sharing [53.047460465980144]
Federated learning (FL) and Split Learning (SL) are two frameworks that enable collaborative learning while keeping the data private (on-device).
In FL, each data holder trains a model locally and releases it to a central server for aggregation.
In SL, the clients must release individual cut-layer activations (smashed data) to the server and wait for its response (during both inference and backpropagation).
In this work, we present a novel approach for privacy-preserving machine learning, where the clients collaborate via online knowledge distillation using a contrastive loss.
arXiv Detail & Related papers (2022-11-20T10:49:22Z)
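For the representation-sharing approach above, the summary names online knowledge distillation with a contrastive loss. Below is a hedged InfoNCE-style sketch of such a loss over shared representations; the paper's actual architecture and training loop may differ.

```python
# An InfoNCE-style contrastive loss over shared representations, as a
# stand-in for the paper's online-distillation objective.
import numpy as np

def info_nce(client_reps: np.ndarray, peer_reps: np.ndarray, tau: float = 0.1):
    """Each sample's rep should match the peers' rep of the SAME sample
    (positive, on the diagonal) more than reps of other samples (negatives)."""
    a = client_reps / np.linalg.norm(client_reps, axis=1, keepdims=True)
    b = peer_reps / np.linalg.norm(peer_reps, axis=1, keepdims=True)
    logits = a @ b.T / tau                        # (n, n) similarity matrix
    logits -= logits.max(axis=1, keepdims=True)   # numerical stability
    log_probs = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    return -np.mean(np.diag(log_probs))           # positives on the diagonal

rng = np.random.default_rng(2)
reps_local = rng.normal(size=(32, 64))  # this client's reps of a shared batch
reps_peers = rng.normal(size=(32, 64))  # averaged peer reps of the same batch
print(f"contrastive loss: {info_nce(reps_local, reps_peers):.3f}")
```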
- ON-DEMAND-FL: A Dynamic and Efficient Multi-Criteria Federated Learning
Client Deployment Scheme [37.099990745974196]
We introduce ON-DEMAND-FL, a client deployment approach for federated learning.
We use containerization technology such as Docker to build efficient deployment environments.
A genetic algorithm (GA) is used to solve the resulting multi-objective optimization problem.
arXiv Detail & Related papers (2022-11-05T13:41:19Z)
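The summary above names a genetic algorithm for the deployment problem. Here is a toy GA over client-selection bitmasks with a scalarized stand-in for a true multi-objective search; the two objectives (deployment cost vs. data utility) and all GA settings are illustrative assumptions, not the paper's formulation.

```python
# A toy genetic algorithm for client deployment: evolve bitmasks over
# clients, trading off illustrative cost and utility objectives.
import numpy as np

rng = np.random.default_rng(3)
N_CLIENTS, POP, GENS = 20, 40, 50
cost = rng.uniform(1, 5, N_CLIENTS)    # e.g., deployment/energy cost
value = rng.uniform(0, 1, N_CLIENTS)   # e.g., data utility per client

def fitness(mask):
    # Scalarized objective: maximize utility, penalize cost.
    return value @ mask - 0.2 * (cost @ mask)

pop = rng.integers(0, 2, size=(POP, N_CLIENTS))
for _ in range(GENS):
    scores = np.array([fitness(ind) for ind in pop])
    parents = pop[np.argsort(scores)[-POP // 2:]]          # selection
    idx = rng.integers(0, len(parents), size=(POP, 2))
    cut = rng.integers(1, N_CLIENTS, size=POP)
    pop = np.array([np.concatenate([parents[i][:c], parents[j][c:]])
                    for (i, j), c in zip(idx, cut)])        # crossover
    flips = rng.random(pop.shape) < 0.02                    # mutation
    pop = np.where(flips, 1 - pop, pop)

best = pop[np.argmax([fitness(ind) for ind in pop])]
print("deploy clients:", np.flatnonzero(best))
```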
- No One Left Behind: Inclusive Federated Learning over Heterogeneous
Devices [79.16481453598266]
We propose InclusiveFL, a client-inclusive federated learning method that accommodates devices with heterogeneous computing capabilities.
The core idea of InclusiveFL is to assign models of different sizes to clients with different computing capabilities.
We also propose an effective method to share the knowledge among multiple local models with different sizes.
arXiv Detail & Related papers (2022-02-16T13:03:27Z)
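To make the size-assignment idea above concrete, here is a minimal sketch in which weaker clients train shallower prefixes of the full model and the server averages each layer over the clients that hold it. The paper's actual knowledge-sharing mechanism across model sizes is more elaborate than this.

```python
# Size-heterogeneous FL sketch: clients hold prefixes of a deep model;
# aggregation averages each layer over the clients that have it.
import numpy as np

rng = np.random.default_rng(4)
FULL_DEPTH, WIDTH = 6, 8
server = [rng.normal(size=(WIDTH, WIDTH)) for _ in range(FULL_DEPTH)]

client_depths = [6, 4, 2]  # strong, medium, weak devices

def aggregate(updates_per_client, depths):
    new_layers = []
    for layer in range(FULL_DEPTH):
        holders = [u[layer] for u, d in zip(updates_per_client, depths)
                   if layer < d]               # clients holding this layer
        new_layers.append(np.mean(holders, axis=0) if holders
                          else server[layer])  # untouched layers persist
    return new_layers

# Fake "local training": each client perturbs its prefix of the model.
updates = [[w + 0.01 * rng.normal(size=w.shape) for w in server[:d]]
           for d in client_depths]
server = aggregate(updates, client_depths)
```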
- An Expectation-Maximization Perspective on Federated Learning [75.67515842938299]
Federated learning describes the distributed training of models across multiple clients while keeping the data private on-device.
In this work, we view the server-orchestrated federated learning process as a hierarchical latent variable model where the server provides the parameters of a prior distribution over the client-specific model parameters.
We show that with simple Gaussian priors and a hard version of the well-known Expectation-Maximization (EM) algorithm, learning in such a model corresponds to FedAvg, the most popular algorithm for the federated learning setting.
arXiv Detail & Related papers (2021-11-19T12:58:59Z)
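The FedAvg correspondence claimed above can be checked numerically on a toy problem: with quadratic client losses and a Gaussian prior centered at the server parameters, the hard E-step has a closed form and the M-step is a plain average. The quadratic losses below are stand-ins for real local objectives.

```python
# Hard EM on a hierarchical model with Gaussian priors reduces to a
# FedAvg-like loop; verified here on toy quadratic client losses.
import numpy as np

rng = np.random.default_rng(5)
K, DIM, LAM = 5, 3, 1.0
targets = rng.normal(size=(K, DIM))  # client k's loss: ||w - target_k||^2 / 2

theta = np.zeros(DIM)                # server prior mean over client params
for _ in range(20):
    # E-step (hard): each client's MAP estimate of its local parameters,
    #   argmin_w ||w - target_k||^2/2 + (LAM/2) * ||w - theta||^2
    w = (targets + LAM * theta) / (1.0 + LAM)  # closed form for the toy loss
    # M-step: server re-estimates the prior mean -- a FedAvg-style average.
    theta = w.mean(axis=0)

print("server model ~ mean of client optima:",
      np.allclose(theta, targets.mean(axis=0), atol=1e-3))
```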
- FedGEMS: Federated Learning of Larger Server Models via Selective
Knowledge Fusion [19.86388925556209]
Federated Learning (FL) has emerged as a viable solution to learn a global model while keeping data private.
In this work, we investigate a novel paradigm that takes advantage of a powerful server model to overcome model-capacity limits in FL.
arXiv Detail & Related papers (2021-10-21T10:06:44Z)
- A Bayesian Federated Learning Framework with Online Laplace
Approximation [144.7345013348257]
Federated learning allows multiple clients to collaboratively learn a globally shared model.
We propose a novel FL framework that uses online Laplace approximation to approximate posteriors on both the client and server side.
We achieve state-of-the-art results on several benchmarks, clearly demonstrating the advantages of the proposed method.
arXiv Detail & Related papers (2021-02-03T08:36:58Z)
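As a sketch of the aggregation suggested above, assume each client produces a diagonal Gaussian (Laplace) approximation of its local posterior; a product of such Gaussians gives the server a closed-form combined posterior. The paper's online variant adds more machinery, so this shows only the core idea.

```python
# Diagonal Laplace approximation and precision-weighted aggregation of
# client posteriors (core idea only; the paper's online variant differs).
import numpy as np

rng = np.random.default_rng(6)
DIM, K = 4, 3

# Each client: a MAP estimate (mean) plus the diagonal Hessian of its loss
# at the MAP (precision), defining a local Gaussian posterior N(mu_k, H_k^-1).
mus = rng.normal(size=(K, DIM))
precisions = rng.uniform(0.5, 2.0, size=(K, DIM))  # stand-in Hessian diagonals

# Server side: a product of Gaussians is Gaussian, with summed precision
# and a precision-weighted average of the client means.
server_precision = precisions.sum(axis=0)
server_mu = (precisions * mus).sum(axis=0) / server_precision
print("aggregated posterior mean:", server_mu)
```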
- FedNS: Improving Federated Learning for collaborative image
classification on mobile clients [22.980223900446997]
Federated Learning (FL) is a paradigm that aims to support loosely connected clients in learning a global model.
We propose a new approach, termed Federated Node Selection (FedNS), for the server's global model aggregation in the FL setting.
We show with experiments from multiple datasets and networks that FedNS can consistently achieve improved performance over FedAvg.
arXiv Detail & Related papers (2021-01-20T06:45:46Z)
- Personalized Federated Learning with First Order Model Optimization [76.81546598985159]
We propose an alternative to federated learning, where each client federates with other relevant clients to obtain a stronger model per client-specific objectives.
We do not assume knowledge of underlying data distributions or client similarities, and allow each client to optimize for arbitrary target distributions of interest.
Our method outperforms existing alternatives, while also enabling new features for personalized FL such as transfer outside of local data distributions.
arXiv Detail & Related papers (2020-12-15T19:30:29Z)
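The summary above describes clients federating with "other relevant clients" according to their own objectives. The sketch below weights peer models by how much each improves one client's own validation loss, then averages; this weighting rule is a plausible stand-in, not the paper's exact first-order update.

```python
# Client-to-client weighted federation sketch: weight peers by measured
# improvement on one client's OWN validation objective.
import numpy as np

rng = np.random.default_rng(7)
K, DIM = 4, 3
models = rng.normal(size=(K, DIM))
my_target = rng.normal(size=DIM)      # proxy for client 0's validation task

def my_val_loss(w):                   # lower is better for client 0
    return float(np.sum((w - my_target) ** 2))

baseline = my_val_loss(models[0])     # client 0's current model
gains = np.array([max(baseline - my_val_loss(m), 0.0) for m in models])
gains[0] = 1.0                        # keep weight on the own model (a design choice here)
weights = gains / gains.sum()
personalized = weights @ models       # client 0's new personalized model
print("peer weights:", np.round(weights, 3))
```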
- Ensemble Distillation for Robust Model Fusion in Federated Learning [72.61259487233214]
Federated Learning (FL) is a machine learning setting where many devices collaboratively train a machine learning model.
In most current training schemes, the central model is refined by averaging the server model's parameters with the updated parameters from the clients.
We propose ensemble distillation for model fusion, i.e., training the central classifier on unlabeled data using the outputs of the client models.
arXiv Detail & Related papers (2020-06-12T14:49:47Z)
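The fusion step described above can be sketched directly: the server averages client predictive distributions on an unlabeled transfer set and trains its own model toward those soft labels. The linear student and synthetic data below are toy choices, not the paper's setup.

```python
# Ensemble distillation sketch: average client predictions on unlabeled
# data, then fit the server ("student") model to those soft labels.
import numpy as np

rng = np.random.default_rng(8)
N, D, C, K = 128, 10, 5, 4
public_x = rng.normal(size=(N, D))     # unlabeled transfer set
client_ws = rng.normal(size=(K, D, C)) # toy client "models" (linear)

def softmax(z):
    z = z - z.max(axis=1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

# Ensemble soft labels: average of client predictive distributions.
teacher = np.mean([softmax(public_x @ w) for w in client_ws], axis=0)

# Distill: gradient steps on the cross-entropy to the teacher labels.
student = np.zeros((D, C))
for _ in range(200):
    probs = softmax(public_x @ student)
    grad = public_x.T @ (probs - teacher) / N  # d(CE)/d(student weights)
    student -= 0.5 * grad
print("distillation gap:",
      float(np.abs(softmax(public_x @ student) - teacher).mean()))
```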