Graph-Assisted Communication-Efficient Ensemble Federated Learning
- URL: http://arxiv.org/abs/2202.13447v1
- Date: Sun, 27 Feb 2022 20:25:44 GMT
- Title: Graph-Assisted Communication-Efficient Ensemble Federated Learning
- Authors: Pouya M. Ghari and Yanning Shen
- Abstract summary: Communication efficiency arises as a necessity in federated learning due to limited communication bandwidth.
The server selects a subset of pre-trained models to construct the ensemble model based on the structure of a graph.
Only the selected models are transmitted to the clients, such that certain budget constraints are not violated.
- Score: 12.538755088321404
- License: http://creativecommons.org/publicdomain/zero/1.0/
- Abstract: Communication efficiency arises as a necessity in federated learning due to
limited communication bandwidth. To this end, the present paper develops an
algorithmic framework where an ensemble of pre-trained models is learned. At
each learning round, the server selects a subset of pre-trained models to
construct the ensemble model based on the structure of a graph, which
characterizes the server's confidence in the models. Then only the selected
models are transmitted to the clients, such that certain budget constraints are
not violated. Upon receiving updates from the clients, the server refines the
structure of the graph accordingly. The proposed algorithm is proved to enjoy
a sub-linear regret bound. Experiments on real datasets demonstrate the
effectiveness of our novel approach.
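The paper's pseudocode is not reproduced on this page, so the following is a minimal sketch of the select-transmit-refine loop the abstract describes, assuming per-model confidence weights in place of the paper's graph structure, a top-K budget rule, and a multiplicative-weights update; the learning rate eta and the simulated client losses are illustrative assumptions, not the authors' exact algorithm.

    import numpy as np

    rng = np.random.default_rng(0)
    num_models, budget, rounds, eta = 10, 3, 50, 0.5

    # Server-side confidence in each pre-trained model; the paper encodes
    # this in a graph, simplified here to one weight per model.
    conf = np.ones(num_models)

    for t in range(rounds):
        # Select a subset of models so the communication budget is respected.
        probs = conf / conf.sum()
        chosen = np.argsort(probs)[-budget:]      # top-K by confidence

        # Clients evaluate the transmitted models and report losses;
        # simulated here with random losses in [0, 1].
        losses = rng.uniform(size=budget)

        # Server refines its confidence from the client feedback.
        conf[chosen] *= np.exp(-eta * losses)

    print("final model confidences:", np.round(conf / conf.sum(), 3))

Updates of this multiplicative-weights family are the standard route to the kind of sub-linear regret guarantee the abstract mentions.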
Related papers
- Personalized Federated Learning with Mixture of Models for Adaptive Prediction and Model Fine-Tuning [22.705411388403036]
This paper develops a novel personalized federated learning algorithm.
Each client constructs a personalized model by combining a locally fine-tuned model with multiple federated models.
Theoretical analysis and experiments on real datasets corroborate the effectiveness of this approach.
arXiv Detail & Related papers (2024-10-28T21:20:51Z)
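As one concrete reading of the combination step in the entry above, the sketch below forms a client's personalized predictor as a convex combination of its locally fine-tuned model and several federated models; the fixed weights and toy linear models are placeholders, not the paper's learned mixture.

    import numpy as np

    def personalized_predict(x, local_model, federated_models, weights):
        # Convex combination; `weights` sums to 1. How the weights are
        # adapted per client is the paper's contribution, not shown here.
        models = [local_model] + list(federated_models)
        return sum(w * m(x) for w, m in zip(weights, models))

    local = lambda x: 2.0 * x                      # locally fine-tuned model (toy)
    feds = [lambda x: 1.0 * x, lambda x: 3.0 * x]  # federated models (toy)
    weights = np.array([0.5, 0.25, 0.25])

    print(personalized_predict(1.0, local, feds, weights))  # 2.0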
- FedSheafHN: Personalized Federated Learning on Graph-structured Data [22.825083541211168]
We propose a model called FedSheafHN, which embeds each client's local subgraph into a server-constructed collaboration graph.
Our model improves the integration and interpretation of complex client characteristics.
It also converges quickly and generalizes effectively to new clients.
arXiv Detail & Related papers (2024-05-25T04:51:41Z)
- Rethinking Personalized Federated Learning with Clustering-based Dynamic Graph Propagation [48.08348593449897]
We propose a simple yet effective personalized federated learning framework.
We group clients into multiple clusters based on their model training status and data distribution on the server side.
We conduct experiments on three image benchmark datasets and create synthetic structured datasets with three types of topologies.
arXiv Detail & Related papers (2024-01-29T04:14:02Z)
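A minimal sketch of the server-side grouping step mentioned above, assuming clients are clustered by the similarity of their flattened model updates with k-means; the real framework clusters on training status and data distribution, which this toy does not capture.

    import numpy as np
    from sklearn.cluster import KMeans

    rng = np.random.default_rng(0)
    # Flattened model updates from 12 clients drawn from three latent groups
    # (synthetic stand-ins for real client states).
    updates = np.vstack([rng.normal(loc=c, scale=0.1, size=(4, 16))
                         for c in (-1.0, 0.0, 1.0)])

    # Server groups clients into clusters before propagating models.
    labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(updates)
    print("cluster assignment per client:", labels)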
- Structured Cooperative Learning with Graphical Model Priors [98.53322192624594]
We study how to train personalized models for different tasks on decentralized devices with limited local data.
We propose "Structured Cooperative Learning (SCooL)", in which a cooperation graph across devices is generated by a graphical model.
We evaluate SCooL and compare it with existing decentralized learning methods on an extensive set of benchmarks.
arXiv Detail & Related papers (2023-06-16T02:41:31Z)
- Personalizing Federated Learning with Over-the-Air Computations [84.8089761800994]
Federated edge learning is a promising technology to deploy intelligence at the edge of wireless networks in a privacy-preserving manner.
Under such a setting, multiple clients collaboratively train a global generic model under the coordination of an edge server.
This paper presents a distributed training paradigm that employs analog over-the-air computation to address the communication bottleneck.
arXiv Detail & Related papers (2023-02-24T08:41:19Z)
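To make the communication-bottleneck point concrete, here is a toy sketch of analog over-the-air aggregation: simultaneous client transmissions superpose on the channel, so the server receives one noisy sum instead of per-client uploads. The unit-gain channel with additive Gaussian noise is a simplifying assumption.

    import numpy as np

    rng = np.random.default_rng(0)
    num_clients, dim, noise_std = 8, 5, 0.05

    updates = rng.normal(size=(num_clients, dim))   # local model updates (toy)

    # Over-the-air computation: the channel adds the analog signals, so the
    # server observes the sum of all updates plus receiver noise.
    received = updates.sum(axis=0) + rng.normal(scale=noise_std, size=dim)

    global_update = received / num_clients          # noisy federated average
    print(np.round(global_update - updates.mean(axis=0), 3))  # residual noise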
- Robust Graph Representation Learning via Predictive Coding [46.22695915912123]
Predictive coding is a message-passing framework initially developed to model information processing in the brain.
In this work, we build models that rely on the message-passing rule of predictive coding.
We show that the proposed models are comparable to standard ones in terms of performance in both inductive and transductive tasks.
arXiv Detail & Related papers (2022-12-09T03:58:22Z)
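For orientation, the message-passing rule of predictive coding in its generic Rao-and-Ballard form runs an inference loop in which bottom-up prediction errors refine latent estimates; the single linear layer below is a textbook sketch, not the paper's graph-specific model.

    import numpy as np

    rng = np.random.default_rng(0)
    W = 0.1 * rng.normal(size=(16, 8))   # top-down weights: latents predict observations
    x_obs = rng.normal(size=16)          # observed layer (clamped to the data)
    z = np.zeros(8)                      # latent layer, inferred iteratively

    gamma = 0.1                          # inference step size
    for _ in range(200):
        eps = x_obs - W @ z              # prediction error: the upward "message"
        z += gamma * (W.T @ eps)         # latents move to reduce the error below
    print(round(float(np.linalg.norm(x_obs - W @ z)), 3))  # residual error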
- An Expectation-Maximization Perspective on Federated Learning [75.67515842938299]
Federated learning describes the distributed training of models across multiple clients while keeping the data private on-device.
In this work, we view the server-orchestrated federated learning process as a hierarchical latent variable model where the server provides the parameters of a prior distribution over the client-specific model parameters.
We show that with simple Gaussian priors and a hard version of the well-known Expectation-Maximization (EM) algorithm, learning in such a model corresponds to FedAvg, the most popular algorithm for the federated learning setting.
arXiv Detail & Related papers (2021-11-19T12:58:59Z)
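The correspondence claimed above is easy to state: with an isotropic Gaussian prior over client parameters, the hard E-step fits each client locally around the prior mean and the M-step resets the prior mean to the average of the client solutions, which is exactly FedAvg's aggregation. A toy sketch, with quadratic client objectives standing in for real training:

    import numpy as np

    rng = np.random.default_rng(0)
    client_optima = rng.normal(size=(5, 3))   # per-client optimal parameters (toy)

    mu = np.zeros(3)                          # prior mean = global model
    for _ in range(20):
        # Hard E-step: each client's MAP solution under a quadratic loss plus
        # the Gaussian prior lands between its optimum and the prior mean.
        theta = 0.5 * client_optima + 0.5 * mu
        # M-step: prior mean <- average of client parameters (FedAvg step).
        mu = theta.mean(axis=0)

    print(np.round(mu - client_optima.mean(axis=0), 6))  # converges to the average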
- Data Summarization via Bilevel Optimization [48.89977988203108]
A simple yet powerful approach is to operate on small subsets of data.
In this work, we propose a generic coreset framework that formulates coreset selection as a cardinality-constrained bilevel optimization problem.
arXiv Detail & Related papers (2021-09-26T09:08:38Z)
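Written out, the cardinality-constrained bilevel problem referred to above takes the following generic form, with D the full dataset, S the coreset, k the budget, and \ell a training loss (notation chosen here for illustration):

    \min_{S \subseteq D,\; |S| \le k} \sum_{(x, y) \in D} \ell\big(x, y;\, \theta^*(S)\big)
    \quad \text{subject to} \quad
    \theta^*(S) \in \arg\min_{\theta} \sum_{(x, y) \in S} \ell(x, y;\, \theta)

In words: choose at most k points such that a model trained only on those points still does well on the full dataset.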
- GraphFormers: GNN-nested Transformers for Representation Learning on Textual Graph [53.70520466556453]
We propose GraphFormers, where layerwise GNN components are nested alongside the transformer blocks of language models.
With the proposed architecture, the text encoding and the graph aggregation are fused into an iterative workflow.
In addition, a progressive learning strategy is introduced, where the model is successively trained on manipulated data and original data to reinforce its capability of integrating information on the graph.
arXiv Detail & Related papers (2021-05-06T12:20:41Z)
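A minimal sketch of the nesting described above, with a mean-pooled neighbor-aggregation step interleaved between standard transformer encoder layers; the layer sizes, pooling, and residual injection are illustrative choices, not GraphFormers' exact components.

    import torch
    import torch.nn as nn

    class GNNNestedEncoder(nn.Module):
        def __init__(self, dim=64, nhead=4, nlayers=2):
            super().__init__()
            self.layers = nn.ModuleList(
                nn.TransformerEncoderLayer(dim, nhead, batch_first=True)
                for _ in range(nlayers)
            )

        def forward(self, tokens, adj):
            # tokens: (num_nodes, seq_len, dim); adj: (num_nodes, num_nodes)
            h = tokens
            for layer in self.layers:
                h = layer(h)                      # text-encoding step
                node = h.mean(dim=1)              # pool tokens to node vectors
                deg = adj.sum(1, keepdim=True).clamp(min=1)
                agg = (adj @ node) / deg          # average neighbor vectors
                h = h + agg.unsqueeze(1)          # inject graph info back
            return h.mean(dim=1)                  # final node embeddings

    x = torch.randn(5, 7, 64)                     # 5 nodes, 7 tokens each
    adj = (torch.rand(5, 5) > 0.5).float()
    print(GNNNestedEncoder()(x, adj).shape)       # torch.Size([5, 64])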