Federated Learning From Big Data Over Networks
- URL: http://arxiv.org/abs/2010.14159v1
- Date: Tue, 27 Oct 2020 09:39:46 GMT
- Title: Federated Learning From Big Data Over Networks
- Authors: Y. Sarcheshmehpour, M. Leinonen and A. Jung
- Abstract summary: This paper formulates and studies a novel algorithm for federated learning from large collections of local datasets.
We model such big data over networks using a networked linear regression model.
We obtain a distributed federated learning algorithm via a message passing implementation of this primal-dual method.
- Score: 1.2891210250935146
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: This paper formulates and studies a novel algorithm for federated learning
from large collections of local datasets. This algorithm capitalizes on an
intrinsic network structure that relates the local datasets via an undirected
"empirical" graph. We model such big data over networks using a networked
linear regression model. Each local dataset has individual regression weights.
The weights of close-knit sub-collections of local datasets are constrained to deviate only slightly. This leads naturally to a network Lasso problem, which we
solve using a primal-dual method. We obtain a distributed federated learning
algorithm via a message passing implementation of this primal-dual method. We
provide a detailed analysis of the statistical and computational properties of
the resulting federated learning algorithm.
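To make the pipeline in the abstract concrete, here is a minimal sketch, assuming squared-error local losses and a plain Arrow-Hurwicz-style primal-dual loop; variable names, step sizes, and the exact update order are illustrative choices, not the paper's algorithm. Each edge carries a dual variable that plays the role of a message passed between neighboring nodes.

```python
import numpy as np

def network_lasso_primal_dual(X, y, edges, lam=1.0, n_iters=200, tau=0.5, sigma=0.5):
    """Minimal sketch of networked linear regression with a network-Lasso
    penalty, solved by a basic primal-dual loop. (Illustrative names and
    step sizes; not the paper's exact message-passing scheme.)

    X, y  : lists of local feature matrices / label vectors, one per node
    edges : undirected edges (i, j) of the empirical graph
    lam   : network Lasso regularization strength
    """
    n_nodes, d = len(X), X[0].shape[1]
    w = np.zeros((n_nodes, d))       # local regression weights (primal variables)
    u = np.zeros((len(edges), d))    # one dual variable ("message") per edge

    for _ in range(n_iters):
        # dual ascent on each edge, then projection onto the lam-ball
        # (the prox of the conjugate of lam * ||.||_2)
        for e, (i, j) in enumerate(edges):
            u[e] += sigma * (w[i] - w[j])
            norm = np.linalg.norm(u[e])
            if norm > lam:
                u[e] *= lam / norm

        # primal descent at each node: local least-squares gradient plus
        # the dual messages arriving along incident edges
        for i in range(n_nodes):
            grad = X[i].T @ (X[i] @ w[i] - y[i])
            for e, (a, b) in enumerate(edges):
                if a == i:
                    grad += u[e]
                elif b == i:
                    grad -= u[e]
            w[i] -= tau * grad
    return w
```

In a distributed message-passing implementation, each node would only ever touch its own local dataset and the dual variables on its incident edges.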
Related papers
- Data Augmentations in Deep Weight Spaces [89.45272760013928]
We introduce a novel augmentation scheme based on the Mixup method.
We evaluate the performance of these techniques on existing benchmarks as well as new benchmarks we generate.
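The Mixup idea is simple enough to sketch. Below is a guess at how it might look when applied directly in weight space, interpolating two flattened weight vectors with a Beta-distributed coefficient; the function name and the choice to mix raw weights are our assumptions, not the paper's exact scheme.

```python
import numpy as np

def weight_space_mixup(w1, w2, alpha=0.2, rng=None):
    """Hypothetical Mixup-style augmentation in weight space: return a convex
    combination of two flattened weight vectors. (Our sketch, not the paper's
    exact augmentation; any targets would be mixed analogously.)"""
    rng = rng or np.random.default_rng()
    lam = rng.beta(alpha, alpha)        # standard Mixup mixing coefficient
    return lam * w1 + (1.0 - lam) * w2
```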
arXiv Detail & Related papers (2023-11-15T10:43:13Z)
- Tackling Computational Heterogeneity in FL: A Few Theoretical Insights [68.8204255655161]
We introduce and analyse a novel aggregation framework that allows for formalizing and tackling computational heterogeneous data.
The proposed aggregation algorithms are extensively analyzed from both a theoretical and an experimental perspective.
arXiv Detail & Related papers (2023-07-12T16:28:21Z)
- Towards Model-Agnostic Federated Learning over Networks [0.0]
We present a model-agnostic federated learning method for networks of heterogeneous data and models.
Our method is an instance of empirical risk minimization, with the regularization term derived from the network structure of data.
arXiv Detail & Related papers (2023-02-08T22:55:57Z)
- Network Gradient Descent Algorithm for Decentralized Federated Learning [0.2867517731896504]
We study a fully decentralized federated learning algorithm, a novel network gradient descent (NGD) method executed over a communication network.
In the NGD method, only statistics (e.g., parameter estimates) need to be communicated, minimizing the risk of privacy leakage.
We find that both the learning rate and the network structure play significant roles in determining the NGD estimator's statistical efficiency.
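Reading this as a standard decentralized gradient descent template, a minimal sketch might look like the following; the mixing matrix A, the least-squares local loss, and all hyperparameters are our illustrative choices rather than the paper's exact recursion.

```python
import numpy as np

def network_gd(X, y, A, lr=0.01, n_iters=500):
    """Sketch of a decentralized network-gradient-descent step: each node mixes
    its neighbors' current estimates using a doubly stochastic matrix A, then
    takes a local gradient step. Only parameter estimates are communicated.
    (A generic decentralized-GD template, not the paper's exact algorithm.)
    """
    n, d = len(X), X[0].shape[1]
    theta = np.zeros((n, d))
    for _ in range(n_iters):
        mixed = A @ theta                                   # neighbor averaging
        for i in range(n):
            grad = X[i].T @ (X[i] @ mixed[i] - y[i]) / len(y[i])
            theta[i] = mixed[i] - lr * grad                 # local gradient step
    return theta
```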
arXiv Detail & Related papers (2022-05-06T02:53:31Z)
- Clustered Federated Learning via Generalized Total Variation Minimization [83.26141667853057]
We study optimization methods to train local (or personalized) models for local datasets with a decentralized network structure.
Our main conceptual contribution is to formulate federated learning as generalized total variation (GTV) minimization.
Our main algorithmic contribution is a fully decentralized federated learning algorithm.
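For orientation, the GTV-regularized objective typically studied in this line of work has the form below; the symbols (local losses L_i, edge weights A_ij, penalty function phi) are our generic notation, not necessarily the paper's. Choosing phi(t) = t recovers the network Lasso of the main paper above.

```latex
\min_{\{w_i\}} \; \sum_{i \in \mathcal{V}} L_i(w_i)
  \;+\; \lambda \sum_{\{i,j\} \in \mathcal{E}} A_{ij}\,
  \phi\!\left(\lVert w_i - w_j \rVert_2\right)
```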
arXiv Detail & Related papers (2021-05-26T18:07:19Z)
- Exploiting Shared Representations for Personalized Federated Learning [54.65133770989836]
We propose a novel federated learning framework and algorithm for learning a shared data representation across clients and unique local heads for each client.
Our algorithm harnesses the distributed computational power across clients to perform many local updates of the low-dimensional local parameters for every update of the shared representation.
This result is of interest beyond federated learning to a broad class of problems in which we aim to learn a shared low-dimensional representation among data distributions.
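A hedged sketch of this alternating scheme, assuming a linear shared representation B and linear per-client heads (our simplification; the paper treats more general representations):

```python
import numpy as np

def shared_rep_round(B, heads, data, lr_head=0.1, lr_rep=0.01, local_steps=10):
    """One round of a shared-representation scheme (our sketch, not the authors'
    code): clients run many cheap updates on their low-dimensional head, then
    contribute a gradient for the shared representation B.

    B     : shared d x k representation matrix
    heads : dict client -> k-dimensional local head vector
    data  : dict client -> (X_i, y_i) with X_i of shape (m_i, d)
    """
    rep_grad = np.zeros_like(B)
    for c, (X, y) in data.items():
        Z = X @ B                                   # low-dim features (m_i, k)
        for _ in range(local_steps):                # many local head updates
            resid = Z @ heads[c] - y
            heads[c] -= lr_head * Z.T @ resid / len(y)
        resid = Z @ heads[c] - y                    # one representation gradient
        rep_grad += X.T @ np.outer(resid, heads[c]) / len(y)
    B -= lr_rep * rep_grad / len(data)              # aggregated update of B
    return B, heads
```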
arXiv Detail & Related papers (2021-02-14T05:36:25Z)
- Robustness to Missing Features using Hierarchical Clustering with Split Neural Networks [39.29536042476913]
We propose a simple yet effective approach that clusters similar input features together using hierarchical clustering.
We evaluate this approach on a series of benchmark datasets and show promising improvements even with simple imputation techniques.
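A minimal sketch of the feature-grouping step, assuming correlation distance between feature columns; the distance choice and cluster count are our illustrative assumptions, not necessarily the paper's.

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

def cluster_features(X, n_clusters=4):
    """Hierarchically cluster input features by correlation distance so that
    each group can feed its own sub-network. (Our reading of the approach;
    the paper's distance measure and cut rule may differ.)"""
    corr = np.corrcoef(X, rowvar=False)      # feature-feature correlations
    dist = 1.0 - np.abs(corr)                # similar features -> small distance
    iu = np.triu_indices_from(dist, k=1)     # condensed upper-triangle vector
    Z = linkage(dist[iu], method="average")
    return fcluster(Z, t=n_clusters, criterion="maxclust")  # label per feature
```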
arXiv Detail & Related papers (2020-11-19T00:35:08Z)
- Pre-Trained Models for Heterogeneous Information Networks [57.78194356302626]
We propose a self-supervised pre-training and fine-tuning framework, PF-HIN, to capture the features of a heterogeneous information network.
PF-HIN consistently and significantly outperforms state-of-the-art alternatives on each of these tasks across four datasets.
arXiv Detail & Related papers (2020-07-07T03:36:28Z)
- FedPD: A Federated Learning Framework with Optimal Rates and Adaptivity to Non-IID Data [59.50904660420082]
Federated Learning (FL) has become a popular paradigm for learning from distributed data.
To effectively utilize data at different devices without moving them to the cloud, algorithms such as the Federated Averaging (FedAvg) have adopted a "computation then aggregation" (CTA) model.
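The CTA pattern is easy to illustrate; below is a minimal FedAvg-style round, with a least-squares local objective standing in for the general local loss (the loss and all hyperparameters are our illustrative choices).

```python
import numpy as np

def fedavg_round(global_w, clients, local_steps=5, lr=0.1):
    """Minimal FedAvg round showing "computation then aggregation" (CTA):
    clients compute locally, the server averages the resulting models.
    (Sketch only; the local gradient is a least-squares stand-in.)"""
    updates, sizes = [], []
    for X, y in clients:                       # each client holds (X_i, y_i)
        w = global_w.copy()
        for _ in range(local_steps):           # local computation phase
            grad = X.T @ (X @ w - y) / len(y)
            w -= lr * grad
        updates.append(w)
        sizes.append(len(y))
    # aggregation phase: data-size weighted average of local models
    return np.average(np.stack(updates), axis=0, weights=np.asarray(sizes, float))
```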
arXiv Detail & Related papers (2020-05-22T23:07:42Z)
- Fast local linear regression with anchor regularization [21.739281173516247]
We propose a simple yet effective local model training algorithm called the fast anchor regularized local linear method (FALL).
Through experiments on synthetic and real-world datasets, we demonstrate that FALL compares favorably in terms of accuracy with the state-of-the-art network Lasso algorithm.
arXiv Detail & Related papers (2020-02-21T10:03:33Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences of its use.