Heterogeneous Federated Learning
- URL: http://arxiv.org/abs/2008.06767v2
- Date: Sun, 20 Mar 2022 03:19:33 GMT
- Title: Heterogeneous Federated Learning
- Authors: Fuxun Yu, Weishan Zhang, Zhuwei Qin, Zirui Xu, Di Wang, Chenchen Liu,
Zhi Tian, Xiang Chen
- Abstract summary: Federated learning learns from scattered data by fusing collaborative models from local nodes.
Due to chaotic information distribution, model fusion may suffer from structural misalignment, in which unmatched parameters are fused together.
We propose a novel federated learning framework to establish a firm structure-information alignment across collaborative models.
- Score: 41.04946606973614
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Federated learning learns from scattered data by fusing collaborative models
from local nodes. However, because information is chaotically distributed
across these nodes, model fusion may suffer from structural misalignment, in
which unmatched parameters are fused together. In this work, we propose a
novel federated learning framework that resolves this issue by establishing a
firm structure-information alignment across collaborative models.
Specifically, we design a feature-oriented regulation method ({$\Psi$-Net}) to
ensure an explicit allocation of feature information to designated neural
network structures. By applying this regulation to the collaborative models,
matchable structures with similar feature information can be initialized at
the very early training stage. During the federated learning process, under
either IID or non-IID scenarios, dedicated collaboration schemes further
guarantee ordered information distribution with definite structure matching,
and thus comprehensive model alignment. Ultimately, this framework effectively
enhances the applicability of federated learning to extensive heterogeneous
settings, while providing excellent convergence speed, accuracy, and
computation/communication efficiency.
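To make the structure-matching idea concrete, the following is a minimal, hypothetical Python sketch of the aggregation step this alignment enables: once feature information has been allocated to fixed structural groups, the server only ever averages parameters that share the same group. The function name, group names, and weighting scheme are illustrative assumptions, not the paper's {$\Psi$-Net} implementation.

```python
# Minimal illustrative sketch, not the authors' Psi-Net implementation.
from typing import Dict, List

# Each client's model is represented as {feature_group_name: parameter_values};
# the grouping is assumed to have been fixed at initialization, so every client
# allocates the same feature role to the same structural group.
Params = Dict[str, List[float]]

def structure_matched_average(client_params: List[Params],
                              client_weights: List[float]) -> Params:
    """Weighted averaging that only ever fuses parameters sharing a feature group."""
    total = sum(client_weights)
    fused: Params = {}
    for group, reference in client_params[0].items():
        fused[group] = [
            sum(w * p[group][i] for p, w in zip(client_params, client_weights)) / total
            for i in range(len(reference))
        ]
    return fused

# Example: two clients, two feature groups, weighted by local dataset size.
clients = [
    {"group_a": [0.25, 0.5], "group_b": [1.0]},
    {"group_a": [0.75, 1.5], "group_b": [3.0]},
]
print(structure_matched_average(clients, client_weights=[1.0, 3.0]))
# -> {'group_a': [0.625, 1.25], 'group_b': [2.5]}
```

Because every client assigns the same feature role to the same group, the averaging above never mixes mismatched parameters, which is the alignment property the abstract describes.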
Related papers
- Federated Learning with Projected Trajectory Regularization [65.6266768678291]
Federated learning enables joint training of machine learning models from distributed clients without sharing their local data.
One key challenge in federated learning is to handle non-identically distributed data across the clients.
We propose a novel federated learning framework with projected trajectory regularization (FedPTR) for tackling the data heterogeneity issue.
arXiv Detail & Related papers (2023-12-22T02:12:08Z)
- Personalizing Federated Learning with Over-the-Air Computations [84.8089761800994]
Federated edge learning is a promising technology to deploy intelligence at the edge of wireless networks in a privacy-preserving manner.
Under such a setting, multiple clients collaboratively train a global generic model under the coordination of an edge server.
This paper presents a distributed training paradigm that employs analog over-the-air computation to address the communication bottleneck.
arXiv Detail & Related papers (2023-02-24T08:41:19Z)
- Heterogeneous Ensemble Knowledge Transfer for Training Large Models in Federated Learning [22.310090483499035]
Federated learning (FL) enables edge-devices to collaboratively learn a model without disclosing their private data to a central aggregating server.
Most existing FL algorithms require models of identical architecture to be deployed across the clients and server.
We propose a novel ensemble knowledge transfer method named Fed-ET in which small models are trained on clients, and used to train a larger model at the server.
arXiv Detail & Related papers (2022-04-27T05:18:32Z)
- A Framework for Verifiable and Auditable Federated Anomaly Detection [3.639790324866155]
Federated Learning is an emerging approach to managing cooperation among a group of agents for solving Machine Learning tasks.
We present a novel algorithmic architecture that tackles this problem in the particular case of Anomaly Detection.
arXiv Detail & Related papers (2022-03-15T11:34:02Z)
- Personalized Federated Learning With Structure [24.566947384179837]
We propose a novel structured federated learning (SFL) framework to simultaneously learn the global model and personalized models.
In contrast to a pre-defined structure, our framework could be further enhanced by adding a structure learning component to automatically learn the structure.
arXiv Detail & Related papers (2022-03-02T02:43:51Z)
- Fed2: Feature-Aligned Federated Learning [32.54574459692627]
Federated learning learns from scattered data by fusing collaborative models from local nodes.
We propose Fed2, a feature-aligned federated learning framework that resolves the model-fusion misalignment issue by establishing a firm structure-feature alignment.
Fed2 can effectively enhance federated learning convergence performance under extensive homogeneous and heterogeneous settings.
arXiv Detail & Related papers (2021-11-28T22:21:48Z)
- Clustered Federated Learning via Generalized Total Variation Minimization [83.26141667853057]
We study optimization methods to train local (or personalized) models for local datasets with a decentralized network structure.
Our main conceptual contribution is to formulate federated learning as generalized total variation (GTV) minimization.
Our main algorithmic contribution is a fully decentralized federated learning algorithm.
arXiv Detail & Related papers (2021-05-26T18:07:19Z)
- Edge-assisted Democratized Learning Towards Federated Analytics [67.44078999945722]
We show the hierarchical learning structure of the proposed edge-assisted democratized learning mechanism, namely Edge-DemLearn.
We also validate Edge-DemLearn as a flexible model training mechanism to build a distributed control and aggregation methodology in regions.
arXiv Detail & Related papers (2020-12-01T11:46:03Z)
- Federated Residual Learning [53.77128418049985]
We study a new form of federated learning where the clients train personalized local models and make predictions jointly with the server-side shared model.
Using this new federated learning framework, the complexity of the central shared model can be minimized while still gaining all the performance benefits that joint training provides.
arXiv Detail & Related papers (2020-03-28T19:55:24Z)
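As a concrete illustration of the prediction rule described in the Federated Residual Learning entry above, below is a minimal, hypothetical Python sketch in which a client combines the server-side shared model with its own local residual model additively. The linear models and the additive combination are simplifying assumptions for illustration, not the paper's exact formulation.

```python
# Minimal illustrative sketch of server-model-plus-local-residual prediction;
# the linear models and the additive combination are simplifying assumptions.
from dataclasses import dataclass
from typing import List

@dataclass
class LinearModel:
    weights: List[float]
    bias: float

    def predict(self, x: List[float]) -> float:
        return sum(w * xi for w, xi in zip(self.weights, x)) + self.bias

def joint_predict(shared: LinearModel, residual: LinearModel, x: List[float]) -> float:
    """Client-side prediction: the shared server model plus this client's residual model."""
    return shared.predict(x) + residual.predict(x)

# The shared model captures the global trend; the residual model corrects for
# this client's local data distribution and never leaves the client.
shared_model = LinearModel(weights=[0.5, -0.25], bias=0.1)
local_residual = LinearModel(weights=[0.0, 0.3], bias=-0.05)
print(joint_predict(shared_model, local_residual, [2.0, 4.0]))  # ~1.25
```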
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of this content (including all information) and is not responsible for any consequences.