Federated Hetero-Task Learning
- URL: http://arxiv.org/abs/2206.03436v1
- Date: Tue, 7 Jun 2022 16:43:09 GMT
- Title: Federated Hetero-Task Learning
- Authors: Liuyi Yao, Dawei Gao, Zhen Wang, Yuexiang Xie, Weirui Kuang, Daoyuan
Chen, Haohui Wang, Chenhe Dong, Bolin Ding, Yaliang Li
- Abstract summary: We present B-FHTL, a federated hetero-task learning benchmark consisting of simulation datasets, FL protocols, and a unified evaluation mechanism.
To ensure fair comparison among different FL algorithms, B-FHTL builds in a full suite of FL protocols.
We compare FL algorithms from the fields of federated multi-task learning, federated personalization, and federated meta-learning within B-FHTL.
- Score: 42.985155807178685
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: To investigate the heterogeneity of federated learning in real-world
scenarios, we generalize classical federated learning to federated
hetero-task learning, which emphasizes the inconsistency across the
participants in federated learning in terms of both data distribution and
learning tasks. We also present B-FHTL, a federated hetero-task learning
benchmark consisting of simulation datasets, FL protocols, and a unified
evaluation mechanism. The B-FHTL dataset contains three well-designed federated
learning tasks with increasing heterogeneity. Each task simulates clients
with different data distributions and learning tasks. To ensure fair comparison
among different FL algorithms, B-FHTL builds in a full suite of FL protocols by
providing high-level APIs to avoid privacy leakage, and presets the most common
evaluation metrics spanning different learning tasks, such as
regression, classification, and text generation. Furthermore, we compare
FL algorithms from the fields of federated multi-task learning, federated
personalization, and federated meta-learning within B-FHTL, and highlight the
influence of heterogeneity and the difficulties of federated hetero-task learning.
Our benchmark, including the federated dataset, protocols, evaluation
mechanism, and preliminary experiments, is open-sourced at
https://github.com/alibaba/FederatedScope/tree/contest/v1.0.
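The setting is easiest to see in code. Below is a minimal, self-contained sketch of federated hetero-task learning; it is not the B-FHTL or FederatedScope API, and all names are illustrative. Each client owns a different task and a private task-specific head, while only a shared backbone is averaged FedAvg-style.

```python
import numpy as np

# Illustrative sketch (hypothetical names, not the B-FHTL API): clients share
# a backbone but keep task-specific heads private.
rng = np.random.default_rng(0)
DIM, HIDDEN = 8, 4

class Client:
    def __init__(self, task):
        self.task = task                                  # "regression" or "classification"
        self.backbone = rng.normal(size=(DIM, HIDDEN))    # shared parameters
        self.head = rng.normal(size=(HIDDEN,))            # private, task-specific
        self.X = rng.normal(size=(32, DIM))
        if task == "regression":
            self.y = self.X.sum(axis=1)
        else:
            self.y = (self.X.sum(axis=1) > 0).astype(float)

    def local_step(self, lr=0.01):
        h = self.X @ self.backbone                        # shared representation
        pred = h @ self.head
        if self.task == "classification":
            pred = 1.0 / (1.0 + np.exp(-pred))            # sigmoid for binary labels
        err = pred - self.y                               # residual drives both gradients
        self.head -= lr * h.T @ err / len(self.X)
        self.backbone -= lr * self.X.T @ np.outer(err, self.head) / len(self.X)

def fed_avg(clients):
    """Server step: average only the shared backbone; heads never leave clients."""
    mean = np.mean([c.backbone for c in clients], axis=0)
    for c in clients:
        c.backbone = mean.copy()

clients = [Client("regression"), Client("regression"), Client("classification")]
for rnd in range(20):                                     # communication rounds
    for c in clients:
        c.local_step()
    fed_avg(clients)
```

The split between what is aggregated (backbone) and what stays local (head, data, labels) is exactly where the heterogeneity studied by the benchmark enters: clients disagree not only on data distribution but on the loss being optimized.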
Related papers
- Comparative Evaluation of Clustered Federated Learning Methods [0.5242869847419834]
Clustered Federated Learning (CFL) aims to partition clients into groups whose data distributions are homogeneous.
In this paper, we explore the performance of two state-of-the-art CFL algorithms with respect to a proposed taxonomy of data heterogeneities in federated learning (FL).
Our objective is to provide a clearer understanding of the relationship between CFL performance and data-heterogeneity scenarios.
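As an illustration only (the surveyed CFL algorithms differ in detail), such methods typically group clients by the similarity of their model updates. The hypothetical sketch below clusters clients by cosine similarity of their flattened updates.

```python
import numpy as np

def cosine_sim(a, b):
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12))

def cluster_clients(updates, threshold=0.5):
    """Greedy illustration of CFL-style grouping: clients whose flattened
    updates point in similar directions land in the same cluster."""
    clusters = []
    for i, u in enumerate(updates):
        for cluster in clusters:
            if all(cosine_sim(u, updates[j]) > threshold for j in cluster):
                cluster.append(i)
                break
        else:                                  # no similar cluster found
            clusters.append([i])
    return clusters

rng = np.random.default_rng(1)
base_a = rng.normal(size=10)
base_b = rng.normal(size=10)
base_b -= base_a * (base_a @ base_b) / (base_a @ base_a)   # orthogonal direction
updates = [base_a + 0.1 * rng.normal(size=10) for _ in range(3)] + \
          [base_b + 0.1 * rng.normal(size=10) for _ in range(3)]
print(cluster_clients(updates))   # two groups with homogeneous update directions
```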
arXiv Detail & Related papers (2024-10-18T07:01:56Z)
- Addressing Skewed Heterogeneity via Federated Prototype Rectification with Personalization [35.48757125452761]
Federated learning is an efficient framework designed to facilitate collaborative model training across multiple distributed devices.
A significant challenge of federated learning is data-level heterogeneity, i.e., skewed or long-tailed distribution of private data.
We propose a novel Federated Prototype Rectification with Personalization framework, which consists of two parts: Federated Personalization and Federated Prototype Rectification.
arXiv Detail & Related papers (2024-08-15T06:26:46Z)
- Efficient Cluster Selection for Personalized Federated Learning: A Multi-Armed Bandit Approach [2.5477011559292175]
Federated learning (FL) offers a decentralized training approach for machine learning models, prioritizing data privacy.
In this paper, we introduce a dynamic Upper Confidence Bound (dUCB) algorithm inspired by the multi-armed bandit (MAB) approach.
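The summary gives little detail on dUCB itself, so the following is only a generic UCB-style sketch of the underlying idea, with hypothetical names and rewards: a client treats candidate clusters as bandit arms and picks the one with the best optimism-adjusted reward.

```python
import math
import random

def ucb_select(counts, means, t, c=2.0):
    """Pick the arm (candidate cluster) maximizing mean + exploration bonus."""
    for arm, n in enumerate(counts):
        if n == 0:                 # try every cluster at least once
            return arm
    return max(range(len(counts)),
               key=lambda a: means[a] + math.sqrt(c * math.log(t) / counts[a]))

true_quality = [0.2, 0.5, 0.8]     # hypothetical per-cluster reward (e.g., -val_loss)
counts = [0] * 3
means = [0.0] * 3
random.seed(0)
for t in range(1, 201):            # federated rounds
    arm = ucb_select(counts, means, t)
    reward = true_quality[arm] + random.gauss(0, 0.1)
    counts[arm] += 1
    means[arm] += (reward - means[arm]) / counts[arm]   # running average
print(counts)  # pulls should concentrate on the best cluster over time
```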
arXiv Detail & Related papers (2023-10-29T16:46:50Z)
- Generalizable Heterogeneous Federated Cross-Correlation and Instance Similarity Learning [60.058083574671834]
This paper presents FCCL+, a novel federated correlation and similarity learning method with non-target distillation.
To address the heterogeneity issue, it leverages irrelevant unlabeled public data for communication.
To mitigate catastrophic forgetting in the local updating stage, FCCL+ introduces Federated Non-Target Distillation.
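As a rough illustration of the "correlation learning on public data" idea only (this is not the exact FCCL+ objective), one can align the cross-correlation matrix between local and global embeddings of the same public batch toward the identity:

```python
import numpy as np

def cross_correlation_loss(z_local, z_global, off_diag_weight=0.005):
    """Barlow-Twins-style alignment (illustrative, not FCCL+ itself):
    standardize embeddings over the public batch, then push the
    cross-correlation matrix toward the identity."""
    z1 = (z_local - z_local.mean(0)) / (z_local.std(0) + 1e-6)
    z2 = (z_global - z_global.mean(0)) / (z_global.std(0) + 1e-6)
    corr = z1.T @ z2 / len(z1)                 # (dim, dim) cross-correlation
    on_diag = np.sum((np.diagonal(corr) - 1.0) ** 2)
    off_diag = np.sum(corr ** 2) - np.sum(np.diagonal(corr) ** 2)
    return on_diag + off_diag_weight * off_diag

rng = np.random.default_rng(2)
z = rng.normal(size=(64, 16))                  # embeddings of a public batch
print(cross_correlation_loss(z, z + 0.05 * rng.normal(size=(64, 16))))
```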
arXiv Detail & Related papers (2023-09-28T09:32:27Z)
- UNIDEAL: Curriculum Knowledge Distillation Federated Learning [17.817181326740698]
Federated Learning (FL) has emerged as a promising approach to enable collaborative learning among multiple clients.
In this paper, we present UNIDEAL, a novel FL algorithm specifically designed to tackle the challenges of cross-domain scenarios.
Our results demonstrate that UNI achieves superior performance in terms of both model accuracy and communication efficiency.
arXiv Detail & Related papers (2023-09-16T11:30:29Z)
- Collaborating Heterogeneous Natural Language Processing Tasks via Federated Learning [55.99444047920231]
We conduct extensive experiments on six widely-used datasets covering both Natural Language Understanding (NLU) and Natural Language Generation (NLG) tasks.
The proposed ATC framework achieves significant improvements compared with various baseline methods.
arXiv Detail & Related papers (2022-12-12T09:27:50Z)
- FedGradNorm: Personalized Federated Gradient-Normalized Multi-Task Learning [50.756991828015316]
Multi-task learning (MTL) is a framework for learning several tasks simultaneously with a single shared network.
We propose FedGradNorm, which uses a dynamic-weighting method to normalize gradient norms in order to balance learning speeds across different tasks.
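FedGradNorm's exact update rule is in the paper; the fragment below only sketches the generic GradNorm-style idea it builds on, with hypothetical names and toy numbers: per-task loss weights are rescaled so that slower-learning tasks receive proportionally larger gradients.

```python
import numpy as np

def rebalance_task_weights(grad_norms, losses, init_losses, alpha=1.0):
    """GradNorm-style dynamic weighting (illustrative, not FedGradNorm itself):
    tasks whose loss has decayed least get their gradients scaled up."""
    grad_norms = np.asarray(grad_norms, dtype=float)
    inv_rate = np.asarray(losses) / np.asarray(init_losses)   # training-speed proxy
    rel_inv_rate = inv_rate / inv_rate.mean()
    target = grad_norms.mean() * rel_inv_rate ** alpha        # desired gradient norms
    weights = target / (grad_norms + 1e-12)                   # scale toward target
    return weights * len(weights) / weights.sum()             # renormalize weights

# Task 0 is learning fast (loss fell to 20% of its start), task 1 is stuck at 90%:
# the stuck task ends up with the larger loss weight.
print(rebalance_task_weights(grad_norms=[2.0, 0.5],
                             losses=[0.2, 0.9],
                             init_losses=[1.0, 1.0]))
```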
arXiv Detail & Related papers (2022-03-24T17:43:12Z)
- CoFED: Cross-silo Heterogeneous Federated Multi-task Learning via Co-training [11.198612582299813]
Federated Learning (FL) is a machine learning technique that enables participants to train high-quality models collaboratively without exchanging their private data.
We propose a communication-efficient FL scheme, CoFED, based on pseudo-labeling unlabeled data in the manner of co-training.
Experimental results show that CoFED achieves better performance with a lower communication cost.
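CoFED's full protocol has more machinery than can be shown here; the snippet below is only a bare pseudo-labeling step in the co-training spirit the summary describes, with all names hypothetical: a participant labels a shared unlabeled pool with its own model and passes on only its confident predictions for others to train on.

```python
import numpy as np

def pseudo_label(prob_matrix, threshold=0.9):
    """Keep unlabeled examples one model labels confidently; other participants
    can then train on them, co-training style (illustrative, not CoFED itself)."""
    conf = prob_matrix.max(axis=1)
    keep = conf >= threshold
    return np.flatnonzero(keep), prob_matrix.argmax(axis=1)[keep]

rng = np.random.default_rng(3)
logits = rng.normal(size=(100, 5)) * 3          # one client's scores on a shared pool
probs = np.exp(logits) / np.exp(logits).sum(axis=1, keepdims=True)
idx, labels = pseudo_label(probs)
print(f"{len(idx)} of 100 public examples pseudo-labeled for the other clients")
```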
arXiv Detail & Related papers (2022-02-17T11:34:20Z)
- Heterogeneous Federated Learning via Grouped Sequential-to-Parallel Training [60.892342868936865]
Federated learning (FL) is a rapidly growing privacy-preserving collaborative machine learning paradigm.
We propose a data heterogeneous-robust FL approach, FedGSP, to address this challenge.
We show that FedGSP improves the accuracy by 3.7% on average compared with seven state-of-the-art approaches.
arXiv Detail & Related papers (2022-01-31T03:15:28Z)
- Local Learning Matters: Rethinking Data Heterogeneity in Federated Learning [61.488646649045215]
Federated learning (FL) is a promising strategy for performing privacy-preserving, distributed learning with a network of clients (i.e., edge devices).
arXiv Detail & Related papers (2021-11-28T19:03:39Z)