FedMT: Federated Learning with Mixed-type Labels
- URL: http://arxiv.org/abs/2210.02042v4
- Date: Thu, 15 Feb 2024 16:58:34 GMT
- Title: FedMT: Federated Learning with Mixed-type Labels
- Authors: Qiong Zhang, Jing Peng, Xin Zhang, Aline Talhouk, Gang Niu, Xiaoxiao Li
- Abstract summary: In federated learning (FL), classifiers are trained on datasets from multiple data centers without exchanging data across them.
This limitation becomes particularly notable in domains like disease diagnosis, where different clinical centers may adhere to different standards.
This paper addresses this important yet under-explored setting of FL, namely FL with mixed-type labels.
We introduce a model-agnostic approach called FedMT, which estimates label space correspondences and projects classification scores to construct loss functions.
- Score: 41.272679061636794
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: In federated learning (FL), classifiers (e.g., deep networks) are trained on
datasets from multiple data centers without exchanging data across them, which
improves the sample efficiency. However, the conventional FL setting assumes
the same labeling criterion in all data centers involved, thus limiting its
practical utility. This limitation becomes particularly notable in domains like
disease diagnosis, where different clinical centers may adhere to different
standards, making traditional FL methods unsuitable. This paper addresses this
important yet under-explored setting of FL, namely FL with mixed-type labels,
where the allowance of different labeling criteria introduces inter-center
label space differences. To address this challenge effectively and efficiently,
we introduce a model-agnostic approach called FedMT, which estimates label
space correspondences and projects classification scores to construct loss
functions. The proposed FedMT is versatile and integrates seamlessly with
various FL methods, such as FedAvg. Experimental results on benchmark and
medical datasets highlight the substantial improvement in classification
accuracy achieved by FedMT in the presence of mixed-type labels.
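The core idea described in the abstract — projecting classification scores through an estimated label-space correspondence before computing the loss — can be sketched as follows. This is a minimal illustration under assumed shapes, not the paper's implementation: the correspondence matrix `M` and the toy grouping of four global classes into two coarse client classes are hypothetical.

```python
import numpy as np

def softmax(z):
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def projected_cross_entropy(logits, coarse_label, M):
    """Cross-entropy on classification scores projected into a
    client's local label space.

    logits:       (K,) scores over the K global classes
    coarse_label: index into the client's C local classes
    M:            (C, K) estimated correspondence matrix; M[c, k] is the
                  estimated weight with which global class k maps to
                  local class c
    """
    p_global = softmax(logits)      # scores in the global label space
    p_local = M @ p_global          # project into the local label space
    return -np.log(p_local[coarse_label] + 1e-12)

# Toy example: 4 global classes, a client that labels with 2 coarse classes.
M = np.array([[1.0, 1.0, 0.0, 0.0],   # local class 0 <- global {0, 1}
              [0.0, 0.0, 1.0, 1.0]])  # local class 1 <- global {2, 3}
loss = projected_cross_entropy(np.array([2.0, 0.1, -1.0, 0.3]), 0, M)
```

Because the projection happens only inside the loss, the shared model itself is unchanged, which is what makes the approach compatible with aggregation schemes such as FedAvg.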
Related papers
- Benchmarking Federated Learning for Semantic Datasets: Federated Scene Graph Generation [3.499870393443268]
Federated learning (FL) has recently garnered attention as a data-decentralized training framework.
We propose a benchmark process to establish an FL benchmark with controllable semantic heterogeneity across clients.
As a proof of concept, we first construct a federated PSG benchmark, demonstrating the efficacy of the existing PSG methods in an FL setting.
arXiv Detail & Related papers (2024-12-11T08:10:46Z) - On the Impact of Data Heterogeneity in Federated Learning Environments with Application to Healthcare Networks [3.9058850780464884]
Federated Learning (FL) allows privacy-sensitive applications to leverage their dataset for a global model construction without any disclosure of the information.
One of those domains is healthcare, where groups of silos collaborate in order to generate a global predictor with improved accuracy and generalization.
This paper presents a comprehensive exploration of the mathematical formalization and taxonomy of heterogeneity within FL environments, focusing on the intricacies of medical data.
arXiv Detail & Related papers (2024-04-29T09:05:01Z) - FedAnchor: Enhancing Federated Semi-Supervised Learning with Label Contrastive Loss for Unlabeled Clients [19.3885479917635]
Federated learning (FL) is a distributed learning paradigm that facilitates collaborative training of a shared global model across devices.
We propose FedAnchor, an innovative FSSL method that introduces a unique double-head structure, called anchor head, paired with the classification head trained exclusively on labeled anchor data on the server.
Our approach mitigates the confirmation bias and overfitting issues associated with pseudo-labeling techniques that rely on high-confidence model predictions.
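The confirmation-bias problem FedAnchor targets comes from the standard practice of keeping only high-confidence predictions as pseudo-labels. A minimal sketch of that baseline filtering step (function name and threshold are illustrative, not from the paper):

```python
import numpy as np

def confident_pseudo_labels(probs, threshold=0.95):
    """Keep pseudo-labels only where the model's top class probability
    exceeds a confidence threshold -- the standard high-confidence
    filtering whose confirmation bias FedAnchor aims to reduce.

    probs: (N, K) predicted class probabilities for N unlabeled samples
    Returns (indices, labels) of the retained samples.
    """
    conf = probs.max(axis=1)
    keep = np.where(conf >= threshold)[0]
    return keep, probs[keep].argmax(axis=1)

probs = np.array([[0.98, 0.01, 0.01],   # confident   -> kept, label 0
                  [0.40, 0.35, 0.25],   # uncertain   -> dropped
                  [0.02, 0.96, 0.02]])  # confident   -> kept, label 1
idx, labels = confident_pseudo_labels(probs, threshold=0.95)
```

The bias arises because the model's own confident mistakes pass the filter and get reinforced; FedAnchor's anchor head is one way to supply a second, label-grounded signal.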
arXiv Detail & Related papers (2024-02-15T18:48:21Z) - FedNoisy: Federated Noisy Label Learning Benchmark [53.73816587601204]
Federated learning has gained popularity for distributed learning without aggregating sensitive data from clients.
The distributed and isolated nature of data in FL can be complicated by data quality issues, making it more vulnerable to noisy labels.
We present the first standardized benchmark that helps researchers fully explore potential federated noisy-label settings.
arXiv Detail & Related papers (2023-06-20T16:18:14Z) - FedWon: Triumphing Multi-domain Federated Learning Without Normalization [50.49210227068574]
Federated learning (FL) enhances data privacy with collaborative in-situ training on decentralized clients.
However, FL encounters challenges with non-independent and identically distributed (non-i.i.d.) data.
We propose a novel method called Federated learning Without normalizations (FedWon) to address the multi-domain problem in FL.
arXiv Detail & Related papers (2023-06-09T13:18:50Z) - Optimizing Federated Learning for Medical Image Classification on Distributed Non-iid Datasets with Partial Labels [3.6704226968275258]
FedFBN is a federated learning framework that draws inspiration from transfer learning by using pretrained networks as the model backend.
We evaluate FedFBN with current FL strategies using synthetic iid toy datasets and large-scale non-iid datasets across scenarios with partial and complete labels.
arXiv Detail & Related papers (2023-03-10T19:23:33Z) - FLAG: Fast Label-Adaptive Aggregation for Multi-label Classification in Federated Learning [1.4280238304844592]
This study proposes a new multi-label federated learning framework with a Clustering-based Multi-label Data Allocation (CMDA) and a novel aggregation method, Fast Label-Adaptive Aggregation (FLAG).
The experimental results demonstrate that our methods only need less than 50% of training epochs and communication rounds to surpass the performance of state-of-the-art federated learning methods.
arXiv Detail & Related papers (2023-02-27T08:16:39Z) - Local Learning Matters: Rethinking Data Heterogeneity in Federated Learning [61.488646649045215]
Federated learning (FL) is a promising strategy for performing privacy-preserving, distributed learning with a network of clients (i.e., edge devices)
arXiv Detail & Related papers (2021-11-28T19:03:39Z) - Federated Semi-supervised Medical Image Classification via Inter-client Relation Matching [58.26619456972598]
Federated learning (FL) has emerged with increasing popularity to collaborate distributed medical institutions for training deep networks.
This paper studies a practical yet challenging FL problem, named Federated Semi-supervised Learning (FSSL).
We present a novel approach for this problem, which improves over traditional consistency regularization mechanism with a new inter-client relation matching scheme.
arXiv Detail & Related papers (2021-06-16T07:58:00Z) - PseudoSeg: Designing Pseudo Labels for Semantic Segmentation [78.35515004654553]
We present a re-design of pseudo-labeling to generate structured pseudo labels for training with unlabeled or weakly-labeled data.
We demonstrate the effectiveness of the proposed pseudo-labeling strategy in both low-data and high-data regimes.
arXiv Detail & Related papers (2020-10-19T17:59:30Z)
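The common pattern behind pseudo-label training, as in PseudoSeg, is one-way consistency: predictions on a weakly augmented view produce hard pseudo-labels that supervise a strongly augmented view. A minimal numeric sketch of that loss, under assumed array shapes (not the paper's segmentation-specific design):

```python
import numpy as np

def pseudo_label_loss(p_weak, logits_strong):
    """One-way consistency: argmax pseudo-labels from the weakly
    augmented view (treated as fixed, no gradient) supervise the
    strongly augmented view via cross-entropy.

    p_weak:        (N, K) probabilities from the weak view
    logits_strong: (N, K) logits from the strong view
    """
    pseudo = p_weak.argmax(axis=1)  # hard pseudo-labels
    # log-softmax of the strong-view logits, numerically stable
    z = logits_strong - logits_strong.max(axis=1, keepdims=True)
    log_p = z - np.log(np.exp(z).sum(axis=1, keepdims=True))
    return -log_p[np.arange(len(pseudo)), pseudo].mean()

p_weak = np.array([[0.9, 0.1]])         # weak view is confident in class 0
logits_strong = np.array([[2.0, 0.0]])  # strong view mostly agrees
loss = pseudo_label_loss(p_weak, logits_strong)
```

In the low-data regime this supplies a training signal for unlabeled samples; the design choices (hard vs. soft pseudo-labels, which view supervises which) are exactly what papers like PseudoSeg and FedAnchor vary.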
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information provided and is not responsible for any consequences.