Tackling Noisy Clients in Federated Learning with End-to-end Label Correction
- URL: http://arxiv.org/abs/2408.04301v1
- Date: Thu, 8 Aug 2024 08:35:32 GMT
- Title: Tackling Noisy Clients in Federated Learning with End-to-end Label Correction
- Authors: Xuefeng Jiang, Sheng Sun, Jia Li, Jingjing Xue, Runhan Li, Zhiyuan Wu, Gang Xu, Yuwei Wang, Min Liu
- Abstract summary: We propose a two-stage framework FedELC to tackle this complicated label noise issue.
The first stage aims to guide the detection of noisy clients with higher label noise.
The second stage aims to correct the labels of noisy clients' data via an end-to-end label correction framework.
- Score: 20.64304520865249
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Recently, federated learning (FL) has achieved wide success in diverse privacy-sensitive applications without sacrificing clients' sensitive private information. However, the data quality of client datasets cannot be guaranteed, since the annotations of different clients often contain complex label noise of varying degrees, which inevitably causes performance degradation. Intuitively, this degradation is dominated by clients with higher noise rates, since their trained models contain more misinformation from data; it is therefore necessary to devise an effective optimization scheme to mitigate the negative impacts of these noisy clients. In this work, we propose a two-stage framework, FedELC, to tackle this complicated label noise issue. The first stage guides the detection of noisy clients with higher label noise, while the second stage corrects the labels of noisy clients' data via an end-to-end label correction framework, which learns possible ground-truth labels of noisy clients' datasets via back propagation. We implement sixteen related methods and evaluate them on five datasets with three types of complicated label noise scenarios for a comprehensive comparison. Extensive experimental results demonstrate that our proposed framework achieves superior performance over its counterparts in different scenarios. Additionally, our label correction framework effectively improves the data quality of detected noisy clients' local datasets. The code is available at https://github.com/Sprinter1999/FedELC.
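To make the second stage concrete, here is a minimal, self-contained sketch of learning labels by back propagation: each training sample gets a learnable distribution over classes, optimized jointly with the model. The initialization scale, learning rates, and exact loss form are illustrative assumptions, not the paper's precise formulation.

```python
# Sketch: per-sample label logits as learnable parameters, refined by
# backpropagation alongside the model (the end-to-end correction idea).
# Hyperparameters and the loss form are assumptions for illustration.
import torch
import torch.nn.functional as F

torch.manual_seed(0)
n_samples, n_classes, n_features = 256, 10, 32

x = torch.randn(n_samples, n_features)               # toy client features
noisy_y = torch.randint(0, n_classes, (n_samples,))  # possibly noisy labels

model = torch.nn.Linear(n_features, n_classes)

# Learnable label logits, initialized to favor the given annotation.
label_logits = torch.nn.Parameter(F.one_hot(noisy_y, n_classes).float() * 3.0)

opt = torch.optim.SGD([{"params": model.parameters(), "lr": 0.1},
                       {"params": [label_logits], "lr": 0.5}])

for step in range(100):
    opt.zero_grad()
    log_probs = F.log_softmax(model(x), dim=1)
    soft_labels = F.softmax(label_logits, dim=1)
    # Cross-entropy against the current soft labels; gradients flow into
    # both the model weights and the label logits.
    loss = -(soft_labels * log_probs).sum(dim=1).mean()
    loss.backward()
    opt.step()

corrected_y = label_logits.argmax(dim=1)  # hard labels after correction
```

In FedELC this correction applies only to clients flagged as noisy in the first stage; the sketch omits the federated orchestration.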
Related papers
- Federated Learning Client Pruning for Noisy Labels [6.30126491637621]
Federated Learning (FL) enables collaborative model training across decentralized edge devices.
This paper introduces ClipFL, a novel framework addressing noisy labels from a fresh perspective.
It identifies and excludes noisy clients based on their performance on a clean validation dataset (a hypothetical sketch of this pruning rule follows the entry).
arXiv Detail & Related papers (2024-11-11T21:46:34Z)
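A simple reading of the exclusion rule: rank client models by accuracy on the clean validation set and keep only the top fraction. The function name, `keep_fraction` threshold, and exact scoring are hypothetical, not ClipFL's precise criterion.

```python
# Hypothetical sketch of validation-based client pruning: score each
# client's local model on a clean validation set and exclude the worst
# performers from aggregation.
import torch

def prune_noisy_clients(client_models, val_x, val_y, keep_fraction=0.8):
    scores = {}
    for cid, model in client_models.items():
        with torch.no_grad():
            acc = (model(val_x).argmax(dim=1) == val_y).float().mean().item()
        scores[cid] = acc
    ranked = sorted(scores, key=scores.get, reverse=True)
    n_keep = max(1, int(len(ranked) * keep_fraction))
    return ranked[:n_keep]  # client ids admitted to the next aggregation
```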
- FedFixer: Mitigating Heterogeneous Label Noise in Federated Learning [15.382625332503125]
Federated Learning (FL) heavily depends on label quality for its performance.
The high loss incurred by client-specific samples in heterogeneous label noise poses challenges for distinguishing between client-specific and noisy label samples.
We propose FedFixer, which introduces a personalized model that cooperates with the global model to effectively select clean client-specific samples (one possible dual-model selection rule is sketched after the entry).
arXiv Detail & Related papers (2024-03-25T09:24:05Z)
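One way to read the dual-model cooperation is as a joint low-loss selection rule: a sample counts as clean only if both the global and the personalized model assign it a low loss. The quantile-based rule below is an assumed instantiation, not FedFixer's exact mechanism.

```python
# Hedged sketch of dual-model sample selection: keep samples whose loss
# is low under BOTH the global and the personalized model.
import torch
import torch.nn.functional as F

def select_clean(global_model, personal_model, x, y, quantile=0.5):
    with torch.no_grad():
        loss_g = F.cross_entropy(global_model(x), y, reduction="none")
        loss_p = F.cross_entropy(personal_model(x), y, reduction="none")
    # A sample passes if it is below the loss quantile for both models.
    keep = (loss_g <= loss_g.quantile(quantile)) & \
           (loss_p <= loss_p.quantile(quantile))
    return torch.nonzero(keep).squeeze(1)  # indices of likely-clean samples
```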
- Federated Learning with Extremely Noisy Clients via Negative Distillation [70.13920804879312]
Federated learning (FL) has shown remarkable success in cooperatively training deep models, while struggling with noisy labels.
We propose a novel approach, called negative distillation (FedNed), to leverage models trained on noisy clients.
FedNed first identifies noisy clients and then employs them, rather than discarding them, in a knowledge-distillation manner (a speculative sketch follows the entry).
arXiv Detail & Related papers (2023-12-20T01:59:48Z)
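A speculative sketch of the "employ rather than discard" idea: treat a noisy client's predictions as negative knowledge and penalize agreement with them during distillation. FedNed's actual objective differs in detail; this only illustrates the direction.

```python
# Speculative sketch of negative distillation: discourage the student from
# matching a noisy teacher. In practice this term would be weighted and
# combined with a supervised loss; shown in isolation here.
import torch
import torch.nn.functional as F

def negative_distillation_loss(student_logits, noisy_teacher_logits,
                               temperature=2.0):
    log_p_student = F.log_softmax(student_logits / temperature, dim=1)
    p_teacher = F.softmax(noisy_teacher_logits / temperature, dim=1)
    # Standard distillation minimizes this KL; negating it pushes the
    # student AWAY from the noisy teacher's predictions.
    kl = F.kl_div(log_p_student, p_teacher, reduction="batchmean")
    return -kl
```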
- FedDiv: Collaborative Noise Filtering for Federated Learning with Noisy Labels [99.70895640578816]
Federated learning with noisy labels (F-LNL) aims at seeking an optimal server model via collaborative distributed learning.
We present FedDiv to tackle the challenges of F-LNL. Specifically, we propose a global noise filter called Federated Noise Filter (a common loss-based filtering step is sketched after the entry).
arXiv Detail & Related papers (2023-12-19T15:46:47Z)
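FedDiv's filter is global and collaborative; the sketch below shows only a generic local building block common in learning-with-noisy-labels pipelines: fit a two-component Gaussian mixture to per-sample losses and read off each sample's probability of belonging to the low-loss (clean) component.

```python
# Generic loss-based noise filtering step (an assumption about the kind of
# filter involved, not FedDiv's exact federated construction).
import numpy as np
from sklearn.mixture import GaussianMixture

def clean_probability(losses: np.ndarray) -> np.ndarray:
    gmm = GaussianMixture(n_components=2, random_state=0)
    gmm.fit(losses.reshape(-1, 1))
    post = gmm.predict_proba(losses.reshape(-1, 1))
    clean_comp = int(np.argmin(gmm.means_.ravel()))  # low-loss = clean
    return post[:, clean_comp]  # per-sample probability of being clean
```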
- Neighborhood Collective Estimation for Noisy Label Identification and Correction [92.20697827784426]
Learning with noisy labels (LNL) aims at designing strategies to improve model performance and generalization by mitigating the effects of model overfitting to noisy labels.
Recent advances employ the predicted label distributions of individual samples to perform noise verification and noisy label correction, easily giving rise to confirmation bias.
We propose Neighborhood Collective Estimation, in which the predictive reliability of a candidate sample is re-estimated by contrasting it against its feature-space nearest neighbors (sketched after the entry).
arXiv Detail & Related papers (2022-08-05T14:47:22Z)
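A hedged sketch of the neighborhood contrast: aggregate the predicted distributions of a sample's k nearest feature-space neighbors and score the sample by its agreement with that consensus. Cosine-similarity kNN, mean aggregation, and the dot-product score are assumptions, not the paper's exact estimator.

```python
# Reliability re-estimation against feature-space neighbors (sketch).
import torch
import torch.nn.functional as F

def neighborhood_reliability(features, pred_probs, k=10):
    # Cosine-similarity kNN, excluding each sample itself.
    f = F.normalize(features, dim=1)
    sim = f @ f.t()
    sim.fill_diagonal_(-float("inf"))
    knn = sim.topk(k, dim=1).indices              # (N, k) neighbor indices
    neighbor_probs = pred_probs[knn].mean(dim=1)  # collective estimate
    # Agreement between own prediction and the neighborhood consensus.
    return (pred_probs * neighbor_probs).sum(dim=1)
```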
- FedCorr: Multi-Stage Federated Learning for Label Noise Correction [80.9366438220228]
Federated learning (FL) is a privacy-preserving distributed learning paradigm that enables clients to jointly train a global model.
We propose FedCorr, a general multi-stage framework to tackle heterogeneous label noise in FL.
Experiments conducted on CIFAR-10/100 with federated synthetic label noise, and on a real-world noisy dataset, Clothing1M, demonstrate that FedCorr is robust to label noise.
arXiv Detail & Related papers (2022-04-10T12:51:18Z)
- S3: Supervised Self-supervised Learning under Label Noise [53.02249460567745]
In this paper we address the problem of classification in the presence of label noise.
At the heart of our method is a sample selection mechanism that relies on the consistency between the annotated label of a sample and the distribution of the labels in its neighborhood in the feature space (a minimal sketch of this rule follows the entry).
Our method significantly surpasses previous methods on both CIFAR-10/CIFAR-100 with artificial noise and real-world noisy datasets such as WebVision and ANIMAL-10N.
arXiv Detail & Related papers (2021-11-22T15:49:20Z)
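A minimal sketch of the stated selection rule: keep a sample when its annotated label agrees with the labels in its feature-space neighborhood. A majority vote over k neighbors is one assumed instantiation of "consistency with the neighborhood label distribution".

```python
# Label-consistency sample selection via feature-space neighbors (sketch).
import torch
import torch.nn.functional as F

def neighborhood_consistent(features, labels, k=10):
    f = F.normalize(features, dim=1)
    sim = f @ f.t()
    sim.fill_diagonal_(-float("inf"))
    knn = sim.topk(k, dim=1).indices      # (N, k) neighbor indices
    neighbor_labels = labels[knn]         # (N, k) neighbors' annotations
    majority = neighbor_labels.mode(dim=1).values
    return labels == majority             # True = keep as likely clean
```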
- Federated Noisy Client Learning [105.00756772827066]
Federated learning (FL) collaboratively aggregates a shared global model depending on multiple local clients.
Standard FL methods ignore the noisy client issue, which may harm the overall performance of the aggregated model.
We propose Federated Noisy Client Learning (Fed-NCL), which is a plug-and-play algorithm and contains two main components.
arXiv Detail & Related papers (2021-06-24T11:09:17Z)
- Audio Tagging by Cross Filtering Noisy Labels [26.14064793686316]
We present a novel framework, named CrossFilter, to combat the noisy labels problem for audio tagging.
Our method achieves state-of-the-art performance and even surpasses ensemble models.
arXiv Detail & Related papers (2020-07-16T07:55:04Z)