Learning Cautiously in Federated Learning with Noisy and Heterogeneous Clients
- URL: http://arxiv.org/abs/2304.02892v1
- Date: Thu, 6 Apr 2023 06:47:14 GMT
- Title: Learning Cautiously in Federated Learning with Noisy and Heterogeneous Clients
- Authors: Chenrui Wu, Zexi Li, Fangxin Wang, Chao Wu
- Abstract summary: Federated learning (FL) is a distributed framework for collaborative training with privacy guarantees.
In real-world scenarios, clients may have Non-IID data (local class imbalance) with poor annotation quality (label noise).
We propose FedCNI without using an additional clean proxy dataset.
It includes a noise-resilient local solver and a robust global aggregator.
- Score: 4.782145666637457
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Federated learning (FL) is a distributed framework for collaborative
training with privacy guarantees. In real-world scenarios, clients may have
Non-IID data (local class imbalance) with poor annotation quality (label
noise). The co-existence of label noise and class imbalance in FL's small local
datasets renders conventional FL methods and noisy-label learning methods both
ineffective. To address the challenges, we propose FedCNI without using an
additional clean proxy dataset. It includes a noise-resilient local solver and
a robust global aggregator. For the local solver, we design a more robust
prototypical noise detector to distinguish noisy samples. Further, to reduce
the negative impact of the noisy samples, we devise a curriculum pseudo
labeling method and a denoise Mixup training strategy. For the global
aggregator, we propose a switching re-weighted aggregation method tailored to
different learning periods. Extensive experiments demonstrate that our method
can substantially outperform state-of-the-art solutions in mix-heterogeneous
FL environments.
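The abstract names the building blocks without spelling out their internals. As a purely illustrative sketch of what a prototype-based noise detector of the kind used in the local solver could look like, the snippet below flags samples whose given label disagrees with their nearest class prototype in feature space; the feature extractor, cosine similarity, and disagreement rule are assumptions made here for illustration, not FedCNI's actual specification.

```python
import numpy as np

def prototype_noise_mask(features, labels, num_classes):
    """Illustrative prototype-based noise detection (not FedCNI's exact rule).

    features: (N, D) array of embeddings produced by the local model.
    labels:   (N,)  array of possibly noisy integer labels.
    Returns a boolean mask marking samples suspected of having noisy labels.
    """
    # L2-normalise features so cosine similarity reduces to a dot product.
    feats = features / (np.linalg.norm(features, axis=1, keepdims=True) + 1e-12)

    # Class prototypes: mean normalised feature of each (claimed) class.
    prototypes = np.zeros((num_classes, feats.shape[1]))
    for c in range(num_classes):
        members = feats[labels == c]
        if len(members) > 0:
            prototypes[c] = members.mean(axis=0)
    prototypes /= np.linalg.norm(prototypes, axis=1, keepdims=True) + 1e-12

    # A sample is suspected noisy when its nearest prototype does not
    # match the label it was given.
    sims = feats @ prototypes.T          # (N, num_classes) cosine similarities
    nearest = sims.argmax(axis=1)
    return nearest != labels
```

Samples flagged this way would then be candidates for the curriculum pseudo labeling and denoise Mixup steps that the abstract describes for the local solver.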
Related papers
- Collaboratively Learning Federated Models from Noisy Decentralized Data [21.3209961590772]
Federated learning (FL) has emerged as a prominent method for collaboratively training machine learning models using local data from edge devices.
We focus on addressing the problem of noisy data in the input space, an under-explored area compared to label noise.
We propose a noise-aware FL aggregation method, namely Federated Noise-Sifting (FedNS), which can be used as a plug-in approach in conjunction with widely used FL strategies.
arXiv Detail & Related papers (2024-09-03T18:00:51Z)
- FedDiv: Collaborative Noise Filtering for Federated Learning with Noisy Labels [99.70895640578816]
Federated learning with noisy labels (F-LNL) aims at seeking an optimal server model via collaborative distributed learning.
We present FedDiv to tackle the challenges of F-LNL. Specifically, we propose a global noise filter called Federated Noise Filter.
arXiv Detail & Related papers (2023-12-19T15:46:47Z)
- Federated Learning with Instance-Dependent Noisy Label [6.093214616626228]
FedBeat aims to build a global statistically consistent classifier using the IDN transition matrix (IDNTM).
Experiments conducted on CIFAR-10 and SVHN verify that the proposed method significantly outperforms state-of-the-art methods.
arXiv Detail & Related papers (2023-12-16T05:08:02Z)
- Combating Label Noise With A General Surrogate Model For Sample Selection [84.61367781175984]
We propose to leverage the vision-language surrogate model CLIP to filter noisy samples automatically (a generic sketch of this idea appears after this list).
We validate the effectiveness of our proposed method on both real-world and synthetic noisy datasets.
arXiv Detail & Related papers (2023-10-16T14:43:27Z)
- Quantifying the Impact of Label Noise on Federated Learning [7.531486350989069]
Federated Learning (FL) is a distributed machine learning paradigm where clients collaboratively train a model using their local (human-generated) datasets.
This paper provides a quantitative study on the impact of label noise on FL.
Our empirical results show that the global model accuracy linearly decreases as the noise level increases.
arXiv Detail & Related papers (2022-11-15T00:40:55Z)
- FedNoiL: A Simple Two-Level Sampling Method for Federated Learning with Noisy Labels [49.47228898303909]
Federated learning (FL) aims at training a global model on the server side while the training data are collected and located at the local devices.
Local training on noisy labels can easily result in overfitting to noisy labels, which is devastating to the global model through aggregation.
We develop a simple two-level sampling method "FedNoiL" that selects clients for more robust global aggregation on the server.
arXiv Detail & Related papers (2022-05-20T12:06:39Z)
- FedCorr: Multi-Stage Federated Learning for Label Noise Correction [80.9366438220228]
Federated learning (FL) is a privacy-preserving distributed learning paradigm that enables clients to jointly train a global model.
We propose FedCorr, a general multi-stage framework to tackle heterogeneous label noise in FL.
Experiments conducted on CIFAR-10/100 with federated synthetic label noise, and on a real-world noisy dataset, Clothing1M, demonstrate that FedCorr is robust to label noise.
arXiv Detail & Related papers (2022-04-10T12:51:18Z)
- Local Learning Matters: Rethinking Data Heterogeneity in Federated Learning [61.488646649045215]
Federated learning (FL) is a promising strategy for performing privacy-preserving, distributed learning with a network of clients (i.e., edge devices).
arXiv Detail & Related papers (2021-11-28T19:03:39Z)
- Federated Noisy Client Learning [105.00756772827066]
Federated learning (FL) collaboratively aggregates a shared global model depending on multiple local clients.
Standard FL methods ignore the noisy client issue, which may harm the overall performance of the aggregated model.
We propose Federated Noisy Client Learning (Fed-NCL), which is a plug-and-play algorithm and contains two main components.
arXiv Detail & Related papers (2021-06-24T11:09:17Z)
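As referenced above for Combating Label Noise With A General Surrogate Model For Sample Selection, the summary only says that CLIP is used to filter noisy samples automatically. A generic zero-shot filtering sketch along those lines, built on the Hugging Face transformers CLIP API, is shown below; the checkpoint, prompt template, and threshold are illustrative choices rather than the paper's actual pipeline.

```python
import torch
from transformers import CLIPModel, CLIPProcessor

# Illustrative zero-shot filter: keep a sample only if CLIP assigns its given
# label a probability above a threshold. The checkpoint, prompt template, and
# threshold below are assumptions, not taken from the paper.
model = CLIPModel.from_pretrained("openai/clip-vit-base-patch32")
processor = CLIPProcessor.from_pretrained("openai/clip-vit-base-patch32")

def clip_keep_mask(images, labels, class_names, keep_threshold=0.3):
    """images: list of PIL images; labels: list of int class ids."""
    prompts = [f"a photo of a {name}" for name in class_names]
    inputs = processor(text=prompts, images=images,
                       return_tensors="pt", padding=True)
    with torch.no_grad():
        # logits_per_image: (num_images, num_classes) image-text similarities
        probs = model(**inputs).logits_per_image.softmax(dim=-1)
    given = probs[torch.arange(len(labels)), torch.as_tensor(labels)]
    return given >= keep_threshold   # boolean mask of samples judged clean
```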
This list is automatically generated from the titles and abstracts of the papers in this site.