Revisiting Early-Learning Regularization When Federated Learning Meets Noisy Labels
- URL: http://arxiv.org/abs/2402.05353v1
- Date: Thu, 8 Feb 2024 02:21:33 GMT
- Title: Revisiting Early-Learning Regularization When Federated Learning Meets Noisy Labels
- Authors: Taehyeon Kim, Donggyu Kim, Se-Young Yun
- Abstract summary: This paper revisits early-learning regularization, introducing an innovative strategy, Federated Label-mixture Regularization (FLR).
FLR adeptly adapts to FL's complexities by generating new pseudo labels, blending local and global model predictions.
- Score: 27.777781072683986
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: In the evolving landscape of federated learning (FL), addressing label noise
presents unique challenges due to the decentralized and diverse nature of data
collection across clients. Traditional centralized learning approaches to
mitigate label noise are constrained in FL by privacy concerns and the
heterogeneity of client data. This paper revisits early-learning
regularization, introducing an innovative strategy, Federated Label-mixture
Regularization (FLR). FLR adeptly adapts to FL's complexities by generating new
pseudo labels, blending local and global model predictions. This method not
only enhances the accuracy of the global model in both i.i.d. and non-i.i.d.
settings but also effectively counters the memorization of noisy labels.
Demonstrating compatibility with existing label noise and FL techniques, FLR
paves the way for improved generalization in FL environments fraught with label
inaccuracies.
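The abstract describes FLR as generating new pseudo labels by blending local and global model predictions. Below is a minimal sketch of that idea in PyTorch; the mixing weights `alpha` (local vs. global prediction) and `lam` (pseudo label vs. observed label), and the exact combination rule, are illustrative assumptions rather than the paper's published hyperparameters.

```python
# Hedged sketch of the label-mixture idea, assuming a standard PyTorch setup.
import torch
import torch.nn.functional as F

def flr_pseudo_label(local_logits, global_logits, noisy_label, num_classes,
                     alpha=0.5, lam=0.7):
    """Blend local and global predictions into a pseudo label, then mix
    it with the (possibly noisy) observed one-hot label."""
    p_local = F.softmax(local_logits, dim=1)
    p_global = F.softmax(global_logits, dim=1)
    blended = alpha * p_local + (1.0 - alpha) * p_global   # model consensus
    one_hot = F.one_hot(noisy_label, num_classes).float()  # observed label
    return lam * blended + (1.0 - lam) * one_hot           # soft mixture target

def flr_loss(local_logits, target):
    # Cross-entropy against the soft mixture target.
    return -(target * F.log_softmax(local_logits, dim=1)).sum(dim=1).mean()
```

Because the target retains mass from the global model's prediction, a client cannot fully memorize its noisy labels, which matches the early-learning-regularization intuition described in the abstract.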
Related papers
- FLea: Addressing Data Scarcity and Label Skew in Federated Learning via Privacy-preserving Feature Augmentation [15.298650496155508]
Federated Learning (FL) enables model development by leveraging data distributed across numerous edge devices without transferring local data to a central server.
Existing FL methods face challenges when dealing with scarce and label-skewed data across devices, resulting in local model overfitting and drift.
We propose a pioneering framework called FLea that incorporates several key components to address these challenges.
arXiv Detail & Related papers (2024-06-13T19:28:08Z)
- An Aggregation-Free Federated Learning for Tackling Data Heterogeneity [50.44021981013037]
Federated Learning (FL) relies on the effectiveness of utilizing knowledge from distributed datasets.
Traditional FL methods adopt an aggregate-then-adapt framework, where clients update local models based on a global model aggregated by the server from the previous training round.
We introduce FedAF, a novel aggregation-free FL algorithm.
arXiv Detail & Related papers (2024-04-29T05:55:23Z)
- Federated Learning with Instance-Dependent Noisy Label [6.093214616626228]
FedBeat aims to build a statistically consistent global classifier using the instance-dependent noise transition matrix (IDNTM).
Experiments conducted on CIFAR-10 and SVHN verify that the proposed method significantly outperforms state-of-the-art methods.
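For intuition, here is a hedged sketch of how a known instance-dependent transition matrix yields a statistically consistent "forward-corrected" loss. How FedBeat actually estimates the matrix is not described in the summary above; `T` is assumed given, with shape (batch, C, C) and `T[i, j, k] = P(noisy=k | clean=j, x_i)`.

```python
# Forward loss correction with a per-instance transition matrix (sketch).
import torch
import torch.nn.functional as F

def forward_corrected_loss(logits, noisy_labels, T):
    p_clean = F.softmax(logits, dim=1)                       # P(clean | x), (B, C)
    p_noisy = torch.bmm(p_clean.unsqueeze(1), T).squeeze(1)  # P(noisy | x), (B, C)
    return F.nll_loss(torch.log(p_noisy + 1e-8), noisy_labels)
```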
arXiv Detail & Related papers (2023-12-16T05:08:02Z)
- Rethinking Client Drift in Federated Learning: A Logit Perspective [125.35844582366441]
Federated Learning (FL) enables multiple clients to collaboratively learn in a distributed way, allowing for privacy protection.
We find that the difference in logits between the local and global models increases as the model is continuously updated.
We propose a new algorithm, FedCSD, which performs class-prototype similarity distillation in a federated framework to align the local and global models.
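A minimal sketch of logit-level alignment in this spirit: a KL distillation term pulling the local model's logits toward the frozen global model's logits. The class-prototype weighting that gives FedCSD its name is omitted, and the temperature `tau` is an assumed parameter.

```python
# Logit distillation toward the global model (sketch, not FedCSD's exact loss).
import torch.nn.functional as F

def distill_to_global(local_logits, global_logits, tau=2.0):
    log_p_local = F.log_softmax(local_logits / tau, dim=1)
    p_global = F.softmax(global_logits / tau, dim=1)
    # batchmean KL, scaled by tau^2 as in standard distillation
    return F.kl_div(log_p_local, p_global, reduction="batchmean") * tau * tau
```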
arXiv Detail & Related papers (2023-08-20T04:41:01Z)
- FedNoisy: Federated Noisy Label Learning Benchmark [53.73816587601204]
Federated learning has gained popularity for distributed learning without aggregating sensitive data from clients.
The distributed and isolated nature of data in FL can be compounded by data quality issues, making it more vulnerable to noisy labels.
We provide the first standardized benchmark that helps researchers fully explore potential federated noisy-label settings.
arXiv Detail & Related papers (2023-06-20T16:18:14Z)
- Learning Cautiously in Federated Learning with Noisy and Heterogeneous Clients [4.782145666637457]
Federated learning (FL) is a distributed framework for collaboratively training with privacy guarantees.
In real-world scenarios, clients may have non-IID data (local class imbalance) with poor annotation quality (label noise).
We propose FedCNI without using an additional clean proxy dataset.
It includes a noise-resilient local solver and a robust global aggregator.
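As a hedged illustration of the second component, a robust aggregator might weight each client's update by an estimated clean-sample ratio instead of plain sample counts. FedCNI's actual solver and weighting rule are not specified in the summary; `clean_ratios` is an assumed per-client estimate in [0, 1].

```python
# Robust weighted aggregation of client state dicts (sketch).
import torch

def robust_aggregate(client_states, clean_ratios):
    weights = torch.tensor(clean_ratios, dtype=torch.float)
    weights = weights / weights.sum()
    global_state = {}
    for key in client_states[0]:
        stacked = torch.stack([s[key].float() for s in client_states])
        # broadcast weights over the parameter dimensions and average
        global_state[key] = (weights.view(-1, *[1] * (stacked.dim() - 1))
                             * stacked).sum(dim=0)
    return global_state
```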
arXiv Detail & Related papers (2023-04-06T06:47:14Z)
- Quantifying the Impact of Label Noise on Federated Learning [7.531486350989069]
Federated Learning (FL) is a distributed machine learning paradigm where clients collaboratively train a model using their local (human-generated) datasets.
This paper provides a quantitative study on the impact of label noise on FL.
Our empirical results show that the global model accuracy linearly decreases as the noise level increases.
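Studies of this kind typically sweep a controlled corruption rate. A small illustrative sketch of symmetric label-noise injection at rate `p` (an assumed experimental knob, not the paper's exact protocol):

```python
# Symmetric label noise: flip a fraction p of labels to a random other class.
import numpy as np

def inject_symmetric_noise(labels, num_classes, p, seed=0):
    rng = np.random.default_rng(seed)
    labels = labels.copy()
    flip = rng.random(len(labels)) < p
    # offsets in [1, num_classes-1] guarantee a *different* class
    offsets = rng.integers(1, num_classes, size=flip.sum())
    labels[flip] = (labels[flip] + offsets) % num_classes
    return labels
```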
arXiv Detail & Related papers (2022-11-15T00:40:55Z)
- FedNoiL: A Simple Two-Level Sampling Method for Federated Learning with Noisy Labels [49.47228898303909]
Federated learning (FL) aims at training a global model on the server side while the training data are collected and located at the local devices.
Local training on noisy labels can easily result in overfitting to noisy labels, which is devastating to the global model through aggregation.
We develop a simple two-level sampling method "FedNoiL" that selects clients for more robust global aggregation on the server.
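A hedged sketch of a two-level sampling scheme in this spirit: the server favors clients that look cleaner, and each client keeps only small-loss samples. The quality scores and the keep ratio below are assumed quantities, not FedNoiL's exact criteria.

```python
# Two-level sampling: clients by estimated quality, samples by small loss (sketch).
import numpy as np

def sample_clients(quality_scores, k, seed=0):
    # Level 1: sample k clients with probability proportional to quality.
    rng = np.random.default_rng(seed)
    q = np.asarray(quality_scores, dtype=float)
    return rng.choice(len(q), size=k, replace=False, p=q / q.sum())

def select_small_loss(losses, keep_ratio=0.7):
    # Level 2: keep the fraction of samples with the smallest loss,
    # following the common small-loss heuristic for likely-clean data.
    losses = np.asarray(losses)
    k = max(1, int(keep_ratio * len(losses)))
    return np.argsort(losses)[:k]
```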
arXiv Detail & Related papers (2022-05-20T12:06:39Z)
- Fine-tuning Global Model via Data-Free Knowledge Distillation for Non-IID Federated Learning [86.59588262014456]
Federated Learning (FL) is an emerging distributed learning paradigm under privacy constraints.
We propose a data-free knowledge distillation method, FedFTG, to fine-tune the global model on the server.
Our FedFTG significantly outperforms the state-of-the-art (SOTA) FL algorithms and can serve as a strong plugin for enhancing FedAvg, FedProx, FedDyn, and SCAFFOLD.
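A hedged sketch of server-side, data-free fine-tuning: a generator synthesizes pseudo inputs from noise, and the global model is distilled toward the averaged client-model predictions on that data. The architectures, the generator's own training objective, and all hyperparameters here are assumptions; `opt` is an optimizer over the global model's parameters.

```python
# Server-side data-free distillation loop (sketch).
import torch
import torch.nn.functional as F

def finetune_global(global_model, client_models, generator, opt,
                    steps=100, batch=64, z_dim=100, device="cpu"):
    global_model.train()
    for _ in range(steps):
        z = torch.randn(batch, z_dim, device=device)
        with torch.no_grad():
            x = generator(z)                                   # pseudo samples
            teacher = torch.stack([m(x) for m in client_models]).mean(0)
        student_log_p = F.log_softmax(global_model(x), dim=1)
        loss = F.kl_div(student_log_p, F.softmax(teacher, dim=1),
                        reduction="batchmean")
        opt.zero_grad()
        loss.backward()
        opt.step()
    return global_model
```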
arXiv Detail & Related papers (2022-03-17T11:18:17Z)
- Local Learning Matters: Rethinking Data Heterogeneity in Federated Learning [61.488646649045215]
Federated learning (FL) is a promising strategy for performing privacy-preserving, distributed learning with a network of clients (i.e., edge devices).
arXiv Detail & Related papers (2021-11-28T19:03:39Z)
This list is automatically generated from the titles and abstracts of the papers on this site.