Communication-Efficient Robust Federated Learning with Noisy Labels
- URL: http://arxiv.org/abs/2206.05558v1
- Date: Sat, 11 Jun 2022 16:21:17 GMT
- Title: Communication-Efficient Robust Federated Learning with Noisy Labels
- Authors: Junyi Li, Jian Pei, Heng Huang
- Abstract summary: Federated learning (FL) is a promising privacy-preserving machine learning paradigm for distributed data.
We propose a learning-based reweighting approach to mitigate the effect of noisy labels in FL.
Our approach has shown superior performance on several real-world datasets compared to various baselines.
- Score: 144.31995882209932
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Federated learning (FL) is a promising privacy-preserving machine learning
paradigm over distributed data. In FL, the data are kept locally by each user.
This protects user privacy, but also makes it difficult for the server to
verify data quality, in particular whether the data are correctly labeled.
Training with corrupted labels is harmful to the federated learning task;
however, little attention has been paid to FL in the presence of label noise.
In this paper, we focus on this problem and propose a learning-based
reweighting approach to mitigate the effect of noisy labels in FL. More
precisely, we tune a weight for each training sample such that the learned
model has optimal generalization performance over a validation set. Formally,
this process can be cast as a Federated Bilevel Optimization problem, i.e., an
optimization problem with two entangled levels. Non-distributed bilevel
problems have recently witnessed notable progress through new efficient
algorithms, but solving bilevel optimization problems in the federated learning
setting remains under-investigated. We identify the high communication cost of
hypergradient evaluation as the major bottleneck, and therefore propose
\textit{Comm-FedBiO} to solve general Federated Bilevel Optimization problems;
more specifically, we propose two communication-efficient subroutines to
estimate the hypergradient. We also provide a convergence analysis of the
proposed algorithms. Finally, we apply the proposed algorithms to the noisy
label problem. Our approach shows superior performance over various baselines
on several real-world datasets.
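The reweighting idea described in the abstract — an upper level that tunes per-sample weights against a clean validation set, and a lower level that fits the model under those weights — can be illustrated with a one-step hypergradient approximation. The sketch below is a single-machine toy on least squares, not the paper's Comm-FedBiO algorithm; the function name `hypergradient_reweight` and all hyperparameters are our own illustrative choices.

```python
import numpy as np

def hypergradient_reweight(X_tr, y_tr, X_val, y_val, steps=200, lr=0.1, w_lr=1.0):
    """Toy sample reweighting via a one-step hypergradient approximation.

    Lower level: theta takes a gradient step on the weighted training loss.
    Upper level: per-sample weights w follow the (approximate) hypergradient
    of the validation loss through that one inner step.
    """
    n = len(y_tr)
    theta = np.zeros(X_tr.shape[1])
    w = np.full(n, 1.0 / n)                 # uniform initial sample weights
    for _ in range(steps):
        resid = X_tr @ theta - y_tr         # per-sample residuals
        g_i = X_tr * resid[:, None]         # per-sample loss gradients
        theta_new = theta - lr * (w @ g_i)  # one weighted inner GD step
        # Validation-loss gradient at the updated model parameters.
        grad_val = X_val.T @ (X_val @ theta_new - y_val) / len(y_val)
        # Chain rule: d L_val(theta_new) / d w_i = -lr * g_i . grad_val
        hg = -lr * (g_i @ grad_val)
        w = np.clip(w - w_lr * hg, 0.0, None)   # keep weights nonnegative
        total = w.sum()
        w = w / total if total > 0 else np.full(n, 1.0 / n)
        theta = theta_new
    return theta, w

# Toy demo: three clean samples (y = 2x) and two label-corrupted ones (y = -2x).
X_tr = np.array([[1.0], [1.5], [2.0], [1.0], [2.0]])
y_tr = np.array([2.0, 3.0, 4.0, -2.0, -4.0])
X_val = np.array([[1.0], [2.0]])   # small clean validation set
y_val = np.array([2.0, 4.0])
theta, sample_w = hypergradient_reweight(X_tr, y_tr, X_val, y_val)
```

In this toy run, the corrupted samples' weights are driven toward zero, so the recovered slope matches the clean data. The paper's contribution lies elsewhere: in the federated setting this hypergradient involves expensive cross-client communication, which Comm-FedBiO's two subroutines estimate communication-efficiently — a part the single-machine sketch omits entirely.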
Related papers
- A Primal-Dual-Assisted Penalty Approach to Bilevel Optimization with Coupled Constraints [66.61399765513383]
We develop a BLOCC algorithm to tackle BiLevel Optimization problems with Coupled Constraints.
We demonstrate its effectiveness on two well-known real-world applications.
arXiv Detail & Related papers (2024-06-14T15:59:36Z) - Communication Efficient and Provable Federated Unlearning [43.178460522012934]
We study federated unlearning, a novel problem aiming to eliminate the impact of specific clients or data points on the global model learned via federated learning (FL).
This problem is driven by the right to be forgotten and the privacy challenges in FL.
We introduce a new framework for exact federated unlearning that meets two essential criteria: \textit{communication efficiency} and \textit{exact unlearning provability}.
arXiv Detail & Related papers (2024-01-19T20:35:02Z) - Communication-Efficient Federated Bilevel Optimization with Local and
Global Lower Level Problems [118.00379425831566]
We propose a communication-efficient algorithm, named FedBiOAcc.
We prove that FedBiOAcc-Local converges at the same rate for this type of problem.
Empirical results show superior performance of our algorithms.
arXiv Detail & Related papers (2023-02-13T21:28:53Z) - Asynchronous Distributed Bilevel Optimization [20.074079852690048]
We propose Asynchronous Distributed Bilevel (ADBO) algorithm to tackle bilevel optimization problems.
The complexity of ADBO to obtain an $\epsilon$-stationary point is upper bounded by $\mathcal{O}(\frac{1}{\epsilon^2})$.
arXiv Detail & Related papers (2022-12-20T07:44:48Z) - Fast Adaptive Federated Bilevel Optimization [14.579475552088692]
We propose a novel adaptive federated bilevel optimization algorithm (i.e., AdaFBiO) to solve distributed bilevel optimization problems.
AdaFBiO uses unified adaptive matrices to flexibly incorporate various adaptive learning rates to update variables in both the UL and LL problems.
We provide a convergence analysis framework for our AdaFBiO algorithm, and prove that it needs a sample complexity of $\tilde{O}(\epsilon^{-3})$ with communication complexity of $\tilde{O}(\epsilon^{-2})$ to obtain an $\epsilon$-stationary point.
arXiv Detail & Related papers (2022-11-02T13:55:47Z) - Local Stochastic Bilevel Optimization with Momentum-Based Variance
Reduction [104.41634756395545]
We study Federated Bilevel Optimization problems. Specifically, we first propose FedBiO, a deterministic gradient-based algorithm.
We show that FedBiO has a complexity of $O(\epsilon^{-1.5})$.
Our algorithms show superior performances compared to other baselines in numerical experiments.
arXiv Detail & Related papers (2022-05-03T16:40:22Z) - Local Learning Matters: Rethinking Data Heterogeneity in Federated
Learning [61.488646649045215]
Federated learning (FL) is a promising strategy for performing privacy-preserving, distributed learning with a network of clients (i.e., edge devices).
arXiv Detail & Related papers (2021-11-28T19:03:39Z) - Enhanced Bilevel Optimization via Bregman Distance [104.96004056928474]
We propose a bilevel optimization method based on Bregman distances.
We also propose an accelerated version of SBiO-BreD method (ASBiO-BreD) by using the variance-reduced technique.
arXiv Detail & Related papers (2021-07-26T16:18:43Z)
This list is automatically generated from the titles and abstracts of the papers in this site.