Communication-Efficient Federated Bilevel Optimization with Local and
Global Lower Level Problems
- URL: http://arxiv.org/abs/2302.06701v2
- Date: Tue, 27 Feb 2024 06:47:19 GMT
- Authors: Junyi Li, Feihu Huang, Heng Huang
- Abstract summary: We propose a communication-efficient algorithm, named FedBiOAcc.
We prove that FedBiOAcc-Local converges at the same rate for this type of problem.
Empirical results show superior performance of our algorithms.
- Score: 118.00379425831566
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Bilevel Optimization has witnessed notable progress recently with new
emerging efficient algorithms. However, its application in the Federated
Learning setting remains relatively underexplored, and the impact of Federated
Learning's inherent challenges on the convergence of bilevel algorithms remains
unclear. In this work, we investigate Federated Bilevel Optimization problems
and propose a communication-efficient algorithm, named FedBiOAcc. The algorithm
leverages an efficient estimation of the hyper-gradient in the distributed
setting and utilizes the momentum-based variance-reduction acceleration.
Remarkably, FedBiOAcc achieves a communication complexity of $O(\epsilon^{-1})$, a
sample complexity of $O(\epsilon^{-1.5})$, and linear speedup with respect to
the number of clients. We also analyze a special case of the Federated Bilevel
Optimization problems, where lower level problems are locally managed by
clients. We prove that FedBiOAcc-Local, a modified version of FedBiOAcc,
converges at the same rate for this type of problem. Finally, we validate the
proposed algorithms through two real-world tasks: Federated Data-cleaning and
Federated Hyper-representation Learning. Empirical results show superior
performance of our algorithms.
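The momentum-based variance-reduction acceleration mentioned in the abstract belongs to the STORM family of estimators, which reuse the previous gradient estimate and correct it with a gradient difference evaluated on the same fresh sample. Below is a minimal single-machine scalar sketch on the toy objective $f(x) = x^2/2$; it illustrates only the generic estimator, not the paper's FedBiOAcc, and the names (`grad`, `storm`) and hyperparameter values are our assumptions:

```python
import random

def grad(x, z, noise=0.1):
    # Stochastic gradient of f(x) = 0.5 * x^2; z is the shared noise sample.
    return x + noise * z

def storm(x0, steps=500, eta=0.1, alpha=0.3, seed=0):
    rng = random.Random(seed)
    x_prev = x0
    d = grad(x_prev, rng.gauss(0.0, 1.0))   # plain stochastic gradient at t = 0
    x = x_prev - eta * d
    for _ in range(steps):
        z = rng.gauss(0.0, 1.0)             # ONE fresh sample per step...
        # ...evaluated at both iterates, so the correction term cancels most noise:
        # d_t = grad(x_t; z) + (1 - alpha) * (d_{t-1} - grad(x_{t-1}; z))
        d = grad(x, z) + (1.0 - alpha) * (d - grad(x_prev, z))
        x_prev, x = x, x - eta * d
    return x
```

With `alpha = 1` the recursion degenerates to plain SGD; `alpha < 1` blends in the momentum correction, which is the mechanism behind the improved $O(\epsilon^{-1.5})$ sample complexities reported for this family of methods.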
Related papers
- A Primal-Dual-Assisted Penalty Approach to Bilevel Optimization with Coupled Constraints [66.61399765513383]
We develop a BLOCC algorithm to tackle BiLevel Optimization problems with Coupled Constraints.
We demonstrate its effectiveness on two well-known real-world applications.
arXiv Detail & Related papers (2024-06-14T15:59:36Z) - Federated Compositional Deep AUC Maximization [58.25078060952361]
We develop a novel federated learning method for imbalanced data by directly optimizing the area under curve (AUC) score.
To the best of our knowledge, this is the first work to achieve such favorable theoretical results.
arXiv Detail & Related papers (2023-04-20T05:49:41Z) - Achieving Linear Speedup in Non-IID Federated Bilevel Learning [16.56643290676128]
We propose a new federated bilevel algorithm named FedMBO.
We show that FedMBO achieves a convergence rate of $\mathcal{O}\big(\frac{1}{\sqrt{nK}}+\frac{1}{K}+\frac{\sqrt{n}}{K^{3/2}}\big)$ on non-i.i.d. datasets.
This is the first theoretical linear speedup result for non-i.i.d. federated bilevel optimization.
arXiv Detail & Related papers (2023-02-10T18:28:00Z) - Asynchronous Distributed Bilevel Optimization [20.074079852690048]
We propose Asynchronous Distributed Bilevel (ADBO) algorithm to tackle bilevel optimization problems.
The complexity of ADBO to obtain an $\epsilon$-stationary point is upper bounded by $\mathcal{O}(\frac{1}{\epsilon^2})$.
arXiv Detail & Related papers (2022-12-20T07:44:48Z) - Fast Adaptive Federated Bilevel Optimization [14.579475552088692]
We propose a novel adaptive federated bilevel optimization algorithm (i.e., AdaFBiO) to solve distributed bilevel optimization problems.
AdaFBiO uses unified adaptive matrices to flexibly incorporate various adaptive learning rates for updating variables in both the UL and LL problems.
We provide a convergence analysis framework for our AdaFBiO algorithm, and prove that it needs a sample complexity of $\tilde{O}(\epsilon^{-3})$ and a communication complexity of $\tilde{O}(\epsilon^{-2})$ to obtain an $\epsilon$-stationary point.
arXiv Detail & Related papers (2022-11-02T13:55:47Z) - On the Convergence of Distributed Stochastic Bilevel Optimization
Algorithms over a Network [55.56019538079826]
Bilevel optimization has been applied to a wide variety of machine learning models.
Most existing algorithms are restricted to the single-machine setting, so they cannot handle distributed data.
We develop novel decentralized bilevel optimization algorithms based on a gradient tracking communication mechanism and two different gradients.
arXiv Detail & Related papers (2022-06-30T05:29:52Z) - Communication-Efficient Robust Federated Learning with Noisy Labels [144.31995882209932]
Federated learning (FL) is a promising privacy-preserving machine learning paradigm over distributed data.
We propose a learning-based reweighting approach to mitigate the effect of noisy labels in FL.
Our approach has shown superior performance on several real-world datasets compared to various baselines.
arXiv Detail & Related papers (2022-06-11T16:21:17Z) - Local Stochastic Bilevel Optimization with Momentum-Based Variance
Reduction [104.41634756395545]
We study Federated Bilevel Optimization problems. Specifically, we first propose the FedBiO, a deterministic gradient-based algorithm.
We show that FedBiO has a complexity of $O(\epsilon^{-1.5})$.
Our algorithms show superior performance compared to other baselines in numerical experiments.
arXiv Detail & Related papers (2022-05-03T16:40:22Z)