Bilevel Optimization for Machine Learning: Algorithm Design and
Convergence Analysis
- URL: http://arxiv.org/abs/2108.00330v1
- Date: Sat, 31 Jul 2021 22:05:47 GMT
- Title: Bilevel Optimization for Machine Learning: Algorithm Design and
Convergence Analysis
- Authors: Kaiyi Ji
- Abstract summary: This thesis provides a comprehensive convergence rate analysis for bilevel optimization algorithms.
For the problem-based formulation, we provide a convergence rate analysis for AID- and ITD-based bilevel algorithms.
We then develop accelerated bilevel algorithms, for which we provide sharper convergence analysis with relaxed assumptions.
- Score: 12.680169619392695
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Bilevel optimization has become a powerful framework in various machine
learning applications including meta-learning, hyperparameter optimization, and
network architecture search. There are generally two classes of bilevel
optimization formulations for machine learning: 1) problem-based bilevel
optimization, whose inner-level problem is formulated as finding a minimizer of
a given loss function; and 2) algorithm-based bilevel optimization, whose
inner-level solution is an output of a fixed algorithm. For the first class,
two popular types of gradient-based algorithms have been proposed for
hypergradient estimation via approximate implicit differentiation (AID) and
iterative differentiation (ITD); a minimal ITD sketch appears after the abstract.
Algorithms for the second class include the popular model-agnostic
meta-learning (MAML) and almost no inner loop (ANIL).
However, the convergence rate and fundamental limitations of bilevel
optimization algorithms have not been well explored.
This thesis provides a comprehensive convergence rate analysis for bilevel
algorithms in the aforementioned two classes. We further propose principled
algorithm designs for bilevel optimization with higher efficiency and
scalability. For the problem-based formulation, we provide a convergence rate
analysis for AID- and ITD-based bilevel algorithms. We then develop
accelerated bilevel algorithms, for which we provide sharper convergence
analysis with relaxed assumptions. We also provide the first lower bounds for
bilevel optimization, and establish the optimality by providing matching upper
bounds under certain conditions. We finally propose new stochastic bilevel
optimization algorithms with lower complexity and higher efficiency in
practice. For the algorithm-based formulation, we develop a theoretical
convergence analysis for general multi-step MAML and ANIL, and characterize the
impact of parameter selections and loss geometries on their complexities.
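As referenced in the abstract, ITD estimates the hypergradient by differentiating through the inner optimization itself. The following is a minimal, self-contained JAX sketch of the problem-based formulation with ITD; the toy losses f_outer and g_inner, the step size alpha, and the unrolling depth K are illustrative assumptions, not the thesis's actual setup.

```python
import jax
import jax.numpy as jnp

def g_inner(x, y):
    # Toy inner (lower-level) loss, strongly convex in y.
    return 0.5 * jnp.sum((y - x) ** 2) + 0.1 * jnp.sum(y ** 2)

def f_outer(x, y):
    # Toy outer (upper-level) loss, evaluated at the inner solution.
    return jnp.sum((y - 1.0) ** 2) + 0.01 * jnp.sum(x ** 2)

def inner_solve(x, y0, K=20, alpha=0.1):
    # K unrolled gradient steps on the inner problem; the loop is
    # differentiable, so jax.grad backpropagates through every step (ITD).
    y = y0
    for _ in range(K):
        y = y - alpha * jax.grad(g_inner, argnums=1)(x, y)
    return y

def outer_objective(x, y0):
    return f_outer(x, inner_solve(x, y0))

x, y0 = jnp.ones(3), jnp.zeros(3)
itd_hypergrad = jax.grad(outer_objective)(x, y0)  # ITD hypergradient estimate
```

AID would instead solve the linear system $\nabla_y^2 g(x,y)\,v = \nabla_y f(x,y)$ at an approximate inner minimizer and form $\nabla_x f - \nabla_{xy}^2 g\,v$, trading unrolled backpropagation for Hessian-vector products.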
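For the algorithm-based formulation, the inner level is the output of a fixed algorithm rather than an exact minimizer. Below is a hedged sketch of multi-step MAML in that spirit; the toy regression task, the inner step count n_steps, and both learning rates are illustrative assumptions.

```python
import jax
import jax.numpy as jnp

def task_loss(params, batch):
    # Toy least-squares task loss.
    X, y = batch
    return jnp.mean((X @ params - y) ** 2)

def adapt(params, batch, n_steps=3, inner_lr=0.05):
    # Inner level: a fixed algorithm (n_steps of task-specific gradient
    # descent) whose output defines the inner solution.
    for _ in range(n_steps):
        params = params - inner_lr * jax.grad(task_loss)(params, batch)
    return params

def meta_loss(params, support, query):
    # Outer level: evaluate the adapted parameters on held-out query data;
    # jax.grad differentiates through all inner steps (multi-step MAML).
    return task_loss(adapt(params, support), query)

key = jax.random.PRNGKey(0)
X = jax.random.normal(key, (8, 4))
w_true = jnp.arange(4.0)
support = query = (X, X @ w_true)
meta_grad = jax.grad(meta_loss)(jnp.zeros(4), support, query)
```

ANIL follows the same pattern but adapts only the head (final layer) in the inner loop, which is one reason its per-iteration cost is lower than MAML's.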
Related papers
- A Single-Loop Algorithm for Decentralized Bilevel Optimization [11.67135350286933]
We propose a novel single-loop algorithm for solving decentralized bilevel optimization with a strongly convex lower-level problem.
Our approach is a fully single-loop method that approximates the hypergradient using only two matrix-vector multiplications per iteration (see the sketch after this list).
Our analysis demonstrates that the proposed algorithm achieves the best-known convergence rate for bilevel optimization algorithms.
arXiv Detail & Related papers (2023-11-15T13:29:49Z)
- Decentralized Multi-Level Compositional Optimization Algorithms with Level-Independent Convergence Rate [26.676582181833584]
Decentralized multi-level optimization is challenging because of the multilevel structure and decentralized communication.
We develop two novel decentralized optimization algorithms to optimize the multi-level compositional problem.
arXiv Detail & Related papers (2023-06-06T00:23:28Z)
- Fast Adaptive Federated Bilevel Optimization [14.579475552088692]
We propose a novel adaptive federated bilevel optimization algorithm (i.e., AdaFBiO) to solve distributed bilevel optimization problems.
AdaFBiO uses unified adaptive matrices to flexibly incorporate various adaptive learning rates for updating variables in both upper-level (UL) and lower-level (LL) problems.
We provide a convergence analysis framework for our AdaFBiO algorithm, and prove that it needs sample complexity of $\tilde{O}(\epsilon^{-3})$ with communication complexity of $\tilde{O}(\epsilon^{-2})$ to obtain an $\epsilon$-stationary point.
arXiv Detail & Related papers (2022-11-02T13:55:47Z)
- On the Convergence of Distributed Stochastic Bilevel Optimization Algorithms over a Network [55.56019538079826]
Bilevel optimization has been applied to a wide variety of machine learning models.
Most existing algorithms are restricted to the single-machine setting and thus are incapable of handling distributed data.
We develop novel decentralized bilevel optimization algorithms based on a gradient tracking communication mechanism and two different gradient estimators.
arXiv Detail & Related papers (2022-06-30T05:29:52Z)
- A Constrained Optimization Approach to Bilevel Optimization with Multiple Inner Minima [49.320758794766185]
We propose a new approach, which converts the bilevel problem into an equivalent constrained optimization problem that can then be solved with a primal-dual algorithm.
Such an approach enjoys several advantages: (a) it addresses the multiple-inner-minima challenge; (b) it is fully first-order, requiring no Jacobian computations.
arXiv Detail & Related papers (2022-03-01T18:20:01Z)
- Amortized Implicit Differentiation for Stochastic Bilevel Optimization [53.12363770169761]
We study a class of algorithms for solving bilevel optimization problems in both stochastic and deterministic settings.
We exploit a warm-start strategy to amortize the estimation of the exact gradient (see the sketch after this list).
By using this framework, our analysis shows that these algorithms match the computational complexity of methods that have access to an unbiased estimate of the gradient.
arXiv Detail & Related papers (2021-11-29T15:10:09Z)
- Enhanced Bilevel Optimization via Bregman Distance [104.96004056928474]
We propose a bilevel optimization method based on Bregman distance (SBiO-BreD).
We also propose an accelerated version of SBiO-BreD (ASBiO-BreD) using a variance-reduction technique.
arXiv Detail & Related papers (2021-07-26T16:18:43Z)
- Provably Faster Algorithms for Bilevel Optimization [54.83583213812667]
Bilevel optimization has been widely applied in many important machine learning applications.
We propose two new algorithms for bilevel optimization.
We show that both algorithms achieve a complexity of $\mathcal{O}(\epsilon^{-1.5})$, which outperforms all existing algorithms by an order of magnitude.
arXiv Detail & Related papers (2021-06-08T21:05:30Z)
- Bilevel Optimization: Convergence Analysis and Enhanced Design [63.64636047748605]
Bilevel optimization is a powerful tool for many machine learning problems.
We propose a novel stochastic bilevel optimizer named stocBiO, which features a sample-efficient hypergradient estimator.
arXiv Detail & Related papers (2020-10-15T18:09:48Z)
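As referenced in the single-loop entry above, here is a hedged, single-machine sketch of a hypergradient update that uses exactly two matrix-vector multiplications per iteration: one Hessian-vector product refines an estimate v of $(\nabla_y^2 g)^{-1}\nabla_y f$, and one cross-derivative product assembles the hypergradient. This illustrates the general single-loop pattern only, not the paper's decentralized algorithm; the toy losses (shared with the ITD sketch above) and all step sizes are assumptions.

```python
import jax
import jax.numpy as jnp

def g_inner(x, y):  # same toy inner loss as in the ITD sketch
    return 0.5 * jnp.sum((y - x) ** 2) + 0.1 * jnp.sum(y ** 2)

def f_outer(x, y):  # same toy outer loss
    return jnp.sum((y - 1.0) ** 2) + 0.01 * jnp.sum(x ** 2)

def single_loop_step(x, y, v, alpha=0.05, beta=0.1, gamma=0.1):
    gy_g = jax.grad(g_inner, argnums=1)
    # Matrix-vector product 1: Hessian-vector product (grad2_yy g) v.
    hv = jax.jvp(gy_g, (x, y), (jnp.zeros_like(x), v))[1]
    # Matrix-vector product 2: cross term (grad2_xy g) v.
    cross = jax.grad(lambda xx: jnp.vdot(gy_g(xx, y), v))(x)
    hypergrad = jax.grad(f_outer, argnums=0)(x, y) - cross
    # One step each for x, y, and v: no nested inner loop.
    x_new = x - alpha * hypergrad
    y_new = y - beta * gy_g(x, y)
    v_new = v - gamma * (hv - jax.grad(f_outer, argnums=1)(x, y))
    return x_new, y_new, v_new

x = y = v = jnp.zeros(3)
for _ in range(200):
    x, y, v = single_loop_step(x, y, v)
```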
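And as referenced in the amortized implicit differentiation entry, a warm-start strategy carries the inner iterate y and the linear-system solution v across outer iterations rather than recomputing them from scratch. The sketch below is a hedged illustration under the same toy assumptions; cg's x0 argument supplies the warm start, and the step sizes and iteration counts (T, inner_steps, maxiter) are hypothetical.

```python
import jax
import jax.numpy as jnp
from jax.scipy.sparse.linalg import cg

def g_inner(x, y):  # same toy losses as above
    return 0.5 * jnp.sum((y - x) ** 2) + 0.1 * jnp.sum(y ** 2)

def f_outer(x, y):
    return jnp.sum((y - 1.0) ** 2) + 0.01 * jnp.sum(x ** 2)

def warm_start_loop(x, T=100, alpha=0.05, inner_steps=1, beta=0.1):
    y = jnp.zeros_like(x)  # inner iterate, carried across outer iterations
    v = jnp.zeros_like(x)  # linear-system solution, also carried over
    gy_g = jax.grad(g_inner, argnums=1)
    for _ in range(T):
        for _ in range(inner_steps):
            y = y - beta * gy_g(x, y)  # a few warm-started inner steps
        hvp = lambda u: jax.jvp(gy_g, (x, y), (jnp.zeros_like(x), u))[1]
        # A handful of CG iterations suffice because v starts near the
        # previous outer iteration's solution (the amortization).
        v, _ = cg(hvp, jax.grad(f_outer, argnums=1)(x, y), x0=v, maxiter=5)
        cross = jax.grad(lambda xx: jnp.vdot(gy_g(xx, y), v))(x)
        x = x - alpha * (jax.grad(f_outer, argnums=0)(x, y) - cross)
    return x

x_opt = warm_start_loop(jnp.ones(3))
```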