Linearization Algorithms for Fully Composite Optimization
- URL: http://arxiv.org/abs/2302.12808v2
- Date: Wed, 12 Jul 2023 14:07:17 GMT
- Title: Linearization Algorithms for Fully Composite Optimization
- Authors: Maria-Luiza Vladarean, Nikita Doikov, Martin Jaggi, Nicolas Flammarion
- Abstract summary: This paper studies first-order algorithms for solving fully composite optimization problems over convex compact sets.
We leverage the structure of the objective by handling its differentiable and non-differentiable components separately, linearizing only the smooth parts.
- Score: 61.20539085730636
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: This paper studies first-order algorithms for solving fully composite
optimization problems over convex and compact sets. We leverage the structure
of the objective by handling its differentiable and non-differentiable
components separately, linearizing only the smooth parts. This provides us with
new generalizations of the classical Frank-Wolfe method and the Conditional
Gradient Sliding algorithm, which cater to a subclass of non-differentiable
problems. Our algorithms rely on a stronger version of the linear minimization
oracle, which can be efficiently implemented in several practical applications.
We provide the basic version of our method with an affine-invariant analysis
and prove global convergence rates for both convex and non-convex objectives.
Furthermore, in the convex case, we propose an accelerated method with
correspondingly improved complexity. Finally, we provide illustrative
experiments to support our theoretical results.
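The core idea in the abstract, linearize only the smooth part of the objective and push the nonsmooth part into a stronger linear minimization oracle, can be illustrated with a minimal sketch. This is not the paper's exact algorithm: the feasible set (the box [-1, 1]^n), the smooth least-squares term, the l1 penalty, the step-size rule 2/(k+2), and all function names below are illustrative assumptions chosen so the stronger oracle has a closed form.

```python
import numpy as np

def generalized_lmo(grad, lam):
    """Stronger linear minimization oracle over the box [-1, 1]^n:
    argmin_{y in [-1,1]^n}  <grad, y> + lam * ||y||_1.
    Separable per coordinate: y_i = -sign(grad_i) if |grad_i| > lam, else 0."""
    y = -np.sign(grad)
    y[np.abs(grad) <= lam] = 0.0
    return y

def fully_composite_fw(A, b, lam, iters=200):
    """Sketch of a Frank-Wolfe-style method for
    min_{x in [-1,1]^n} 0.5*||Ax - b||^2 + lam*||x||_1,
    linearizing only the smooth part and keeping the l1 term in the oracle."""
    n = A.shape[1]
    x = np.zeros(n)
    for k in range(iters):
        grad = A.T @ (A @ x - b)           # gradient of the smooth part only
        v = generalized_lmo(grad, lam)     # oracle handles the nonsmooth part exactly
        gamma = 2.0 / (k + 2.0)            # classical open-loop step size
        x = (1 - gamma) * x + gamma * v    # convex combination stays feasible
    return x

rng = np.random.default_rng(0)
A = rng.standard_normal((20, 10))
b = rng.standard_normal(20)
x = fully_composite_fw(A, b, lam=0.1)
obj = 0.5 * np.linalg.norm(A @ x - b) ** 2 + 0.1 * np.abs(x).sum()
```

Because the oracle sees the l1 term exactly rather than through a subgradient, each step solves a separable piecewise-linear problem in closed form, which is the kind of efficiently implementable stronger oracle the abstract refers to.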
Related papers
- Towards Differentiable Multilevel Optimization: A Gradient-Based Approach [1.6114012813668932]
This paper introduces a novel gradient-based approach for multilevel optimization.
Our method significantly reduces computational complexity while improving both solution accuracy and convergence speed.
To the best of our knowledge, this is one of the first algorithms to provide a general version of implicit differentiation.
arXiv Detail & Related papers (2024-10-15T06:17:59Z) - Projection-Free Variance Reduction Methods for Stochastic Constrained Multi-Level Compositional Optimization [34.628967272528044]
This paper investigates projection-free algorithms for constrained multi-level compositional optimization problems.
By using a stage-wise adaptation, we obtain complexities for convex and strongly convex functions.
arXiv Detail & Related papers (2024-06-06T06:56:56Z) - Oracle Complexity of Single-Loop Switching Subgradient Methods for
Non-Smooth Weakly Convex Functional Constrained Optimization [12.84152722535866]
We consider a non-smooth constrained optimization problem where the objective function is weakly convex.
To solve the problem, we consider the switching subgradient method, which is first-order and easy to implement.
arXiv Detail & Related papers (2023-01-30T22:13:14Z) - Adaptive Stochastic Optimisation of Nonconvex Composite Objectives [2.1700203922407493]
We propose and analyse a family of generalised composite mirror descent algorithms.
With adaptive step sizes, the proposed algorithms converge without requiring prior knowledge of the problem.
We exploit the low-dimensional structure of the decision sets for high-dimensional problems.
arXiv Detail & Related papers (2022-11-21T18:31:43Z) - On a class of geodesically convex optimization problems solved via
Euclidean MM methods [50.428784381385164]
We show how several problems in statistics and machine learning can be written as a difference of Euclidean convex functions and thus solved via Euclidean MM methods.
Ultimately, we hope this broadens the reach of such methods.
arXiv Detail & Related papers (2022-06-22T23:57:40Z) - Amortized Implicit Differentiation for Stochastic Bilevel Optimization [53.12363770169761]
We study a class of algorithms for solving bilevel optimization problems in both deterministic and stochastic settings.
We exploit a warm-start strategy to amortize the estimation of the exact gradient.
By using this framework, our analysis shows these algorithms to match the computational complexity of methods that have access to an unbiased estimate of the gradient.
arXiv Detail & Related papers (2021-11-29T15:10:09Z) - SUPER-ADAM: Faster and Universal Framework of Adaptive Gradients [99.13839450032408]
It is desired to design a universal framework for adaptive algorithms to solve general problems.
In particular, our novel framework provides convergence guarantees for adaptive methods in the non-convex setting.
arXiv Detail & Related papers (2021-06-15T15:16:28Z) - Distributed Proximal Splitting Algorithms with Rates and Acceleration [7.691755449724637]
We derive sublinear and linear convergence results with new rates on the function value suboptimality or distance to the solution.
We propose distributed variants of these algorithms, which can be accelerated as well.
arXiv Detail & Related papers (2020-10-02T12:35:09Z) - Conditional gradient methods for stochastically constrained convex
minimization [54.53786593679331]
We propose two novel conditional gradient-based methods for solving structured convex optimization problems.
The most important feature of our framework is that only a subset of the constraints is processed at each iteration.
Our algorithms rely on variance reduction and smoothing used in conjunction with conditional gradient steps, and are accompanied by rigorous convergence guarantees.
arXiv Detail & Related papers (2020-07-07T21:26:35Z) - Cogradient Descent for Bilinear Optimization [124.45816011848096]
We introduce a Cogradient Descent algorithm (CoGD) to address the bilinear problem.
We solve one variable by considering its coupling relationship with the other, leading to a synchronous gradient descent.
Our algorithm is applied to solve problems with one variable under the sparsity constraint.
arXiv Detail & Related papers (2020-06-16T13:41:54Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.