A Simple yet Universal Strategy for Online Convex Optimization
- URL: http://arxiv.org/abs/2105.03681v1
- Date: Sat, 8 May 2021 11:43:49 GMT
- Title: A Simple yet Universal Strategy for Online Convex Optimization
- Authors: Lijun Zhang, Guanghui Wang, Jinfeng Yi, Tianbao Yang
- Abstract summary: We propose a simple strategy for universal online convex optimization.
The key idea is to construct a set of experts to process the original online functions, and deploy a meta-algorithm over the \emph{linearized} losses.
Our strategy inherits the theoretical guarantee of any expert designed for strongly convex functions and exponentially concave functions.
- Score: 97.64589722655612
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Recently, several universal methods have been proposed for online convex
optimization, and attain minimax rates for multiple types of convex functions
simultaneously. However, they need to design and optimize one surrogate loss
for each type of functions, which makes it difficult to exploit the structure
of the problem and utilize the vast amount of existing algorithms. In this
paper, we propose a simple strategy for universal online convex optimization,
which avoids these limitations. The key idea is to construct a set of experts
to process the original online functions, and deploy a meta-algorithm over the
\emph{linearized} losses to aggregate predictions from experts. Specifically,
we choose Adapt-ML-Prod to track the best expert, because it has a second-order
bound and can be used to leverage strong convexity and exponential concavity.
In this way, we can plug in off-the-shelf online solvers as black-box experts
to deliver problem-dependent regret bounds. Furthermore, our strategy inherits
the theoretical guarantee of any expert designed for strongly convex functions
and exponentially concave functions, up to a double logarithmic factor. For
general convex functions, it maintains the minimax optimality and also achieves
a small-loss bound.
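The strategy described in the abstract lends itself to a compact sketch. The toy implementation below is an illustrative assumption, not the paper's method: plain multiplicative weights stands in for Adapt-ML-Prod, and `OGDExpert`, `run_universal_oco`, and the 1-D loss are hypothetical names. It shows the core mechanics: experts process the original losses as black boxes, while the meta-algorithm is updated only with the linearized losses $\langle g_t, x \rangle$.

```python
import numpy as np

class OGDExpert:
    """Hypothetical black-box expert: online gradient descent on [-1, 1].

    Any off-the-shelf online solver with a predict/update interface
    could be plugged in here -- that is the point of the strategy."""
    def __init__(self, step):
        self.x, self.step = 0.0, step

    def predict(self):
        return self.x

    def update(self, g):
        self.x = float(np.clip(self.x - self.step * g, -1.0, 1.0))

def run_universal_oco(experts, grad_fn, T, eta=0.5):
    """Aggregate expert predictions with a multiplicative-weights
    meta-algorithm driven by *linearized* losses g_t * x_i (1-D case).

    grad_fn(x) returns the gradient of the round's loss at x."""
    log_w = np.zeros(len(experts))       # log-weights over experts
    x = 0.0
    for _ in range(T):
        preds = np.array([e.predict() for e in experts])
        w = np.exp(log_w - log_w.max())
        w /= w.sum()
        x = float(w @ preds)             # aggregated decision x_t
        g = grad_fn(x)                   # one gradient query per round
        log_w -= eta * preds * g         # meta-update on linearized losses
        for e in experts:
            e.update(g)                  # every expert sees the same gradient
    return x

# Toy run: losses f_t(x) = (x - 1)^2, whose minimizer over [-1, 1] is x = 1.
experts = [OGDExpert(s) for s in (0.01, 0.1, 0.5)]
x_final = run_universal_oco(experts, lambda x: 2.0 * (x - 1.0), T=300)
```

Note that the meta-algorithm queries a single gradient at the aggregated point per round, regardless of the number of experts; this is what makes linearization compatible with plugging in off-the-shelf solvers.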
Related papers
- Universal Online Convex Optimization with $1$ Projection per Round [31.16522982351235]
We develop universal algorithms that simultaneously attain minimax rates for multiple types of convex functions.
We employ a surrogate loss defined over simpler domains to develop universal OCO algorithms that only require $1$ projection.
Our analysis sheds new light on the surrogate loss, facilitating rigorous examination of the discrepancy between the regret of the original loss and that of the surrogate loss.
arXiv Detail & Related papers (2024-05-30T05:29:40Z) - Bridging the Gap Between General and Down-Closed Convex Sets in
Submodular Maximization [8.225819874406238]
Mualem [mualem23re] shows that this approach cannot smoothly interpolate between down-closed and non-down-closed constraints.
In this work, we suggest novel offline and online algorithms based on a natural decomposition of the body into two distinct convex bodies.
We also empirically demonstrate the superiority of our proposed algorithms across three offline and two online applications.
arXiv Detail & Related papers (2024-01-17T14:56:42Z) - Universal Online Learning with Gradient Variations: A Multi-layer Online Ensemble Approach [57.92727189589498]
We propose an online convex optimization approach with two different levels of adaptivity.
We obtain $\mathcal{O}(\log V_T)$, $\mathcal{O}(d \log V_T)$ and $\hat{\mathcal{O}}(\sqrt{V_T})$ regret bounds for strongly convex, exp-concave and convex loss functions, respectively.
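Here $V_T$ denotes the gradient variation of the loss sequence; the snippet does not state its definition, but the standard form in this line of work is

```latex
% Gradient variation of the online loss sequence f_1, ..., f_T
% over the feasible domain \mathcal{X}:
V_T = \sum_{t=2}^{T} \sup_{\mathbf{x} \in \mathcal{X}}
      \left\| \nabla f_t(\mathbf{x}) - \nabla f_{t-1}(\mathbf{x}) \right\|^2
```

so the bounds above are small when consecutive losses change slowly.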
arXiv Detail & Related papers (2023-07-17T09:55:35Z) - Linearization Algorithms for Fully Composite Optimization [61.20539085730636]
This paper studies first-order algorithms for solving fully composite optimization problems over convex compact sets.
We leverage the structure of the objective by handling the differentiable and non-differentiable components separately, linearizing only the smooth parts.
arXiv Detail & Related papers (2023-02-24T18:41:48Z) - Efficient First-order Methods for Convex Optimization with Strongly
Convex Function Constraints [3.667453772837954]
We show how to minimize a convex function subject to strongly convex function constraints.
We also identify the sparsity pattern of the solution within a finite number of iterations, a result that appears to have independent significance.
arXiv Detail & Related papers (2022-12-21T16:04:53Z) - On a class of geodesically convex optimization problems solved via
Euclidean MM methods [50.428784381385164]
We show how a class of geodesically convex problems arising in statistics and machine learning can be rewritten as a difference of Euclidean convex functions and tackled with Euclidean MM methods.
Ultimately, we hope this viewpoint benefits the broader body of work on these problems.
arXiv Detail & Related papers (2022-06-22T23:57:40Z) - A Constrained Optimization Approach to Bilevel Optimization with
Multiple Inner Minima [49.320758794766185]
We propose a new approach, which converts the bilevel problem into an equivalent constrained optimization problem, to which a primal-dual algorithm can then be applied.
Such an approach enjoys a few advantages: (a) it addresses the multiple-inner-minima challenge; (b) it is fully first-order, requiring no Jacobian computations.
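To make the reformulation idea concrete, here is a hedged toy sketch, not the paper's algorithm: the problem, step sizes, and the function `solve_toy_bilevel` are all invented for illustration. The inner problem $\min_y (y - x)^2$ is replaced by the value-function constraint $(y - x)^2 \le 0$, and primal-dual gradient steps are run on the resulting Lagrangian.

```python
# Toy bilevel problem: outer objective f(x, y) = (x - 1)^2 + y^2,
# inner problem y*(x) = argmin_y (y - x)^2, so y* = x and the
# bilevel optimum is x = y = 0.5.
# Reformulate the inner problem as the value-function constraint
#   g(x, y) - min_y' g(x, y') = (y - x)^2 <= 0,
# then run gradient descent/ascent on the Lagrangian
#   L(x, y, lam) = (x - 1)^2 + y^2 + lam * (y - x)^2.

def solve_toy_bilevel(iters=20000, lr=0.01, rho=0.5):
    x, y, lam = 0.0, 0.0, 0.0
    for _ in range(iters):
        gap = y - x                               # constraint residual
        dx = 2.0 * (x - 1.0) - 2.0 * lam * gap    # dL/dx
        dy = 2.0 * y + 2.0 * lam * gap            # dL/dy
        x -= lr * dx                              # primal descent
        y -= lr * dy
        lam = max(0.0, lam + rho * gap * gap)     # dual ascent on violation
    return x, y

x_opt, y_opt = solve_toy_bilevel()
```

The sketch is fully first-order in the sense the summary describes: it never differentiates the inner solution map, only the constraint itself.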
arXiv Detail & Related papers (2022-03-01T18:20:01Z) - Recent Theoretical Advances in Non-Convex Optimization [56.88981258425256]
Motivated by recent increased interest in the analysis of optimization algorithms for non-convex problems arising in deep networks and other data-driven applications, we give an overview of recent theoretical results on optimization algorithms for non-convex optimization.
arXiv Detail & Related papers (2020-12-11T08:28:51Z) - Extracting Optimal Solution Manifolds using Constrained Neural
Optimization [6.800113407368289]
Constrained optimization solution algorithms are restricted to point-based solutions.
We present an approach for extracting optimal solution sets as approximate manifolds via constrained neural optimization.
arXiv Detail & Related papers (2020-09-13T15:37:44Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences arising from its use.