Efficient Projection-Free Online Convex Optimization with Membership Oracle
- URL: http://arxiv.org/abs/2111.05818v1
- Date: Wed, 10 Nov 2021 17:22:29 GMT
- Title: Efficient Projection-Free Online Convex Optimization with Membership Oracle
- Authors: Zakaria Mhammedi
- Abstract summary: We present a new reduction that turns any algorithm A defined on a Euclidean ball into an algorithm on a constrained set C contained within the ball.
Our reduction requires O(T log T) calls to a Membership Oracle on C after T rounds, and no linear optimization on C is needed.
- Score: 11.745866777357566
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: In constrained convex optimization, existing methods based on the ellipsoid
or cutting plane method do not scale well with the dimension of the ambient
space. Alternative approaches such as Projected Gradient Descent only provide a
computational benefit for simple convex sets such as Euclidean balls, where
Euclidean projections can be performed efficiently. For other sets, the cost of
the projections can be too high. To circumvent these issues, alternative
methods based on the famous Frank-Wolfe algorithm have been studied and used.
Such methods use a Linear Optimization Oracle at each iteration instead of
Euclidean projections; the former can often be performed efficiently. Such
methods have also been extended to the online and stochastic optimization
settings. However, the Frank-Wolfe algorithm and its variants do not achieve
the optimal performance, in terms of regret or rate, for general convex sets.
What is more, the Linear Optimization Oracle they use can still be
computationally expensive in some cases. In this paper, we move away from
Frank-Wolfe style algorithms and present a new reduction that turns any
algorithm A defined on a Euclidean ball (where projections are cheap) into an
algorithm on a constrained set C contained within the ball, without sacrificing
the performance of the original algorithm A by much. Our reduction requires O(T
log T) calls to a Membership Oracle on C after T rounds, and no linear
optimization on C is needed. Using our reduction, we recover optimal regret
bounds [resp. rates], in terms of the number of iterations, in online [resp.
stochastic] convex optimization. Our guarantees are also useful in the offline
convex optimization setting when the dimension of the ambient space is large.
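One source of the per-round log factor in the O(T log T) oracle-call bound is bisection along the ray from the origin to an infeasible point: with a Membership Oracle alone, such a point can be pulled back into C by radial scaling. The NumPy sketch below shows only this scaling primitive, assuming C is convex with the origin in its interior; the paper's full reduction is more subtle (it combines this kind of query with gauge-function-based surrogate losses to preserve regret), and the function name and toy set here are illustrative.

```python
import numpy as np

def scale_into_set(x, is_member, tol=1e-6):
    """Map x to a nearby feasible point on the segment [0, x] using only
    membership queries.

    Assumes C is convex with the origin in its interior, so
    {t in [0, 1] : t * x in C} is an interval [0, t_max]. Bisection finds
    t_max to accuracy tol with O(log(1/tol)) membership calls, which is
    where a per-round logarithmic factor can come from.
    """
    if is_member(x):
        return x                      # already feasible
    lo, hi = 0.0, 1.0                 # invariant: lo * x in C, hi * x not in C
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if is_member(mid * x):
            lo = mid
        else:
            hi = mid
    return lo * x

# Toy usage: C is the unit l1-ball, so membership is a cheap norm check.
is_member = lambda z: np.abs(z).sum() <= 1.0
x = np.array([0.9, 0.8])              # iterate produced by the ball algorithm
print(scale_into_set(x, is_member))   # scaled back onto (approximately) the l1 sphere
```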
Related papers
- Sparsity-Constraint Optimization via Splicing Iteration [1.3622424109977902]
We develop an algorithm named Sparsity-Constraint Optimization via sPlicing itEration (SCOPE).
SCOPE converges effectively without tuning parameters.
We apply SCOPE to solve quadratic optimization, learn sparse classifiers, and recover sparse Markov networks for binary variables.
Our open-source Python package skscope, based on a C++ implementation, is publicly available on GitHub; a schematic sketch of the splicing loop follows below.
arXiv Detail & Related papers (2024-06-17T18:34:51Z)
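As a rough illustration of the splicing idea referenced above, the sketch below alternates restricted least squares with coordinate swaps between the active and inactive sets, accepting a swap only when the loss drops. The function name, initialization, and swap rule are simplifications of mine; SCOPE's actual sacrifice-based splicing and support-size selection live in the authors' skscope package.

```python
import numpy as np

def splicing_sparse_lsq(X, y, s, max_swap=2, n_iter=50):
    """Schematic splicing loop for  min_b ||y - X b||^2  s.t.  ||b||_0 <= s.

    Alternates (1) least squares restricted to the active set with
    (2) trial swaps of the k weakest active coordinates for the k inactive
    coordinates with the largest gradient magnitude, keeping a swap only
    if the squared loss decreases.
    """
    n, p = X.shape
    active = set(np.argsort(-np.abs(X.T @ y))[:s])   # init from marginal correlations

    def fit(A):
        A = sorted(A)
        b = np.zeros(p)
        b[A], *_ = np.linalg.lstsq(X[:, A], y, rcond=None)
        return b, float(np.sum((y - X @ b) ** 2))

    b, loss = fit(active)
    for _ in range(n_iter):
        grad = X.T @ (X @ b - y)                     # gradient of the loss
        improved = False
        for k in range(1, max_swap + 1):
            A = sorted(active)
            I = sorted(set(range(p)) - active)
            out = [A[i] for i in np.argsort(np.abs(b[A]))[:k]]      # weakest active
            inn = [I[i] for i in np.argsort(-np.abs(grad[I]))[:k]]  # strongest inactive
            cand = (active - set(out)) | set(inn)
            b_new, loss_new = fit(cand)
            if loss_new < loss:
                active, b, loss, improved = cand, b_new, loss_new, True
                break
        if not improved:
            break                                    # no swap helps: splicing converged
    return b

# Toy usage: recover a 3-sparse signal from noisy measurements.
rng = np.random.default_rng(0)
X = rng.standard_normal((100, 20))
beta_true = np.zeros(20); beta_true[[2, 5, 11]] = [1.5, -2.0, 1.0]
y = X @ beta_true + 0.05 * rng.standard_normal(100)
print(np.nonzero(splicing_sparse_lsq(X, y, s=3))[0])   # expect [ 2  5 11 ]
```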
- Nonsmooth Projection-Free Optimization with Functional Constraints [12.20060970177791]
This paper presents a subgradient-based algorithm for constrained nonsmooth convex optimization.
Our proposed algorithm can handle nonsmooth problems with general convex functional inequality constraints.
Similar performance is observed when deterministic subgradients are replaced with stochastic subgradients; a sketch of the classic switching-subgradient baseline for this problem family follows below.
arXiv Detail & Related papers (2023-11-18T23:06:33Z)
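For context, a classic member of this algorithm family is Polyak's switching subgradient method: step along a subgradient of the constraint while it is violated, and along a subgradient of the objective otherwise. The sketch below shows that baseline, not the paper's algorithm; names, step sizes, and the toy problem are illustrative.

```python
import numpy as np

def switching_subgradient(f_sub, g, g_sub, x0, steps=5000, tol=1e-3):
    """Polyak-style switching subgradient method for
        min f(x)  subject to  g(x) <= 0,
    with f and g convex but possibly nonsmooth.  When the constraint is
    violated beyond tol, step along a subgradient of g to restore
    feasibility; otherwise step along a subgradient of f.  (Averaging the
    feasible iterates gives the usual O(1/sqrt(T)) guarantee; omitted here.)
    """
    x = np.asarray(x0, dtype=float)
    for t in range(1, steps + 1):
        d = g_sub(x) if g(x) > tol else f_sub(x)
        eta = 1.0 / np.sqrt(t)                       # standard O(1/sqrt(t)) step size
        x = x - eta * d / (np.linalg.norm(d) + 1e-12)
    return x

# Toy usage: minimize |x1| + |x2| subject to 1 - x1 - x2 <= 0 (optimum near (0.5, 0.5)).
f_sub = lambda x: np.sign(x)
g     = lambda x: 1.0 - x[0] - x[1]
g_sub = lambda x: np.array([-1.0, -1.0])
print(switching_subgradient(f_sub, g, g_sub, np.array([2.0, 2.0])))
```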
- Accelerating Cutting-Plane Algorithms via Reinforcement Learning Surrogates [49.84541884653309]
A current standard approach to solving convex discrete optimization problems is the use of cutting-plane algorithms.
Despite the existence of a number of general-purpose cut-generating algorithms, large-scale discrete optimization problems continue to suffer from intractability.
We propose a method for accelerating cutting-plane algorithms via reinforcement learning.
arXiv Detail & Related papers (2023-07-17T20:11:56Z)
- Projection-Free Online Convex Optimization via Efficient Newton Iterations [10.492474737007722]
This paper presents new projection-free algorithms for Online Convex Optimization (OCO) over a convex domain $\mathcal{K} \subset \mathbb{R}^d$.
arXiv Detail & Related papers (2023-06-19T18:48:53Z)
- Riemannian Projection-free Online Learning [5.918057694291832]
The projection operation is a critical component in a range of optimization algorithms, such as online gradient descent (OGD).
It suffers from computational limitations in high-dimensional settings or when dealing with ill-conditioned constraint sets.
This paper presents methods for obtaining sub-linear regret guarantees in online geodesically convex optimization on curved spaces.
arXiv Detail & Related papers (2023-05-30T18:22:09Z)
- Accelerated First-Order Optimization under Nonlinear Constraints [73.2273449996098]
We exploit the connection between first-order algorithms for constrained optimization and non-smooth dynamical systems to design a new class of accelerated first-order algorithms.
An important property of these algorithms is that constraints are expressed in terms of velocities instead of positions (a minimal sketch follows below).
arXiv Detail & Related papers (2023-02-01T08:50:48Z)
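A minimal sketch of the velocity-constraint idea, assuming a single smooth constraint g(x) <= 0 so the velocity projection has a closed form (the papers handle many constraints via a small quadratic program; the function name, parameters, and toy problem are illustrative):

```python
import numpy as np

def velocity_constrained_gd(grad_f, g, grad_g, x0, eta=0.05, alpha=1.0, steps=500):
    """Sketch of constrained descent for  min f(x)  s.t.  g(x) <= 0.

    Instead of projecting x onto the feasible set, each step projects the
    velocity -grad f(x) onto the halfspace
        {v : <grad g(x), v> <= -alpha * g(x)},
    which only restricts motion when the constraint is active or violated.
    For one constraint the halfspace projection is closed-form.
    """
    x = np.asarray(x0, dtype=float)
    for _ in range(steps):
        v = -grad_f(x)
        a, b = grad_g(x), -alpha * g(x)
        if a @ v > b:                           # halfspace constraint is binding
            v = v - ((a @ v - b) / (a @ a)) * a
        x = x + eta * v
    return x

# Toy usage: pull x toward (2, 0) while keeping ||x||^2 - 1 <= 0.
grad_f = lambda x: 2 * (x - np.array([2.0, 0.0]))
g      = lambda x: x @ x - 1.0
grad_g = lambda x: 2 * x
print(velocity_constrained_gd(grad_f, g, grad_g, np.array([0.0, 0.5])))  # ~ (1, 0)
```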
- On Constraints in First-Order Optimization: A View from Non-Smooth Dynamical Systems [99.59934203759754]
We introduce a class of first-order methods for smooth constrained optimization.
Two distinctive features of our approach are that projections or optimizations over the entire feasible set are avoided, and that constraints are expressed in terms of velocities rather than positions.
The resulting algorithmic procedure is simple to implement even when constraints are nonlinear.
arXiv Detail & Related papers (2021-07-17T11:45:13Z)
- Bayesian Algorithm Execution: Estimating Computable Properties of Black-box Functions Using Mutual Information [78.78486761923855]
In many real world problems, we want to infer some property of an expensive black-box function f, given a budget of T function evaluations.
We present a procedure, InfoBAX, that sequentially chooses queries that maximize mutual information with respect to the algorithm's output.
On these problems, InfoBAX uses up to 500 times fewer queries to f than required by the original algorithm.
arXiv Detail & Related papers (2021-04-19T17:22:11Z)
- Effective Dimension Adaptive Sketching Methods for Faster Regularized Least-Squares Optimization [56.05635751529922]
We propose a new randomized algorithm for solving L2-regularized least-squares problems based on sketching.
We consider two of the most popular random embeddings, namely, Gaussian embeddings and the Subsampled Randomized Hadamard Transform (SRHT); a sketch-and-solve example with a Gaussian embedding follows below.
arXiv Detail & Related papers (2020-06-10T15:00:09Z)
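As a hedged illustration of sketch-and-solve with a Gaussian embedding (one of the two embeddings named above), the snippet below compresses the data matrix before solving ridge regression. The paper's actual contribution, choosing the sketch size adaptively from the problem's effective dimension, is not implemented here; the function name and sizes are illustrative.

```python
import numpy as np

def sketched_ridge(X, y, lam, m, seed=None):
    """Sketch-and-solve ridge regression with a Gaussian embedding.

    Replaces the n x d data matrix with an m x d sketch S @ X (m << n)
    and solves the small regularized normal equations.  An SRHT sketch
    would play the same role with a faster matrix-apply.
    """
    rng = np.random.default_rng(seed)
    n, d = X.shape
    S = rng.standard_normal((m, n)) / np.sqrt(m)     # Gaussian embedding
    Xs, ys = S @ X, S @ y
    return np.linalg.solve(Xs.T @ Xs + lam * np.eye(d), Xs.T @ ys)

# Toy usage: a 10,000 x 50 problem compressed to 500 sketched rows.
rng = np.random.default_rng(0)
X = rng.standard_normal((10_000, 50))
y = X @ rng.standard_normal(50) + 0.1 * rng.standard_normal(10_000)
w = sketched_ridge(X, y, lam=1.0, m=500, seed=1)
```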
- Private Stochastic Convex Optimization: Optimal Rates in Linear Time [74.47681868973598]
We study the problem of minimizing the population loss given i.i.d. samples from a distribution over convex loss functions.
A recent work of Bassily et al. has established the optimal bound on the excess population loss achievable given $n$ samples.
We describe two new techniques for deriving convex optimization algorithms, both achieving the optimal bound on excess loss and using $O(\min\{n, n^2/d\})$ gradient computations.
arXiv Detail & Related papers (2020-05-10T19:52:03Z)
This list is automatically generated from the titles and abstracts of the papers on this site.