On the Complexity of Finding Small Subgradients in Nonsmooth
Optimization
- URL: http://arxiv.org/abs/2209.10346v1
- Date: Wed, 21 Sep 2022 13:30:00 GMT
- Title: On the Complexity of Finding Small Subgradients in Nonsmooth
Optimization
- Authors: Guy Kornowski, Ohad Shamir
- Abstract summary: We show that no dimension-free rate can be achieved by a deterministic algorithm.
We show how the convergence rate of finding $(\delta,\epsilon)$-stationary points can be improved in case the function is convex.
- Score: 31.714928102950584
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: We study the oracle complexity of producing $(\delta,\epsilon)$-stationary
points of Lipschitz functions, in the sense proposed by Zhang et al. [2020].
While there exist dimension-free randomized algorithms for producing such
points within $\widetilde{O}(1/\delta\epsilon^3)$ first-order oracle calls, we
show that no dimension-free rate can be achieved by a deterministic algorithm.
On the other hand, we point out that this rate can be derandomized for smooth
functions with merely a logarithmic dependence on the smoothness parameter.
Moreover, we establish several lower bounds for this task which hold for any
randomized algorithm, with or without convexity. Finally, we show how the
convergence rate of finding $(\delta,\epsilon)$-stationary points can be
improved in case the function is convex, a setting which we motivate by proving
that in general no finite time algorithm can produce points with small
subgradients even for convex functions.
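For orientation, the $(\delta,\epsilon)$-stationarity notion used throughout is the one of Zhang et al. [2020], built on the Goldstein $\delta$-subdifferential; the LaTeX snippet below states this standard definition and is provided as a reference point rather than quoted from the paper.

```latex
% Goldstein delta-subdifferential of a Lipschitz function f at x:
% the convex hull of all subgradients taken within a delta-ball around x.
\[
  \partial_{\delta} f(x) \;=\; \operatorname{conv}\!\Bigl( \textstyle\bigcup_{\,y:\ \|y-x\|\le \delta} \partial f(y) \Bigr).
\]
% A point x is (delta,epsilon)-stationary if this set contains a short vector:
\[
  \min_{g \in \partial_{\delta} f(x)} \|g\| \;\le\; \epsilon.
\]
% As stated in the abstract, randomized algorithms can produce such points
% within \widetilde{O}(1/(\delta\epsilon^3)) first-order oracle calls,
% and this is the dimension-free rate the paper contrasts with the
% deterministic case.
```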
Related papers
- Stochastic Zeroth-Order Optimization under Strongly Convexity and Lipschitz Hessian: Minimax Sample Complexity [59.75300530380427]
We consider the problem of optimizing second-order smooth and strongly convex functions where the algorithm has access only to noisy evaluations of the objective function at the points it queries.
We provide the first tight characterization for the rate of the minimax simple regret by developing matching upper and lower bounds.
arXiv Detail & Related papers (2024-06-28T02:56:22Z)
- An Algorithm with Optimal Dimension-Dependence for Zero-Order Nonsmooth Nonconvex Stochastic Optimization [37.300102993926046]
We study the complexity of producing $(\delta,\epsilon)$-stationary points of Lipschitz objectives that are possibly neither smooth nor convex, using only zero-order evaluations.
Our analysis is based on a simple yet powerful lemma regarding the Goldstein-subdifferential set, which allows us to leverage recent advancements in nonsmooth nonconvex optimization (see the illustrative zero-order sketch after this list).
arXiv Detail & Related papers (2023-07-10T11:56:04Z)
- Gradient-free optimization of highly smooth functions: improved analysis and a new algorithm [87.22224691317766]
This work studies problems with zero-order noisy oracle information under the assumption that the objective function is highly smooth.
We consider two kinds of zero-order projected gradient descent algorithms.
arXiv Detail & Related papers (2023-06-03T17:05:13Z)
- Deterministic Nonsmooth Nonconvex Optimization [94.01526844386977]
We show that randomization is necessary to obtain a dimension-free rate.
Our algorithm yields the first deterministic dimension-free algorithm for optimizing ReLU networks.
arXiv Detail & Related papers (2023-02-16T13:57:19Z)
- A Projection-free Algorithm for Constrained Stochastic Multi-level Composition Optimization [12.096252285460814]
We propose a projection-free conditional gradient-type algorithm for composition optimization.
We show that the number of calls to the oracle and to the linear-minimization oracle required by the proposed algorithm are of order $\mathcal{O}_T(\epsilon^{-2})$ and $\mathcal{O}_T(\epsilon^{-3})$, respectively.
arXiv Detail & Related papers (2022-02-09T06:05:38Z)
- Complexity of Inexact Proximal Point Algorithm for minimizing convex functions with Holderian Growth [1.9643748953805935]
We derive nonasymptotic complexity of exact and inexact PPA to minimize convex functions under $\gamma$-Hölderian growth.
Our numerical tests show improvements over existing restarting versions of the Subgradient Method.
arXiv Detail & Related papers (2021-08-10T07:15:07Z)
- Oracle Complexity in Nonsmooth Nonconvex Optimization [49.088972349825085]
It is well-known that given a smooth, bounded-from-below function, gradient-based methods can find $\epsilon$-stationary points.
In this paper, we prove an inherent trade-off between oracle complexity and the dimension dependence incurred by smoothing the objective.
arXiv Detail & Related papers (2021-04-14T10:42:45Z)
- Exploiting Higher Order Smoothness in Derivative-free Optimization and Continuous Bandits [99.70167985955352]
We study the problem of zero-order optimization of a strongly convex function.
We consider a randomized approximation of the projected gradient descent algorithm.
Our results imply that the zero-order algorithm is nearly optimal in terms of sample complexity and the problem parameters.
arXiv Detail & Related papers (2020-06-14T10:42:23Z)
- Complexity of Finding Stationary Points of Nonsmooth Nonconvex Functions [84.49087114959872]
We provide the first non-asymptotic analysis for finding stationary points of nonsmooth, nonconvex functions.
In particular, we study Hadamard semi-differentiable functions, perhaps the largest class of nonsmooth functions for which the chain rule of calculus holds.
arXiv Detail & Related papers (2020-02-10T23:23:04Z)
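As a concrete illustration of the zero-order oracle model referenced in the zero-order nonsmooth nonconvex entry above, the sketch below implements a standard two-point randomized finite-difference gradient estimator. The function and parameter names (`zero_order_gradient_estimate`, `num_samples`) are assumptions for illustration; this is a generic primitive, not the algorithm of any of the cited papers.

```python
import numpy as np

def zero_order_gradient_estimate(f, x, delta=1e-3, num_samples=32, rng=None):
    """Two-point randomized finite-difference gradient estimator.

    Illustrative sketch only: averages directional finite differences along
    random unit directions, estimating the gradient of a ball-smoothed
    surrogate of f using only zero-order (function value) queries.
    """
    rng = np.random.default_rng() if rng is None else rng
    d = x.shape[0]
    estimate = np.zeros(d)
    for _ in range(num_samples):
        u = rng.standard_normal(d)
        u /= np.linalg.norm(u)  # random unit direction on the sphere
        # Directional finite difference, scaled by the dimension d so that
        # the average estimates the gradient of the smoothed surrogate.
        estimate += d * (f(x + delta * u) - f(x - delta * u)) / (2 * delta) * u
    return estimate / num_samples

if __name__ == "__main__":
    # Example: a nonsmooth Lipschitz objective, f(x) = ||x||_1.
    f = lambda x: np.abs(x).sum()
    x0 = np.array([0.5, -1.0, 2.0])
    print(zero_order_gradient_estimate(f, x0))
```

In expectation this estimator is a gradient of a ball-smoothed surrogate of f, which for Lipschitz f lies in the Goldstein $\delta$-subdifferential defined earlier; this is the standard link between zero-order queries and $(\delta,\epsilon)$-stationarity, not a claim about the specific algorithms listed above.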
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.