Cumulative Regret Analysis of the Piyavskii--Shubert Algorithm and Its
Variants for Global Optimization
- URL: http://arxiv.org/abs/2108.10859v2
- Date: Thu, 28 Dec 2023 16:37:20 GMT
- Title: Cumulative Regret Analysis of the Piyavskii--Shubert Algorithm and Its
Variants for Global Optimization
- Authors: Kaan Gokcesu, Hakan Gokcesu
- Abstract summary: We study the problem of global optimization, where we analyze the performance of the Piyavskii--Shubert algorithm and its variants.
We show that our algorithm efficiently determines its queries and achieves nearly minimax optimal (up to log factors) cumulative regret.
- Score: 0.0
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: We study the problem of global optimization, where we analyze the performance
of the Piyavskii--Shubert algorithm and its variants. For any given time
duration $T$, instead of the extensively studied simple regret (the difference
between the loss of the best estimate up to time $T$ and the global minimum),
we study the cumulative regret up to time $T$. For $L$-Lipschitz
continuous functions, we show that the cumulative regret is $O(L\log T)$. For
$H$-Lipschitz smooth functions, we show that the cumulative regret is $O(H)$.
We analytically extend our results to functions with Hölder continuous
derivatives, which cover both the Lipschitz continuous and the Lipschitz smooth
functions individually. We further show that a simpler variant of the
Piyavskii--Shubert algorithm performs just as well as the traditional variants
for the Lipschitz continuous or the Lipschitz smooth functions. We further
extend our results to broader classes of functions and show that our algorithm
efficiently determines its queries and achieves nearly minimax optimal (up to
log factors) cumulative regret under general convex or even concave regularity
conditions on the extrema of the objective (which encompass many preceding
regularities). We consider further extensions by investigating the performance
of the Piyavskii--Shubert variants in scenarios with unknown regularity, noisy
evaluations, and multivariate domains.
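
As a concrete illustration of the setting above, the sketch below implements the classical univariate Piyavskii--Shubert rule (query the minimizer of the piecewise-linear lower envelope built from past evaluations) and measures the cumulative regret $\sum_{t=1}^{T}\big(f(x_t) - \min f\big)$ that the abstract studies. This is a minimal sketch, assuming a univariate objective on $[0,1]$, a known Lipschitz overestimate, and a fixed query budget; the function names and the toy objective are illustrative and not taken from the paper, whose variants may select queries differently.

```python
def piyavskii_shubert(f, a, b, lipschitz, budget):
    """Sketch of Piyavskii--Shubert minimization of a Lipschitz function f on
    [a, b], using `budget` function evaluations in total.

    The next query is the minimizer of the piecewise-linear lower envelope
    max_i { f(x_i) - L * |x - x_i| } built from all previous queries.
    Returns the queries in evaluation order as (x_t, f(x_t)) pairs.
    """
    xs = [a, b]                       # sorted query locations
    ys = [f(a), f(b)]                 # corresponding function values
    history = list(zip(xs, ys))       # queries in the order they were made

    while len(history) < budget:
        best_lb, best_x, insert_at = float("inf"), None, None
        # On each interval [x_i, x_{i+1}] the lower envelope is minimized at
        # the intersection of the two cones anchored at its endpoints.
        for i in range(len(xs) - 1):
            x0, x1, y0, y1 = xs[i], xs[i + 1], ys[i], ys[i + 1]
            x_cand = 0.5 * (x0 + x1) + (y0 - y1) / (2.0 * lipschitz)
            lb = 0.5 * (y0 + y1) - 0.5 * lipschitz * (x1 - x0)
            if lb < best_lb:
                best_lb, best_x, insert_at = lb, x_cand, i + 1
        y_new = f(best_x)
        xs.insert(insert_at, best_x)  # keep the query locations sorted
        ys.insert(insert_at, y_new)
        history.append((best_x, y_new))
    return history


def cumulative_regret(history, f_min):
    """Cumulative regret up to time T: sum over t of ( f(x_t) - min f )."""
    return sum(y - f_min for _, y in history)


if __name__ == "__main__":
    # Toy 1.1-Lipschitz objective with its global minimum at x = 0.3;
    # lipschitz=1.2 is a valid overestimate of the true constant.
    f = lambda x: abs(x - 0.3) + 0.1 * abs(x - 0.8)
    history = piyavskii_shubert(f, 0.0, 1.0, lipschitz=1.2, budget=50)
    print("cumulative regret:", cumulative_regret(history, f_min=f(0.3)))
```

Each query tightens the lower envelope around the minimizer, and the cumulative regret of this kind of lower-bounding scheme is the quantity bounded by $O(L\log T)$ for $L$-Lipschitz continuous functions in the abstract.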
Related papers
- Stochastic Zeroth-Order Optimization under Strongly Convexity and Lipschitz Hessian: Minimax Sample Complexity [59.75300530380427]
We consider the problem of optimizing second-order smooth and strongly convex functions where the algorithm only has access to noisy evaluations of the objective function it queries.
We provide the first tight characterization for the rate of the minimax simple regret by developing matching upper and lower bounds.
arXiv Detail & Related papers (2024-06-28T02:56:22Z) - Gradient-free optimization of highly smooth functions: improved analysis
and a new algorithm [87.22224691317766]
This work studies problems with zero-order noisy oracle information under the assumption that the objective function is highly smooth.
We consider two kinds of zero-order projected gradient descent algorithms.
arXiv Detail & Related papers (2023-06-03T17:05:13Z) - On Convergence of Incremental Gradient for Non-Convex Smooth Functions [63.51187646914962]
In machine learning and network optimization, algorithms like shuffle SGD are popular because they minimize the number of cache misses and make good use of the cache.
This paper delves into the convergence properties of SGD algorithms with arbitrary data ordering.
arXiv Detail & Related papers (2023-05-30T17:47:27Z) - Efficient Lipschitzian Global Optimization of Hölder Continuous
Multivariate Functions [0.0]
This study presents an effective global optimization technique designed for multivariate functions that are Hölder continuous.
We show that the algorithm attains an average regret bound of $O(T^{-\frac{\alpha}{n}})$ for optimizing a Hölder continuous target function with Hölder exponent $\alpha$ in an $n$-dimensional space within a given time horizon.
arXiv Detail & Related papers (2023-03-24T22:29:35Z) - A Near-Optimal Algorithm for Univariate Zeroth-Order Budget Convex
Optimization [4.608510640547952]
We prove near-optimal optimization error guarantees for Dyadic Search.
We show that the classical dependence on the global Lipschitz constant in the error bounds is an artifact of the granularity of the budget.
arXiv Detail & Related papers (2022-08-13T19:57:04Z) - Efficient Minimax Optimal Global Optimization of Lipschitz Continuous
Multivariate Functions [0.0]
We show that our algorithm achieves an average regret bound of $O(L\sqrt{n}T^{-\frac{1}{n}})$ over the time horizon $T$ for Lipschitz continuous functions.
arXiv Detail & Related papers (2022-06-06T06:28:38Z) - Low Regret Binary Sampling Method for Efficient Global Optimization of
Univariate Functions [0.0]
We study the cumulative regret of the algorithm instead of the simple regret between our best query and the optimal value of the objective function.
Although our approach attains regret results similar to those of the traditional lower-bounding algorithms, it has a major computational cost advantage.
arXiv Detail & Related papers (2022-01-18T18:11:48Z) - Finding Global Minima via Kernel Approximations [90.42048080064849]
We consider the global minimization of smooth functions based solely on function evaluations.
In this paper, we consider an approach that jointly models the function to approximate and finds a global minimum.
arXiv Detail & Related papers (2020-12-22T12:59:30Z) - Exploiting Higher Order Smoothness in Derivative-free Optimization and
Continuous Bandits [99.70167985955352]
We study the problem of zero-order optimization of a strongly convex function.
We consider a randomized approximation of the projected gradient descent algorithm.
Our results imply that the zero-order algorithm is nearly optimal in terms of sample complexity and the problem parameters.
arXiv Detail & Related papers (2020-06-14T10:42:23Z) - SGD with shuffling: optimal rates without component convexity and large
epoch requirements [60.65928290219793]
We consider the RandomShuffle (shuffle at the beginning of each epoch) and SingleShuffle (shuffle only once) schemes.
We establish minimax optimal convergence rates of these algorithms up to poly-log factor gaps.
We further sharpen the tight convergence results for RandomShuffle by removing the drawbacks common to all prior work.
arXiv Detail & Related papers (2020-06-12T05:00:44Z) - Regret analysis of the Piyavskii-Shubert algorithm for global Lipschitz
optimization [1.6822770693792826]
We study a natural algorithm designed originally by Piyavskii and Shubert in 1972.
We prove new bounds on the number of evaluations of the function needed to reach or certify a given optimization accuracy.
arXiv Detail & Related papers (2020-02-06T17:31:27Z)