Minimisation of Polyak-Łojasiewicz Functions Using Random Zeroth-Order Oracles
- URL: http://arxiv.org/abs/2405.09106v1
- Date: Wed, 15 May 2024 05:43:16 GMT
- Title: Minimisation of Polyak-Łojasiewicz Functions Using Random Zeroth-Order Oracles
- Authors: Amir Ali Farzin, Iman Shames
- Abstract summary: The framework is based on exploiting a random oracle to estimate the gradient of the function.
The convergence of the algorithm to a global minimum in the unconstrained case and to a neighbourhood of the global minimum in the constrained case is presented.
- Score: 1.5101132008238316
- License: http://creativecommons.org/licenses/by-sa/4.0/
- Abstract: The application of a zeroth-order scheme for minimising Polyak-Łojasiewicz (PL) functions is considered. The framework is based on exploiting a random oracle to estimate the function gradient. The convergence of the algorithm to a global minimum in the unconstrained case and to a neighbourhood of the global minimum in the constrained case, along with the corresponding complexity bounds, is presented. The theoretical results are demonstrated via numerical examples.
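As a rough illustration of the kind of scheme described above, the sketch below pairs a random-direction two-point gradient estimate (Gaussian smoothing) with plain gradient descent on a standard PL-but-nonconvex test function. The oracle, step size, smoothing radius and test function are illustrative assumptions, not the construction or parameters analysed in the paper.

```python
# A minimal sketch, assuming a Gaussian-smoothing two-point oracle, a constant
# step size and a simple 1-D test function; none of these are the exact
# scheme or parameters analysed in the paper.
import numpy as np

def zo_gradient(f, x, mu, rng):
    """Random-oracle gradient estimate: draw u ~ N(0, I) and return
    ((f(x + mu*u) - f(x)) / mu) * u, an unbiased estimate of the gradient
    of the Gaussian-smoothed version of f."""
    u = rng.standard_normal(x.shape)
    return (f(x + mu * u) - f(x)) / mu * u

def zo_gradient_descent(f, x0, step=1e-2, smoothing=1e-4, iters=5000, seed=0):
    """Plain gradient descent driven by the zeroth-order estimate above."""
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float)
    for _ in range(iters):
        x = x - step * zo_gradient(f, x, smoothing, rng)
    return x

if __name__ == "__main__":
    # Classic PL-but-nonconvex test function f(x) = x^2 + 3 sin^2(x):
    # it satisfies (1/2)*||grad f(x)||^2 >= c*(f(x) - f*) for some c > 0, f* = 0.
    f = lambda x: float(x[0] ** 2 + 3.0 * np.sin(x[0]) ** 2)
    x_hat = zo_gradient_descent(f, x0=[2.5])
    print(x_hat, f(x_hat))  # both should be close to 0
```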
Related papers
- Stochastic Zeroth-Order Optimization under Strongly Convexity and Lipschitz Hessian: Minimax Sample Complexity [59.75300530380427]
We consider the problem of optimizing second-order smooth and strongly convex functions where the algorithm has access only to noisy evaluations of the objective function it queries.
We provide the first tight characterization of the rate of the minimax simple regret by developing matching upper and lower bounds.
arXiv Detail & Related papers (2024-06-28T02:56:22Z) - A Unified Analysis on the Subgradient Upper Bounds for the Subgradient Methods Minimizing Composite Nonconvex, Nonsmooth and Non-Lipschitz Functions [7.972544890243396]
This paper presents a unified analysis for proximal subgradient (Prox-SubGrad) type methods.
We are able to relate error-bound conditions, the growth conditions of the subgradients of the objective, and the behavior of the proximal subgradient iterates on some remarkably broad classes of objective functions.
arXiv Detail & Related papers (2023-08-30T23:34:11Z) - Stochastic Inexact Augmented Lagrangian Method for Nonconvex Expectation Constrained Optimization [88.0031283949404]
Many real-world problems have complicated nonconvex functional constraints and use a large number of data points.
Our proposed method outperforms an existing method with the previously best-known result.
arXiv Detail & Related papers (2022-12-19T14:48:54Z) - Optimal Algorithms for Stochastic Complementary Composite Minimization [55.26935605535377]
Inspired by regularization techniques in statistics and machine learning, we study complementary composite minimization.
We provide novel excess risk bounds, both in expectation and with high probability.
Our algorithms are nearly optimal, which we prove via novel lower complexity bounds for this class of problems.
arXiv Detail & Related papers (2022-11-03T12:40:24Z) - On efficient algorithms for computing near-best polynomial approximations to high-dimensional, Hilbert-valued functions from limited samples [1.0650780147044159]
Sparse approximation has become indispensable for approximating smooth, high-dimensional functions from limited samples.
This paper introduces algorithms and theoretical guarantees that assert exponential or algebraic rates of convergence, along with robustness to sampling, algorithmic and physical discretization errors.
arXiv Detail & Related papers (2022-03-25T20:56:07Z) - Complexity of Inexact Proximal Point Algorithm for minimizing convex functions with Holderian Growth [1.9643748953805935]
We derive nonasymptotic complexity of exact and inexact PPA to minimize convex functions under $\gamma$-Holderian growth.
Our numerical tests show improvements over existing restarting versions of the Subgradient Method.
arXiv Detail & Related papers (2021-08-10T07:15:07Z) - Finding Global Minima via Kernel Approximations [90.42048080064849]
We consider the global minimization of smooth functions based solely on function evaluations.
In this paper, we consider an approach that jointly models the function to approximate and finds a global minimum.
arXiv Detail & Related papers (2020-12-22T12:59:30Z) - Conditional gradient methods for stochastically constrained convex minimization [54.53786593679331]
We propose two novel conditional gradient-based methods for solving structured convex optimization problems.
The most important feature of our framework is that only a subset of the constraints is processed at each iteration.
Our algorithms rely on variance reduction and smoothing used in conjunction with conditional gradient steps, and are accompanied by rigorous convergence guarantees.
arXiv Detail & Related papers (2020-07-07T21:26:35Z) - Exploiting Higher Order Smoothness in Derivative-free Optimization and Continuous Bandits [99.70167985955352]
We study the problem of zero-order optimization of a strongly convex function.
We consider a randomized approximation of the projected gradient descent algorithm.
Our results imply that the zero-order algorithm is nearly optimal in terms of sample complexity and the problem parameters.
arXiv Detail & Related papers (2020-06-14T10:42:23Z)
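The last entry above describes a randomized approximation of projected gradient descent driven only by function values. As a hypothetical sketch of that generic idea (not the kernel-based estimator or the step-size schedule analysed in that paper), a two-point sphere-sampling gradient estimate can be paired with a Euclidean projection step:

```python
# A hypothetical minimal sketch: randomized two-point gradient estimate plus
# projected gradient descent. The estimator, step sizes, smoothing radii and
# constraint set are assumptions for illustration only.
import numpy as np

def sphere_gradient(f, x, h, rng):
    """Two-point estimate (d/(2h)) * (f(x + h*z) - f(x - h*z)) * z,
    with z drawn uniformly from the unit sphere."""
    z = rng.standard_normal(x.size)
    z /= np.linalg.norm(z)
    return x.size * (f(x + h * z) - f(x - h * z)) / (2.0 * h) * z

def project_ball(x, radius):
    """Euclidean projection onto {x : ||x|| <= radius}."""
    n = np.linalg.norm(x)
    return x if n <= radius else (radius / n) * x

def zo_projected_descent(f, x0, radius, iters=2000, seed=0):
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float)
    for t in range(1, iters + 1):
        g = sphere_gradient(f, x, h=1.0 / np.sqrt(t), rng=rng)  # shrinking smoothing radius
        x = project_ball(x - g / t, radius)                      # 1/t step, then project
    return x

if __name__ == "__main__":
    # Strongly convex quadratic whose unconstrained minimiser lies outside the
    # ball of radius 2, so the projection step is active at the solution.
    target = np.array([3.0, -3.0])
    f = lambda x: float(np.sum((x - target) ** 2))
    print(zo_projected_descent(f, x0=[0.0, 0.0], radius=2.0))
    # expected: roughly (sqrt(2), -sqrt(2)), the closest feasible point to `target`
```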
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences arising from its use.