Quantum Lower Bounds for Finding Stationary Points of Nonconvex
Functions
- URL: http://arxiv.org/abs/2212.03906v2
- Date: Sun, 4 Jun 2023 18:33:54 GMT
- Title: Quantum Lower Bounds for Finding Stationary Points of Nonconvex
Functions
- Authors: Chenyi Zhang, Tongyang Li
- Abstract summary: Despite progress in classical lower bounds for nonconvex optimization, quantum
lower bounds for nonconvex optimization are still widely open. The query lower bound is
$\Omega\big(\epsilon^{-\frac{1+p}{p}}\big)$ regarding the first setting,
$\Omega(\epsilon^{-4})$ regarding the second setting, and
$\Omega(\epsilon^{-3})$ if the stochastic gradient function is mean-squared smooth;
this paper shows that these bounds also hold for quantum queries.
- Score: 8.495749905129278
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Quantum algorithms for optimization problems are of general interest. Despite
recent progress in classical lower bounds for nonconvex optimization under
different settings and quantum lower bounds for convex optimization, quantum
lower bounds for nonconvex optimization are still widely open. In this paper,
we conduct a systematic study of quantum query lower bounds on finding
$\epsilon$-approximate stationary points of nonconvex functions, and we
consider the following two important settings: 1) having access to $p$-th order
derivatives; or 2) having access to stochastic gradients. The classical query
lower bounds are $\Omega\big(\epsilon^{-\frac{1+p}{p}}\big)$ regarding the first
setting, and $\Omega(\epsilon^{-4})$ regarding the second setting (or
$\Omega(\epsilon^{-3})$ if the stochastic gradient function is mean-squared
smooth). In this paper, we extend all these classical lower bounds to the
quantum setting. They match the classical algorithmic results respectively,
demonstrating that there is no quantum speedup for finding
$\epsilon$-stationary points of nonconvex functions with $p$-th order
derivative inputs or stochastic gradient inputs, whether with or without the
mean-squared smoothness assumption. Technically, our quantum lower bounds are
obtained by showing that the sequential nature of classical hard instances in
all these settings also applies to quantum queries, preventing any quantum
speedup other than revealing information of the stationary points sequentially.
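For concreteness, the quantities in the abstract can be read with the standard definitions (paraphrased here, not quoted from the paper): a point $x$ is an $\epsilon$-approximate stationary point of a differentiable $f$ if $\|\nabla f(x)\| \le \epsilon$, and a stochastic gradient oracle $g(x;\xi)$ with $\mathbb{E}_\xi[g(x;\xi)] = \nabla f(x)$ is mean-squared smooth if $\mathbb{E}_\xi\|g(x;\xi) - g(y;\xi)\|^2 \le \bar{L}^2\|x - y\|^2$ for all $x, y$. With these conventions, the matching classical and quantum query lower bounds read
\[
% rates as stated in the abstract above
\Omega\big(\epsilon^{-\frac{1+p}{p}}\big) \text{ ($p$-th order oracle)}, \quad
\Omega\big(\epsilon^{-4}\big) \text{ (stochastic gradients)}, \quad
\Omega\big(\epsilon^{-3}\big) \text{ (mean-squared smooth stochastic gradients)}.
\]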
Related papers
- Quantum Algorithms for Non-smooth Non-convex Optimization [30.576546266390714]
This paper considers the problem of finding a $(\delta,\epsilon)$-Goldstein stationary point (standard definition recalled in the note after this list) of a Lipschitz continuous objective.
We construct a zeroth-order quantum estimator for the surrogate oracle function.
arXiv Detail & Related papers (2024-10-21T16:52:26Z) - Towards large-scale quantum optimization solvers with few qubits [59.63282173947468]
We introduce a variational quantum solver for optimization problems over $m = \mathcal{O}(n^k)$ binary variables using only $n$ qubits, with tunable $k > 1$.
We analytically prove that the specific qubit-efficient encoding brings in a super-polynomial mitigation of barren plateaus as a built-in feature.
arXiv Detail & Related papers (2024-01-17T18:59:38Z) - A Newton-CG based barrier-augmented Lagrangian method for general nonconvex conic optimization [53.044526424637866]
In this paper we consider finding an approximate second-order stationary point (SOSP) of a general nonconvex conic optimization problem, which minimizes a twice differentiable function subject to conic constraints.
In particular, we propose a Newton-CG based barrier-augmented Lagrangian method for finding an approximate SOSP.
arXiv Detail & Related papers (2023-01-10T20:43:29Z) - Robustness of Quantum Algorithms for Nonconvex Optimization [7.191453718557392]
We show that quantum algorithms can find an $\epsilon$-SOSP with a poly-logarithmic, polynomial, or exponential number of queries in $d$, and we characterize the domains in which each of these query complexities arises.
arXiv Detail & Related papers (2022-12-05T19:10:32Z) - Quantum Speedups of Optimizing Approximately Convex Functions with
Applications to Logarithmic Regret Stochastic Convex Bandits [8.682187438614296]
A quantum algorithm finds an $x^* \in \mathcal{K}$ such that $F(x^*) - \min_{x \in \mathcal{K}} F(x) \leq \epsilon$ using $\tilde{O}(n^3)$ quantum evaluation queries to $F$.
As an application, we give a quantum algorithm for zeroth-order stochastic convex bandits with $\tilde{O}(n^5 \log^2 T)$ regret, an exponential speedup in $T$ compared to the classical $\Omega(\sqrt{T})$ regret lower bound.
arXiv Detail & Related papers (2022-09-26T03:19:40Z) - Optimal Extragradient-Based Bilinearly-Coupled Saddle-Point Optimization [116.89941263390769]
We consider the smooth convex-concave bilinearly-coupled saddle-point problem, $\min_{\mathbf{x}}\max_{\mathbf{y}}\, F(\mathbf{x}) + H(\mathbf{x},\mathbf{y}) - G(\mathbf{y})$, where one has access to first-order oracles for $F$, $G$ as well as the bilinear coupling function $H$.
We present an accelerated gradient-extragradient (AG-EG) descent-ascent algorithm that combines extragradient steps with accelerated gradient steps.
arXiv Detail & Related papers (2022-06-17T06:10:20Z) - The Quantum and Classical Streaming Complexity of Quantum and Classical
Max-Cut [0.07614628596146598]
We investigate the space complexity of two graph streaming problems: Max-Cut and its quantum analogue, Quantum Max-Cut.
Our work resolves the quantum and classical approximability of quantum and classical Max-Cut using $o(n)$ space.
arXiv Detail & Related papers (2022-06-01T03:40:56Z) - No quantum speedup over gradient descent for non-smooth convex
optimization [22.16973542453584]
We are given black-box access to a (not necessarily smooth) function $f:\mathbb{R}^n \to \mathbb{R}$ and its (sub)gradient.
Our goal is to find an $\epsilon$-approximate minimum of $f$ starting from a point that is distance at most $R$ from the true minimum.
We show that although the function family used in the lower bound is hard for randomized algorithms, it can be solved using $O(GR/\epsilon)$ quantum queries.
arXiv Detail & Related papers (2020-10-05T06:32:47Z) - Second-Order Information in Non-Convex Stochastic Optimization: Power
and Limitations [54.42518331209581]
We design an algorithm which finds an $\epsilon$-approximate stationary point (with $\|\nabla F(x)\| \le \epsilon$) using stochastic gradient and Hessian-vector product queries, and we also bound the oracle complexity of finding $(\epsilon,\gamma)$-approximate second-order stationary points.
Our lower bounds here are novel even in the noiseless case.
arXiv Detail & Related papers (2020-06-24T04:41:43Z) - Quantum Gram-Schmidt Processes and Their Application to Efficient State
Read-out for Quantum Algorithms [87.04438831673063]
We present an efficient read-out protocol that yields the classical vector form of the generated state.
Our protocol suits the case that the output state lies in the row space of the input matrix.
One of our technical tools is an efficient quantum algorithm for performing the Gram-Schmidt orthonormalization procedure.
arXiv Detail & Related papers (2020-04-14T11:05:26Z) - Complexity of Finding Stationary Points of Nonsmooth Nonconvex Functions [84.49087114959872]
We provide the first non-asymptotic analysis for finding stationary points of nonsmooth, nonconvex functions.
In particular, we study Hadamard semi-differentiable functions, perhaps the largest class of nonsmooth functions.
arXiv Detail & Related papers (2020-02-10T23:23:04Z)
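A note on the stationarity notions appearing in this list (standard definitions from the optimization literature, paraphrased rather than quoted from the papers above): for a Lipschitz, possibly nonsmooth $f$, a point $x$ is a $(\delta,\epsilon)$-Goldstein stationary point if $\mathrm{dist}\big(0, \partial_\delta f(x)\big) \le \epsilon$, where $\partial_\delta f(x) = \mathrm{conv}\{\partial f(y) : \|y - x\| \le \delta\}$ is the Goldstein $\delta$-subdifferential; for a twice-differentiable $f$, an $(\epsilon,\gamma)$-approximate second-order stationary point (SOSP) satisfies
\[
% standard two-parameter SOSP condition; the one-parameter convention takes \gamma = \sqrt{\rho\epsilon} for a \rho-Lipschitz Hessian
\|\nabla f(x)\| \le \epsilon \quad \text{and} \quad \lambda_{\min}\big(\nabla^2 f(x)\big) \ge -\gamma.
\]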