Quantum speedups of some general-purpose numerical optimisation
algorithms
- URL: http://arxiv.org/abs/2004.06521v1
- Date: Tue, 14 Apr 2020 14:04:47 GMT
- Title: Quantum speedups of some general-purpose numerical optimisation
algorithms
- Authors: Cezar-Mihail Alexandru, Ella Bridgett-Tomkinson, Noah Linden, Joseph
MacManus, Ashley Montanaro and Hannah Morris
- Abstract summary: We show that many techniques for global optimisation under a Lipschitz constraint can be accelerated near-quadratically.
Second, we show that backtracking line search, an ingredient in quasi-Newton optimisation algorithms, can be accelerated up to quadratically.
Third, we show that a component of the Nelder-Mead algorithm can be accelerated by up to a multiplicative factor of $O(\sqrt{n})$.
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: We give quantum speedups of several general-purpose numerical optimisation
methods for minimising a function $f:\mathbb{R}^n \to \mathbb{R}$. First, we
show that many techniques for global optimisation under a Lipschitz constraint
can be accelerated near-quadratically. Second, we show that backtracking line
search, an ingredient in quasi-Newton optimisation algorithms, can be
accelerated up to quadratically. Third, we show that a component of the
Nelder-Mead algorithm can be accelerated by up to a multiplicative factor of
$O(\sqrt{n})$. Fourth, we show that a quantum gradient computation algorithm of
Gily\'en et al. can be used to approximately compute gradients in the framework
of stochastic gradient descent. In each case, our results are based on applying
existing quantum algorithms to accelerate specific components of the classical
algorithms, rather than developing new quantum techniques.
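As a concrete reference point, the backtracking line search that the paper accelerates can be sketched classically as follows. This is a minimal illustrative implementation of the standard Armijo rule, not the paper's quantum subroutine; the function names and constants are our own.

```python
import numpy as np

def backtracking_line_search(f, grad_f, x, direction, alpha0=1.0, rho=0.5, c=1e-4):
    """Armijo backtracking: shrink the step until sufficient decrease holds.

    Classically each trial step costs one function evaluation; the paper
    shows the search over candidate steps can be accelerated up to
    quadratically on a quantum computer.
    """
    alpha = alpha0
    fx = f(x)
    slope = c * (grad_f(x) @ direction)  # required decrease per unit step
    while f(x + alpha * direction) > fx + alpha * slope:
        alpha *= rho  # step rejected: backtrack geometrically
    return alpha

# Usage: one line search for f(x) = ||x||^2 along the steepest descent direction.
f = lambda x: float(x @ x)
grad = lambda x: 2 * x
x = np.array([1.0, -2.0])
alpha = backtracking_line_search(f, grad, x, direction=-grad(x))
```

In a quasi-Newton method this search runs once per iteration, so even a quadratic speedup of the inner loop is significant.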
Related papers
- Denoising Gradient Descent in Variational Quantum Algorithms [0.0]
We introduce an algorithm for mitigating the adverse effects of noise on gradient descent in variational quantum algorithms.
We empirically demonstrate the advantages offered by our algorithm on randomized parametrized quantum circuits.
arXiv Detail & Related papers (2024-03-06T16:15:25Z)
- Pure Quantum Gradient Descent Algorithm and Full Quantum Variational Eigensolver [0.7149735232319818]
Gradient descent is a widely adopted gradient-based optimization method.
We propose a novel quantum-based gradient calculation method that requires only a single oracle calculation.
We successfully implemented the quantum gradient descent algorithm and applied it to the Variational Quantum Eigensolver (VQE)
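For context, the classical gradient descent loop that such quantum variants build on can be sketched as follows; this is an illustrative baseline only, not the paper's quantum algorithm.

```python
import numpy as np

def gradient_descent(grad, x0, lr=0.1, steps=100):
    # Plain first-order descent: x <- x - lr * grad(x).
    x = np.asarray(x0, dtype=float)
    for _ in range(steps):
        x = x - lr * grad(x)
    return x

# Minimise f(x) = (x - 3)^2; the unique minimiser is x = 3.
x_min = gradient_descent(lambda x: 2 * (x - 3), x0=[0.0])
```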
arXiv Detail & Related papers (2023-05-07T05:52:41Z)
- Accelerated First-Order Optimization under Nonlinear Constraints [73.2273449996098]
We exploit analogies between first-order algorithms for constrained optimization and non-smooth dynamical systems to design a new class of accelerated first-order algorithms.
An important property of these algorithms is that constraints are expressed in terms of velocities instead of positions.
arXiv Detail & Related papers (2023-02-01T08:50:48Z)
- Fast Computation of Optimal Transport via Entropy-Regularized Extragradient Methods [75.34939761152587]
Efficient computation of the optimal transport distance between two distributions serves as an algorithmic subroutine that empowers various applications.
This paper develops a scalable first-order optimization-based method that computes optimal transport to within $\varepsilon$ additive accuracy.
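The standard classical baseline for entropy-regularised optimal transport is the Sinkhorn iteration, sketched below. Note the paper above uses an extragradient method, so this sketch is only a point of comparison; the toy instance is our own.

```python
import numpy as np

def sinkhorn(C, a, b, eps=0.1, iters=500):
    """Entropy-regularised OT: alternately rescale rows and columns of the
    Gibbs kernel so the transport plan matches the two marginals."""
    K = np.exp(-C / eps)              # Gibbs kernel exp(-C/eps)
    u = np.ones_like(a)
    for _ in range(iters):
        v = b / (K.T @ u)             # match column marginals
        u = a / (K @ v)               # match row marginals
    P = u[:, None] * K * v[None, :]   # resulting transport plan
    return P

# Two uniform 2-point distributions with unit cost off the diagonal.
C = np.array([[0.0, 1.0], [1.0, 0.0]])
a = np.array([0.5, 0.5]); b = np.array([0.5, 0.5])
P = sinkhorn(C, a, b)
```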
arXiv Detail & Related papers (2023-01-30T15:46:39Z)
- Quantum speedup of leverage score sampling and its application [0.0]
In this paper, we propose a quantum algorithm to accelerate the computation of leverage scores.
As an application, we propose a new quantum algorithm for ridge regression problems with vector solution outputs.
arXiv Detail & Related papers (2023-01-15T14:40:18Z)
- Gradient-Free optimization algorithm for single-qubit quantum classifier [0.3314882635954752]
A gradient-free optimization algorithm is proposed to overcome the effects of barren plateaus caused by noisy quantum devices.
The proposed algorithm is demonstrated on a classification task and compared with the Adam optimizer.
The proposed gradient-free algorithm reaches high accuracy faster than Adam.
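A minimal classical example of the gradient-free idea is random local search, sketched below. This is illustrative only; the paper's algorithm is different and tailored to quantum classifiers, and the test function here is our own.

```python
import numpy as np

def random_search(f, x0, step=0.5, iters=2000, seed=0):
    # Gradient-free local search: keep a random perturbation only if it
    # improves f; shrink the step size slowly to refine the estimate.
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float)
    fx = f(x)
    for _ in range(iters):
        cand = x + step * rng.standard_normal(x.shape)
        fc = f(cand)
        if fc < fx:
            x, fx = cand, fc
        step *= 0.998  # geometric step decay
    return x, fx

# Minimise a simple quadratic with minimiser (1, -1).
f = lambda x: (x[0] - 1) ** 2 + (x[1] + 1) ** 2
x_best, f_best = random_search(f, np.zeros(2))
```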
arXiv Detail & Related papers (2022-05-10T08:45:03Z)
- Quantum algorithm for stochastic optimal stopping problems with applications in finance [60.54699116238087]
The famous least squares Monte Carlo (LSM) algorithm combines linear least square regression with Monte Carlo simulation to approximately solve problems in optimal stopping theory.
We propose a quantum LSM based on quantum access to a stochastic process, on quantum circuits for computing the optimal stopping times, and on quantum techniques for Monte Carlo.
arXiv Detail & Related papers (2021-11-30T12:21:41Z)
- Quadratic Unconstrained Binary Optimisation via Quantum-Inspired Annealing [58.720142291102135]
We present a classical algorithm to find approximate solutions to instances of quadratic unconstrained binary optimisation.
We benchmark our approach for large scale problem instances with tuneable hardness and planted solutions.
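Annealing-style QUBO heuristics of this kind can be illustrated with plain simulated annealing; the sketch below uses a generic single-bit-flip Metropolis update and an assumed toy instance, not the paper's quantum-inspired update rule.

```python
import numpy as np

def qubo_anneal(Q, iters=5000, T0=2.0, seed=1):
    """Simulated annealing for min x^T Q x over x in {0,1}^n."""
    rng = np.random.default_rng(seed)
    n = Q.shape[0]
    x = rng.integers(0, 2, n)
    energy = x @ Q @ x
    for k in range(iters):
        T = T0 * (1 - k / iters) + 1e-9      # linear cooling schedule
        i = rng.integers(n)
        y = x.copy(); y[i] ^= 1              # propose a single bit flip
        e_new = y @ Q @ y
        # Metropolis rule: always accept improvements, sometimes accept worse.
        if e_new <= energy or rng.random() < np.exp((energy - e_new) / T):
            x, energy = y, e_new
    return x, energy

# Toy symmetric instance whose optimum is x = (1, 0, 1) with energy -6.
Q = np.array([[-2.0,  1.0, -1.0],
              [ 1.0, -1.0,  1.0],
              [-1.0,  1.0, -2.0]])
x_best, e_best = qubo_anneal(Q)
```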
arXiv Detail & Related papers (2021-08-18T09:26:17Z)
- Variational Quantum Optimization with Multi-Basis Encodings [62.72309460291971]
We introduce a new variational quantum algorithm that benefits from two innovations: multi-basis graph encodings and nonlinear activation functions.
These yield increased optimization performance, smoother effective optimization landscapes and a reduction in measurement requirements.
arXiv Detail & Related papers (2021-06-24T20:16:02Z)
- Provably Faster Algorithms for Bilevel Optimization [54.83583213812667]
Bilevel optimization has been widely applied in many important machine learning applications.
We propose two new algorithms for bilevel optimization.
We show that both algorithms achieve a complexity of $\mathcal{O}(\epsilon^{-1.5})$, which outperforms all existing algorithms by an order of magnitude.
arXiv Detail & Related papers (2021-06-08T21:05:30Z)
- Quantum Gradient Algorithm for General Polynomials [5.008814514502094]
Gradient-based algorithms, popular strategies to optimization problems, are essential for many modern machine-learning techniques.
We propose a quantum gradient algorithm for optimizing general polynomials with the dressed amplitude encoding.
Given its potential in high-dimensional optimization, this quantum algorithm is expected to facilitate gradient-based optimization.
arXiv Detail & Related papers (2020-04-23T11:28:45Z)
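For contrast with such quantum gradient algorithms, the classical central-difference estimator needs $2n$ function evaluations for an $n$-dimensional input; quantum gradient methods (Jordan; Gily\'en et al.) aim to reduce this query count. A minimal sketch:

```python
import numpy as np

def central_difference_grad(f, x, h=1e-5):
    # Classical gradient estimate: 2 function evaluations per coordinate.
    x = np.asarray(x, dtype=float)
    g = np.zeros_like(x)
    for i in range(x.size):
        e = np.zeros_like(x); e[i] = h
        g[i] = (f(x + e) - f(x - e)) / (2 * h)
    return g

# Polynomial f(x, y) = x^2 y + y^3; the exact gradient at (1, 2) is (4, 13).
f = lambda v: v[0] ** 2 * v[1] + v[1] ** 3
g = central_difference_grad(f, [1.0, 2.0])
```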
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.