Quantum Hamiltonian Descent for Non-smooth Optimization
- URL: http://arxiv.org/abs/2503.15878v1
- Date: Thu, 20 Mar 2025 06:02:33 GMT
- Title: Quantum Hamiltonian Descent for Non-smooth Optimization
- Authors: Jiaqi Leng, Yufan Zheng, Zhiyuan Jia, Chaoyue Zhao, Yuxiang Peng, Xiaodi Wu
- Abstract summary: We investigate how quantum mechanics can be leveraged to overcome limitations of classical algorithms. We establish global convergence and convergence rates for non-smooth convex and strongly convex problems through a novel Lyapunov function design. We also propose discrete-time QHD, a fully digitized implementation of QHD via operator splitting.
- Score: 7.773836776652785
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Non-smooth optimization models play a fundamental role in various disciplines, including engineering, science, management, and finance. However, classical algorithms for solving such models often struggle with convergence speed, scalability, and parameter tuning, particularly in high-dimensional and non-convex settings. In this paper, we explore how quantum mechanics can be leveraged to overcome these limitations. Specifically, we investigate the theoretical properties of the Quantum Hamiltonian Descent (QHD) algorithm for non-smooth optimization in both continuous and discrete time. First, we propose continuous-time variants of the general QHD algorithm and establish their global convergence and convergence rate for non-smooth convex and strongly convex problems through a novel Lyapunov function design. Furthermore, we prove the finite-time global convergence of continuous-time QHD for non-smooth non-convex problems under mild conditions (i.e., locally Lipschitz). In addition, we propose discrete-time QHD, a fully digitized implementation of QHD via operator splitting (i.e., product formula). We find that discrete-time QHD exhibits similar convergence properties even with large time steps. Finally, numerical experiments validate our theoretical findings and demonstrate the computational advantages of QHD over classical non-smooth non-convex optimization algorithms.
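As a concrete illustration of the operator-splitting (product formula) idea behind discrete-time QHD, the sketch below classically simulates a QHD-style Schrödinger evolution on a 1D grid for the non-smooth objective f(x) = |x| using a split-step Fourier scheme. The schematic Hamiltonian H(t) = a(t)(-1/2 Δ) + b(t) f(x), the schedules a(t) and b(t), and all discretization parameters are illustrative assumptions, not the paper's exact construction.

```python
import numpy as np

# Minimal classical simulation sketch of discrete-time QHD via operator
# splitting (Strang product formula) for the 1D non-smooth objective
# f(x) = |x|. The Hamiltonian form H(t) = a(t) * (-1/2 d^2/dx^2) + b(t) * f(x),
# the schedules a, b, the grid, and the step size are illustrative assumptions.

L, N, dt, steps = 20.0, 1024, 0.01, 1000        # domain half-width, grid points, time step, number of steps
x = np.linspace(-L, L, N, endpoint=False)
dx = x[1] - x[0]
k = 2.0 * np.pi * np.fft.fftfreq(N, d=dx)       # angular wavenumbers for the kinetic term
f = np.abs(x)                                   # non-smooth convex objective

psi = np.exp(-x**2).astype(complex)             # broad initial wave packet
psi /= np.linalg.norm(psi)

a = lambda t: 1.0 / (1.0 + t) ** 2              # kinetic weight decays over time
b = lambda t: (1.0 + t) ** 2                    # potential weight grows over time

for n in range(steps):
    t = n * dt
    # Strang splitting: half potential step, full kinetic step, half potential step.
    psi *= np.exp(-0.5j * dt * b(t) * f)
    psi = np.fft.ifft(np.exp(-0.5j * dt * a(t) * k**2) * np.fft.fft(psi))
    psi *= np.exp(-0.5j * dt * b(t) * f)

prob = np.abs(psi) ** 2
prob /= prob.sum()
print("final expected objective E[f]:", float(np.sum(prob * f)))  # concentrates near the minimizer x = 0
```

As b(t) grows relative to a(t), the probability mass of the wave packet should concentrate near the minimizer x = 0; on a quantum device the same product formula would be realized by alternating potential- and kinetic-phase circuits rather than FFTs.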
Related papers
- A Truncated Newton Method for Optimal Transport [13.848861021326755]
We introduce a specialized truncated Newton algorithm for entropic-regularized optimal transport (OT) solvers.
Our algorithm exhibits exceptionally favorable runtime performance, achieving high precision orders of magnitude faster than many existing alternatives.
The scalability of the algorithm is showcased on an extremely large OT problem with $n \approx 10^6$, solved approximately under weak entropic regularization (a baseline entropic-OT sketch appears after this list).
arXiv Detail & Related papers (2025-04-02T19:00:24Z) - On Speedups for Convex Optimization via Quantum Dynamics [2.5094874597551913]
We explore the potential for quantum speedups in convex optimization using discrete simulations of the Quantum Hamiltonian Descent (QHD) framework.
In continuous time, we demonstrate that QHD, with suitable parameters, can achieve arbitrarily fast convergence rates.
We show that QHD offers a super-quadratic query advantage over all known classical algorithms that tolerate a comparable level of evaluation noise.
arXiv Detail & Related papers (2025-03-31T17:21:12Z) - Quantum Langevin Dynamics for Optimization [14.447963674485132]
We utilize Quantum Langevin Dynamics (QLD) to solve optimization problems. Specifically, we examine the dynamics of a system coupled with an infinite heat bath. We demonstrate that the average energy of the system can approach zero in the low temperature limit.
arXiv Detail & Related papers (2023-11-27T07:25:47Z) - Stochastic Optimization for Non-convex Problem with Inexact Hessian Matrix, Gradient, and Function [99.31457740916815]
Trust-region (TR) methods and adaptive regularization using cubics (ARC) have proven to have very appealing theoretical properties.
We show that TR and ARC methods can simultaneously allow inexact computations of the Hessian, gradient, and function values.
arXiv Detail & Related papers (2023-10-18T10:29:58Z) - Quantum Hamiltonian Descent [8.580250279996985]
Quantum Hamiltonian Descent (QHD) is a truly quantum counterpart of gradient descent algorithms.
QHD is described as a Hamiltonian evolution simulatable on both digital and analog quantum computers.
arXiv Detail & Related papers (2023-03-02T18:34:38Z) - PAPAL: A Provable PArticle-based Primal-Dual ALgorithm for Mixed Nash Equilibrium [58.26573117273626]
We consider two-player zero-sum continuous games with non-convex non-concave objective functions and study the computation of their mixed Nash equilibria.
We present novel insights into particle-based algorithms for continuous distribution strategies.
arXiv Detail & Related papers (2023-03-02T05:08:15Z) - Complexity-Theoretic Limitations on Quantum Algorithms for Topological Data Analysis [59.545114016224254]
Quantum algorithms for topological data analysis seem to provide an exponential advantage over the best classical approach.
We show that the central task of TDA -- estimating Betti numbers -- is intractable even for quantum computers.
We argue that an exponential quantum advantage can be recovered if the input data is given as a specification of simplices.
arXiv Detail & Related papers (2022-09-28T17:53:25Z) - On constant-time quantum annealing and guaranteed approximations for graph optimization problems [0.0]
Quantum Annealing (QA) is a computational framework where a quantum system's continuous evolution is used to find the global minimum of an objective function.
In this paper we use a technique borrowed from theoretical physics, the Lieb-Robinson (LR) bound, and develop new tools proving that short, constant-time quantum annealing guarantees constant-factor approximation ratios.
Our results are of similar flavor to the well-known ones obtained in the different but related QAOA (Quantum Approximate Optimization Algorithm) framework.
arXiv Detail & Related papers (2022-02-03T15:23:18Z) - Accelerating variational quantum algorithms with multiple quantum processors [78.36566711543476]
Variational quantum algorithms (VQAs) have the potential to utilize near-term quantum machines to gain certain computational advantages.
Modern VQAs suffer from cumbersome computational overhead because they traditionally employ a single quantum processor to handle large volumes of data.
Here we devise an efficient distributed optimization scheme, called QUDIO, to address this issue.
arXiv Detail & Related papers (2021-06-24T08:18:42Z) - Recent Theoretical Advances in Non-Convex Optimization [56.88981258425256]
Motivated by recent increased interest in the analysis of optimization algorithms for non-convex optimization in deep networks and other problems in data science, we give an overview of recent theoretical results on optimization algorithms for non-convex optimization.
arXiv Detail & Related papers (2020-12-11T08:28:51Z) - Fast Objective & Duality Gap Convergence for Non-Convex Strongly-Concave Min-Max Problems with PL Condition [52.08417569774822]
This paper focuses on methods for solving smooth non-convex strongly-concave min-max problems, which have received increasing attention due to applications in deep learning (e.g., deep AUC maximization).
arXiv Detail & Related papers (2020-06-12T00:32:21Z)
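For the entropic-regularized optimal transport entry above (A Truncated Newton Method for Optimal Transport), the baseline sketch below solves a small entropic OT problem with plain Sinkhorn iterations rather than the paper's truncated Newton method; the random point clouds, marginals, and regularization strength are illustrative assumptions.

```python
import numpy as np

# Baseline sketch: entropic-regularized optimal transport solved with plain
# Sinkhorn iterations. This is NOT the paper's truncated Newton method; the
# data, marginals, and regularization strength below are illustrative.

rng = np.random.default_rng(0)
n, m, eps = 200, 200, 0.05

xs = rng.normal(size=(n, 2))                    # source points
xt = rng.normal(size=(m, 2)) + 1.0              # target points (shifted)
a = np.full(n, 1.0 / n)                         # uniform source marginal
b = np.full(m, 1.0 / m)                         # uniform target marginal

C = ((xs[:, None, :] - xt[None, :, :]) ** 2).sum(-1)
C /= C.max()                                    # normalize cost to [0, 1] for numerical stability
K = np.exp(-C / eps)                            # Gibbs kernel

u, v = np.ones(n), np.ones(m)
for _ in range(500):                            # Sinkhorn fixed-point updates
    u = a / (K @ v)
    v = b / (K.T @ u)

P = u[:, None] * K * v[None, :]                 # approximate optimal transport plan
print("transport cost:", float((P * C).sum()))
print("row-marginal error:", float(np.abs(P.sum(axis=1) - a).max()))
```

A truncated Newton solver targets the same entropic objective but exploits (approximate) second-order information to reach high precision in far fewer iterations than this first-order fixed-point scheme.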