Intuitive Analysis of the Quantization-based Optimization: From Stochastic and Quantum Mechanical Perspective
- URL: http://arxiv.org/abs/2501.00436v1
- Date: Tue, 31 Dec 2024 13:38:30 GMT
- Title: Intuitive Analysis of the Quantization-based Optimization: From Stochastic and Quantum Mechanical Perspective
- Authors: Jinwuk Seok, Changsik Cho
- Abstract summary: Quantization of an objective function is an effective optimization methodology. We present an intuitive analysis of the technique based on the quantization of an objective function.
- Score: 0.0
- License: http://creativecommons.org/licenses/by-nc-sa/4.0/
- Abstract: In this paper, we present an intuitive analysis of the optimization technique based on the quantization of an objective function. Quantization of an objective function is an effective optimization methodology that decreases the measure of a level set containing several saddle points and local minima and finds the optimal point at the limit level set. To investigate the dynamics of quantization-based optimization, we derive an overdamped Langevin dynamics model from an intuitive analysis to minimize the level set by iterative quantization. We claim that quantization-based optimization incorporates the key quantities of both thermodynamic and quantum mechanical optimization, the core methodologies of global optimization. Furthermore, on the basis of the proposed SDE, we provide a thermodynamic and quantum mechanical analysis with the Witten Laplacian. Simulation results on benchmark functions, which compare performance against nonlinear optimization methods, demonstrate the validity of quantization-based optimization.
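As a concrete illustration of the abstract's two ingredients (quantization of the objective and an overdamped Langevin iteration), here is a minimal sketch. The schedules, step sizes, and the finite-difference gradient of the quantized objective are illustrative assumptions, not the authors' algorithm:

```python
import numpy as np

rng = np.random.default_rng(0)

def rastrigin(x):
    # Standard multimodal benchmark with many local minima.
    return 10 * x.size + np.sum(x**2 - 10 * np.cos(2 * np.pi * x))

def quantize(value, step):
    # Quantize the objective value to a grid of the given resolution.
    # A coarse step merges nearby local minima into one level set;
    # shrinking the step gradually reveals finer structure.
    return step * np.floor(value / step)

def quantized_langevin(f, x0, iters=5000, lr=1e-3, temp=1.0):
    # Overdamped Langevin-style iteration on the quantized objective:
    #   x_{k+1} = x_k - lr * grad_est + sqrt(2 * lr * T_k) * noise
    # The gradient of the quantized objective is estimated by a central
    # finite difference (the quantized map is piecewise flat).
    x = x0.copy()
    for k in range(iters):
        step = 1.0 / (1 + k) ** 0.5   # assumed quantization schedule
        t_k = temp / np.log(2 + k)    # assumed cooling schedule
        h = 1e-2
        grad = np.array([
            (quantize(f(x + h * e), step) - quantize(f(x - h * e), step)) / (2 * h)
            for e in np.eye(x.size)
        ])
        x = x - lr * grad + np.sqrt(2 * lr * t_k) * rng.standard_normal(x.size)
    return x

x_opt = quantized_langevin(rastrigin, x0=rng.uniform(-5, 5, size=2))
print(x_opt, rastrigin(x_opt))
```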
Related papers
- A coherent approach to quantum-classical optimization [0.0]
Hybrid quantum-classical optimization techniques have been shown to allow for the reduction of quantum computational resources.
We identify the coherence entropy as a crucial metric in determining the suitability of quantum states.
We propose a quantum-classical optimization protocol that significantly improves on previous approaches for such tasks.
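The summary names coherence entropy as the key metric; one standard reading is the relative entropy of coherence, C(rho) = S(diag(rho)) - S(rho). A minimal sketch under that assumption (the paper may define the quantity differently):

```python
import numpy as np

def von_neumann_entropy(rho):
    # S(rho) = -Tr(rho log rho), computed from the eigenvalues.
    eigvals = np.linalg.eigvalsh(rho)
    eigvals = eigvals[eigvals > 1e-12]
    return float(-np.sum(eigvals * np.log2(eigvals)))

def relative_entropy_of_coherence(rho):
    # C(rho) = S(diag(rho)) - S(rho), where diag(rho) is the state
    # dephased in the computational basis.
    dephased = np.diag(np.diag(rho))
    return von_neumann_entropy(dephased) - von_neumann_entropy(rho)

# Example: the |+> state is maximally coherent for a single qubit.
plus = np.array([[0.5, 0.5], [0.5, 0.5]])
print(relative_entropy_of_coherence(plus))  # ~1.0 bit
```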
arXiv Detail & Related papers (2024-09-20T22:22:53Z)
- Quantization-based Optimization with Perspective of Quantum Mechanics [0.0]
We provide an analysis of quantization-based optimization based on the Schrödinger equation.
We show that the tunneling effect derived from the Schrödinger equation in quantization-based optimization enables escape from a local minimum.
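Schematically, the connection treats the objective f(x) as the potential in the stationary Schrödinger equation; a barrier between a local and the global minimum then admits a nonzero WKB tunneling probability. These are standard textbook forms, not necessarily the paper's exact operators:

```latex
% Standard stationary Schrodinger equation with the objective f(x) as potential:
\[
  -\frac{\hbar^2}{2m}\,\psi''(x) + f(x)\,\psi(x) = E\,\psi(x)
\]
% A state trapped at a local minimum tunnels through a barrier between
% classical turning points a and b with WKB probability
\[
  P_{\text{tunnel}} \approx \exp\!\left(-\frac{2}{\hbar}\int_a^b
      \sqrt{2m\,\bigl(f(x) - E\bigr)}\,\mathrm{d}x\right).
\]
```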
arXiv Detail & Related papers (2023-08-20T05:03:31Z)
- Tensor train optimization of parametrized quantum circuits [0.0]
We consider parametrized quantum circuits composed of a low-depth hardware-efficient ansatz and a Hamiltonian variational ansatz.
We discuss the advantage of using tensor-train-based optimization, especially in the presence of noise.
arXiv Detail & Related papers (2023-06-03T06:50:00Z)
- Nesterov Meets Optimism: Rate-Optimal Separable Minimax Optimization [108.35402316802765]
We propose a new first-order optimization algorithm -- Accelerated Gradient-Optimistic Gradient (AG-OG) Descent Ascent.
We show that AG-OG achieves the optimal convergence rate (up to a constant) for a variety of settings.
We further extend our algorithm to the stochastic setting and achieve the optimal convergence rate in both bi-SC-SC and bi-C-SC settings.
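For context, the optimistic-gradient building block of AG-OG can be sketched as follows: plain optimistic gradient descent-ascent on a toy bilinear saddle. AG-OG additionally interleaves Nesterov acceleration, which this sketch omits:

```python
def optimistic_gda(grad_x, grad_y, x, y, lr=0.1, iters=200):
    # Optimistic gradient descent-ascent: each step extrapolates with the
    # previous gradient, which stabilizes bilinear saddle dynamics where
    # plain simultaneous gradient descent-ascent diverges.
    gx_prev, gy_prev = grad_x(x, y), grad_y(x, y)
    for _ in range(iters):
        gx, gy = grad_x(x, y), grad_y(x, y)
        x = x - lr * (2 * gx - gx_prev)   # descent on x
        y = y + lr * (2 * gy - gy_prev)   # ascent on y
        gx_prev, gy_prev = gx, gy
    return x, y

# min_x max_y x*y has its saddle point at (0, 0).
gx = lambda x, y: y   # df/dx
gy = lambda x, y: x   # df/dy
print(optimistic_gda(gx, gy, x=1.0, y=1.0))  # converges toward (0, 0)
```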
arXiv Detail & Related papers (2022-10-31T17:59:29Z)
- An Empirical Evaluation of Zeroth-Order Optimization Methods on AI-driven Molecule Optimization [78.36413169647408]
We study the effectiveness of various ZO optimization methods for optimizing molecular objectives.
We show the advantages of ZO sign-based gradient descent (ZO-signGD).
We demonstrate the potential effectiveness of ZO optimization methods on widely used benchmark tasks from the Guacamol suite.
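ZO-signGD itself is simple to sketch: estimate the gradient from function-value queries along random directions, then step with its sign. The hyperparameters and the toy quadratic below are illustrative stand-ins for the molecular objectives:

```python
import numpy as np

rng = np.random.default_rng(0)

def zo_sign_gd(f, x, lr=0.01, mu=0.01, n_queries=20, iters=500):
    # Zeroth-order sign-based gradient descent: estimate the gradient
    # from function-value queries with random Gaussian directions, then
    # step using only the sign of the averaged estimate.
    for _ in range(iters):
        grad_est = np.zeros_like(x)
        for _ in range(n_queries):
            u = rng.standard_normal(x.size)
            grad_est += (f(x + mu * u) - f(x)) / mu * u
        x = x - lr * np.sign(grad_est / n_queries)
    return x

quadratic = lambda x: np.sum((x - 3.0) ** 2)
print(zo_sign_gd(quadratic, x=np.zeros(5)))  # approaches [3, 3, 3, 3, 3]
```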
arXiv Detail & Related papers (2022-10-27T01:58:10Z)
- Principal Component Analysis Applied to Gradient Fields in Band Gap Optimization Problems for Metamaterials [0.7699714865575189]
This article describes the application of a related unsupervised machine learning technique, namely, principal component analysis, to approximate the gradient of the objective function of a band gap optimization problem for an acoustic metamaterial.
Numerical results show the effectiveness of the proposed method.
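The general pattern, approximating gradients in a low-dimensional PCA basis fitted to sampled gradient fields, can be sketched generically; the article's pipeline for the acoustic metamaterial problem will differ in detail:

```python
import numpy as np

def pca_gradient_approximation(grad_samples, n_components=2):
    # Fit PCA to a matrix of sampled gradients (one gradient per row)
    # and return a projector onto the dominant directions. Subsequent
    # gradients can then be approximated in this low-dimensional basis,
    # reducing the cost of gradient evaluation in the optimization loop.
    g_mean = grad_samples.mean(axis=0)
    centered = grad_samples - g_mean
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    basis = vt[:n_components]            # top principal directions
    project = lambda g: g_mean + basis.T @ (basis @ (g - g_mean))
    return project

rng = np.random.default_rng(0)
# Toy gradients that vary along two hidden directions in 10 dimensions.
samples = rng.standard_normal((100, 2)) @ rng.standard_normal((2, 10))
project = pca_gradient_approximation(samples, n_components=2)
g_new = samples[0]
print(np.allclose(project(g_new), g_new, atol=1e-8))  # True: g_new lies in the span
```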
arXiv Detail & Related papers (2021-04-04T11:13:37Z)
- Quantum variational optimization: The role of entanglement and problem hardness [0.0]
We study the role of entanglement, the structure of the variational quantum circuit, and the structure of the optimization problem.
Our numerical results indicate an advantage in adapting the distribution of entangling gates to the problem's topology.
We find evidence that applying conditional value-at-risk (CVaR) cost functions improves the optimization, increasing the probability of overlap with the optimal solutions.
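The CVaR cost referred to here has a standard form: the mean of the best alpha-fraction of sampled energies. A minimal sketch, with illustrative sample values:

```python
import numpy as np

def cvar_cost(energies, alpha=0.1):
    # Conditional value at risk (CVaR) of sampled energies: the mean of
    # the lowest-energy alpha-fraction of measurement outcomes. Using
    # CVaR instead of the plain mean biases the variational optimizer
    # toward circuits with high overlap on low-energy states.
    energies = np.sort(np.asarray(energies))
    k = max(1, int(np.ceil(alpha * energies.size)))
    return energies[:k].mean()

samples = [4.0, 1.0, 3.0, 0.5, 2.0, 5.0, 1.5, 2.5, 0.8, 3.5]
print(cvar_cost(samples, alpha=0.2))  # mean of the two lowest energies
```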
arXiv Detail & Related papers (2021-03-26T14:06:54Z)
- Benchmarking adaptive variational quantum eigensolvers [63.277656713454284]
We benchmark the accuracy of VQE and ADAPT-VQE to calculate the electronic ground states and potential energy curves.
We find both methods provide good estimates of the energy and ground state.
Gradient-based optimization is more economical and delivers superior performance than analogous simulations carried out with gradient-free optimizers.
arXiv Detail & Related papers (2020-11-02T19:52:04Z)
- Bilevel Optimization: Convergence Analysis and Enhanced Design [63.64636047748605]
Bilevel optimization is a powerful tool for many machine learning problems.
We propose a novel stochastic bilevel optimizer, stocBiO, with a sample-efficient hypergradient estimator.
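A key building block of sample-efficient hypergradient estimators is the inverse-Hessian-vector product, commonly approximated with a truncated Neumann series so that only Hessian-vector products are needed. A generic sketch with toy numbers (not necessarily stocBiO's exact estimator):

```python
import numpy as np

def neumann_inverse_hvp(hvp, v, eta=0.1, k=200):
    # Approximate H^{-1} v via the truncated Neumann series
    #   H^{-1} v ~= eta * sum_{i=0}^{k} (I - eta*H)^i v,
    # valid when eta * ||H|| < 1. Only Hessian-vector products are
    # required, which keeps the per-step cost low.
    p = v.copy()
    total = v.copy()
    for _ in range(k):
        p = p - eta * hvp(p)   # p <- (I - eta*H) p
        total += p
    return eta * total

# Toy inner-problem Hessian H; check against the exact inverse.
H = np.array([[2.0, 0.3], [0.3, 1.0]])
v = np.array([1.0, -1.0])
approx = neumann_inverse_hvp(lambda p: H @ p, v)
print(approx, np.linalg.solve(H, v))  # the two should nearly agree
```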
arXiv Detail & Related papers (2020-10-15T18:09:48Z)
- Adaptive pruning-based optimization of parameterized quantum circuits [62.997667081978825]
Variational hybrid quantum-classical algorithms are powerful tools to maximize the use of Noisy Intermediate-Scale Quantum (NISQ) devices.
We propose a strategy for the ansatze used in variational quantum algorithms, which we call "Parameter-Efficient Circuit Training" (PECT).
Instead of optimizing all of the ansatz parameters at once, PECT launches a sequence of variational algorithms.
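The sequence-of-sub-problems idea can be sketched generically: optimize one block of parameters at a time while freezing the rest. PECT's actual selection of active parameters is more elaborate; the block partition, optimizer, and toy cost below are assumptions (SciPy is assumed available):

```python
import numpy as np
from scipy.optimize import minimize

def blockwise_train(cost, theta, n_blocks=4, sweeps=3):
    # Instead of optimizing all ansatz parameters jointly, launch a
    # sequence of variational sub-problems, each over one block of
    # parameters with the remaining parameters frozen.
    blocks = np.array_split(np.arange(theta.size), n_blocks)
    for _ in range(sweeps):
        for idx in blocks:
            def sub_cost(sub, idx=idx):
                full = theta.copy()
                full[idx] = sub
                return cost(full)
            res = minimize(sub_cost, theta[idx], method="COBYLA")
            theta[idx] = res.x
    return theta

# Toy "energy" landscape standing in for a circuit expectation value.
cost = lambda t: np.sum(np.sin(t) ** 2) + 0.1 * np.sum(t**2)
print(blockwise_train(cost, theta=np.ones(8)))
```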
arXiv Detail & Related papers (2020-10-01T18:14:11Z)
- On the Global Optimality of Model-Agnostic Meta-Learning [133.16370011229776]
Model-agnostic meta-learning (MAML) formulates meta-learning as a bilevel optimization problem, where the inner level solves each subtask based on a shared prior.
We characterize the optimality of the stationary points attained by MAML for both reinforcement learning and supervised learning, where the inner-level and outer-level problems are solved via first-order optimization methods.
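For reference, a first-order MAML meta-update (a simplification that drops the second-order terms the paper's bilevel analysis covers) looks like this on toy quadratic tasks:

```python
import numpy as np

def maml_step(theta, tasks, inner_lr=0.01, outer_lr=0.001):
    # One meta-update. Inner level: adapt to each task with one gradient
    # step from the shared prior. Outer level: update the prior using the
    # post-adaptation gradients (first-order MAML approximation).
    meta_grad = np.zeros_like(theta)
    for loss_grad in tasks:
        adapted = theta - inner_lr * loss_grad(theta)   # inner step
        meta_grad += loss_grad(adapted)                 # outer gradient
    return theta - outer_lr * meta_grad / len(tasks)

# Toy tasks: quadratic losses ||t - c||^2 centered at different optima.
tasks = [lambda t, c=c: 2 * (t - c) for c in (np.ones(3), -np.ones(3))]
theta = np.zeros(3)
for _ in range(100):
    theta = maml_step(theta, tasks)
print(theta)  # stays near the shared prior between the task optima
```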
arXiv Detail & Related papers (2020-06-23T17:33:14Z)
- Optimization with Momentum: Dynamical, Control-Theoretic, and Symplectic Perspectives [97.16266088683061]
The article rigorously establishes why symplectic discretization schemes are important for momentum-based optimization algorithms.
It provides a characterization of algorithms that exhibit accelerated convergence.
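The symplectic (semi-implicit) Euler scheme the article builds on updates momentum first, then position with the new momentum; discretizing damped momentum dynamics this way recovers the familiar heavy-ball update. A minimal sketch with illustrative constants:

```python
import numpy as np

def symplectic_euler(grad, x, p, lr=0.1, friction=0.9, iters=100):
    # Semi-implicit (symplectic) Euler for damped momentum dynamics:
    # update the momentum first, then use the *new* momentum to update
    # the position. This structure-preserving discretization is exactly
    # the heavy-ball / Polyak momentum method.
    for _ in range(iters):
        p = friction * p - lr * grad(x)   # momentum update
        x = x + p                         # position update with new momentum
    return x

grad = lambda x: 2 * (x - 1.0)            # gradient of (x - 1)^2
print(symplectic_euler(grad, x=np.array([5.0]), p=np.zeros(1)))  # -> ~1.0
```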
arXiv Detail & Related papers (2020-02-28T00:32:47Z)