How to Prove the Optimized Values of Hyperparameters for Particle Swarm Optimization?
- URL: http://arxiv.org/abs/2302.00155v1
- Date: Wed, 1 Feb 2023 00:33:35 GMT
- Title: How to Prove the Optimized Values of Hyperparameters for Particle Swarm Optimization?
- Authors: Abel C. H. Chen
- Abstract summary: This study proposes an analytic framework to analyze the optimized average-fitness-function-value (AFFV) based on mathematical models for a variety of fitness functions.
Experimental results show that the hyperparameter values from the proposed method achieve faster convergence and lower AFFVs.
- Score: 0.0
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: In recent years, several swarm intelligence optimization algorithms have been
proposed for solving a variety of optimization problems. However, the values of
several hyperparameters must be determined. For instance, although Particle Swarm
Optimization (PSO) has been applied to several applications with high optimization
performance, the weights of the inertial velocity, the particle's best-known
position, and the swarm's best-known position must be determined. Therefore, this
study proposes an analytic framework to analyze the optimized
average-fitness-function-value (AFFV) based on mathematical models for a variety
of fitness functions. Furthermore, the optimized hyperparameter values can be
determined as those attaining a lower AFFV for minimization cases. Experimental
results show that the hyperparameter values from the proposed method achieve
faster convergence and lower AFFVs.
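For context, the three weights named in the abstract enter the standard PSO update rule. Below is a minimal sketch of a generic minimizing PSO loop, not the paper's analytic framework; the default values of w, c1, and c2, the search bounds, and the sphere fitness function are illustrative assumptions rather than the paper's derived optima.

```python
import numpy as np

def pso_minimize(fitness, dim, n_particles=30, iters=100,
                 w=0.7, c1=1.5, c2=1.5, seed=0):
    # w, c1, c2 are the three hyperparameters the paper analyzes:
    # the inertia weight and the weights on the particle's and the
    # swarm's best-known positions. These defaults are common
    # textbook values, not values derived in the paper.
    rng = np.random.default_rng(seed)
    x = rng.uniform(-5.0, 5.0, (n_particles, dim))  # positions
    v = np.zeros((n_particles, dim))                # velocities
    pbest = x.copy()                                # personal bests
    pbest_f = np.array([fitness(p) for p in x])
    gbest = pbest[pbest_f.argmin()].copy()          # swarm best
    for _ in range(iters):
        r1 = rng.random((n_particles, dim))
        r2 = rng.random((n_particles, dim))
        # Velocity update: inertia + cognitive pull + social pull
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
        x = x + v
        f = np.array([fitness(p) for p in x])
        improved = f < pbest_f
        pbest[improved] = x[improved]
        pbest_f[improved] = f[improved]
        gbest = pbest[pbest_f.argmin()].copy()
    return gbest, pbest_f.min()

# Example: the sphere function, a common benchmark fitness
best_x, best_f = pso_minimize(lambda p: float(np.sum(p ** 2)), dim=3)
```

Averaging the final fitness over repeated runs of such a loop gives the kind of average-fitness-function-value (AFFV) the paper models analytically.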
Related papers
- Efficient and Robust Parameter Optimization of the Unitary Coupled-Cluster Ansatz [4.607081302947026]
We propose sequential optimization with approximate parabola (SOAP) for parameter optimization of unitary coupled-cluster ansatz on quantum computers.
Numerical benchmark studies on molecular systems demonstrate that SOAP achieves significantly faster convergence and greater robustness to noise.
SOAP is further validated through experiments on a superconducting quantum computer using a 2-qubit model system.
arXiv Detail & Related papers (2024-01-10T03:30:39Z)
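The "approximate parabola" step that gives SOAP its name can be illustrated generically. The following is a hedged sketch of sequential one-parameter minimization via a fitted parabola; the scan width, sweep count, fallback rule, and test objective are invented for illustration and are not the paper's procedure.

```python
import numpy as np

def parabola_line_search_minimize(f, x0, delta=0.1, sweeps=20):
    # Sequential one-parameter updates via an approximate parabola,
    # in the spirit of SOAP: scan each parameter at three points,
    # fit a parabola, and jump to its vertex when the fit is convex.
    x = np.array(x0, dtype=float)
    for _ in range(sweeps):
        for i in range(x.size):
            xs = np.array([x[i] - delta, x[i], x[i] + delta])
            ys = []
            for xi in xs:
                x[i] = xi
                ys.append(f(x))
            a, b, _c = np.polyfit(xs, ys, 2)  # fit y = a*x^2 + b*x + c
            # Vertex of the parabola if convex, else best sampled point
            x[i] = -b / (2 * a) if a > 0 else xs[int(np.argmin(ys))]
    return x

# Example: each sweep refines one parameter at a time
x_opt = parabola_line_search_minimize(
    lambda z: float(np.sum(z ** 2 + np.cos(z))), np.ones(3))
```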
- Exploring the Optimized Value of Each Hyperparameter in Various Gradient Descent Algorithms [0.0]
Gradient descent algorithms have been applied to the parameter optimization of several deep learning models, achieving higher accuracies or lower errors.
This study proposes an analytical framework for analyzing the mean error of each objective function based on various gradient descent algorithms.
The experimental results show that faster convergence and lower errors can be obtained by the proposed method.
arXiv Detail & Related papers (2022-12-23T12:04:33Z)
- An Empirical Evaluation of Zeroth-Order Optimization Methods on AI-driven Molecule Optimization [78.36413169647408]
We study the effectiveness of various ZO optimization methods for optimizing molecular objectives.
We show the advantages of ZO sign-based gradient descent (ZO-signGD).
We demonstrate the potential effectiveness of ZO optimization methods on widely used benchmark tasks from the Guacamol suite.
arXiv Detail & Related papers (2022-10-27T01:58:10Z)
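For context, here is a minimal sketch of zeroth-order sign-based gradient descent of the kind ZO-signGD names, assuming a standard multi-point finite-difference gradient estimator; the step size, smoothing radius, and query count are illustrative assumptions, not settings from the paper.

```python
import numpy as np

def zo_sign_gd(f, x0, lr=0.01, mu=1e-3, q=10, iters=200, seed=0):
    # Zeroth-order sign-based descent: estimate the gradient of the
    # black-box objective f from finite differences along q random
    # directions, then step using only the sign of the estimate.
    rng = np.random.default_rng(seed)
    x = np.array(x0, dtype=float)
    for _ in range(iters):
        g = np.zeros_like(x)
        fx = f(x)
        for _ in range(q):
            u = rng.standard_normal(x.shape)
            u /= np.linalg.norm(u)               # unit direction
            g += (f(x + mu * u) - fx) / mu * u   # directional estimate
        x -= lr * np.sign(g / q)                 # sign-only update
    return x

# Example on a smooth toy objective with minimizer at 1
x_best = zo_sign_gd(lambda z: float(np.sum((z - 1.0) ** 2)), np.zeros(5))
```

Using only the sign of the estimated gradient makes the update robust to the high variance of zeroth-order estimates, which is the property the paper studies on molecular objectives.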
- Non-Convex Optimization with Certificates and Fast Rates Through Kernel Sums of Squares [68.8204255655161]
We consider potentially non-convex optimization problems.
In this paper, we propose an algorithm that achieves close-to-optimal a priori computational guarantees.
arXiv Detail & Related papers (2022-04-11T09:37:04Z)
- Hyper-parameter optimization based on soft actor critic and hierarchical mixture regularization [5.063728016437489]
We model the hyper-parameter optimization process as a Markov decision process, and tackle it with reinforcement learning.
A novel hyper-parameter optimization method based on soft actor critic and hierarchical mixture regularization is proposed.
arXiv Detail & Related papers (2021-12-08T02:34:43Z)
- Optimizing Large-Scale Hyperparameters via Automated Learning Algorithm [97.66038345864095]
We propose a new hyperparameter optimization method with zeroth-order hyper-gradients (HOZOG).
Specifically, we first formulate hyperparameter optimization as an A-based constrained optimization problem, where A is a black-box optimization algorithm.
Then, we use the average zeroth-order hyper-gradients to update the hyperparameters.
arXiv Detail & Related papers (2021-02-17T21:03:05Z)
- Hyper-parameter estimation method with particle swarm optimization [0.8883733362171032]
The PSO method cannot be directly used for the problem of hyper-parameter estimation.
The proposed method uses the swarm method to optimize the performance of the acquisition function.
The results on several problems are improved.
arXiv Detail & Related papers (2020-11-24T07:51:51Z)
- Bilevel Optimization: Convergence Analysis and Enhanced Design [63.64636047748605]
Bilevel optimization is a tool for many machine learning problems.
We propose a novel stochastic bilevel optimizer named stocBiO, featuring a sample-efficient hypergradient estimator.
arXiv Detail & Related papers (2020-10-15T18:09:48Z)
- Adaptive pruning-based optimization of parameterized quantum circuits [62.997667081978825]
Variational hybrid quantum-classical algorithms are powerful tools to maximize the use of Noisy Intermediate-Scale Quantum devices.
We propose a strategy for such ansatze used in variational quantum algorithms, which we call "Parameter-Efficient Circuit Training" (PECT).
Instead of optimizing all of the ansatz parameters at once, PECT launches a sequence of variational algorithms.
arXiv Detail & Related papers (2020-10-01T18:14:11Z)
- Efficient hyperparameter optimization by way of PAC-Bayes bound minimization [4.191847852775072]
We present an alternative objective that is equivalent to a Probably Approximately Correct-Bayes (PAC-Bayes) bound on the expected out-of-sample error.
We then devise an efficient gradient-based algorithm to minimize this objective.
arXiv Detail & Related papers (2020-08-14T15:54:51Z)
- Cross Entropy Hyperparameter Optimization for Constrained Problem Hamiltonians Applied to QAOA [68.11912614360878]
Hybrid quantum-classical algorithms such as Quantum Approximate Optimization Algorithm (QAOA) are considered as one of the most encouraging approaches for taking advantage of near-term quantum computers in practical applications.
Such algorithms are usually implemented in a variational form, combining a classical optimization method with a quantum machine to find good solutions to an optimization problem.
In this study we apply a Cross-Entropy method to shape this landscape, which allows the classical optimizer to find better parameters more easily and hence results in improved performance.
arXiv Detail & Related papers (2020-03-11T13:52:41Z)
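The Cross-Entropy method referenced above is a generic distribution-fitting optimizer. The sketch below shows its usual loop under assumed settings; the population size, elite fraction, and toy objective are illustrative assumptions, not the paper's QAOA-specific setup.

```python
import numpy as np

def cross_entropy_minimize(f, dim, pop=50, elite_frac=0.2,
                           iters=50, seed=0):
    # Generic cross-entropy method: sample candidates from a Gaussian,
    # keep the best (elite) fraction, and refit the Gaussian to them.
    rng = np.random.default_rng(seed)
    mean, std = np.zeros(dim), np.ones(dim)
    n_elite = max(1, int(pop * elite_frac))
    for _ in range(iters):
        samples = rng.normal(mean, std, (pop, dim))
        scores = np.array([f(s) for s in samples])
        elite = samples[np.argsort(scores)[:n_elite]]
        # Refit the sampling distribution to the elite set
        mean, std = elite.mean(axis=0), elite.std(axis=0) + 1e-6
    return mean

# Example: recover the minimizer of a simple quadratic "energy landscape"
params = cross_entropy_minimize(lambda z: float(np.sum((z - 0.5) ** 2)), dim=4)
```

Iteratively concentrating the sampling distribution on low-cost regions is what "shapes the landscape" for the classical optimizer in the paper's QAOA setting.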
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the information and is not responsible for any consequences of its use.