A Study of the Fundamental Parameters of Particle Swarm Optimizers
- URL: http://arxiv.org/abs/2101.10326v1
- Date: Mon, 25 Jan 2021 01:18:34 GMT
- Title: A Study of the Fundamental Parameters of Particle Swarm Optimizers
- Authors: Mauro S. Innocente, Johann Sienz
- Abstract summary: Population-based algorithms can handle different optimization problems with few or no adaptations.
Their main drawbacks consist of their comparatively higher computational cost and difficulty in handling equality constraints.
This paper deals with the effect of the settings of the parameters of the particles' velocity update equation on the behaviour of the system.
- Score: 0.0
- License: http://creativecommons.org/licenses/by-nc-nd/4.0/
- Abstract: The range of applications of traditional optimization methods are limited by
the features of the object variables, and of both the objective and the
constraint functions. In contrast, population-based algorithms whose
optimization capabilities are emergent properties, such as evolutionary
algorithms and particle swarm optimization, present almost no restriction on
those features and can handle different optimization problems with few or no
adaptations. Their main drawbacks consist of their comparatively higher
computational cost and difficulty in handling equality constraints. The
particle swarm optimization method is sometimes viewed as an evolutionary
algorithm because of the many similarities between the two, despite not being
inspired by the same metaphor: both evolve a population of individuals taking
into account previous experiences and using stochastic operators to introduce
new responses.
The advantages of evolutionary algorithms with respect to traditional methods
have been widely discussed in the literature for decades. While particle
swarm optimizers share such advantages, their main desirable features when
compared to evolutionary algorithms are their lower computational cost and
easier implementation, involving no operator design and few parameters to be
tuned. However, even slight modifications of these parameters greatly influence
the dynamics of the swarm. This paper deals with the effect of the settings of
the parameters of the particles' velocity update equation on the behaviour of
the system.
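For reference, the velocity update under study is, in its canonical inertia-weight form, v <- w*v + c1*r1*(pbest - x) + c2*r2*(gbest - x), followed by x <- x + v. The sketch below is a minimal illustration of this canonical form and its parameters, not necessarily the exact formulation analysed in the paper; the defaults shown (w ≈ 0.7298, c1 = c2 ≈ 1.4962) are merely common choices from the PSO literature.

```python
import numpy as np

rng = np.random.default_rng(seed=1)

def pso_step(x, v, pbest, gbest, w=0.7298, c1=1.4962, c2=1.4962):
    """One synchronous PSO update for a swarm.

    x, v, pbest : arrays of shape (n_particles, n_dims)
    gbest       : array of shape (n_dims,), best position found by the swarm
    w           : inertia weight (memory of the previous velocity)
    c1, c2      : cognitive and social acceleration coefficients
    """
    r1 = rng.random(x.shape)            # stochastic operators, redrawn each step
    r2 = rng.random(x.shape)
    v_new = (w * v                      # inertia term
             + c1 * r1 * (pbest - x)    # pull toward each particle's own best
             + c2 * r2 * (gbest - x))   # pull toward the swarm's best
    return x + v_new, v_new
```

With w < 1 the velocities tend to contract over time, while larger c1 and c2 strengthen the oscillations around the attractors; how such settings shape the swarm's dynamics is precisely the question the paper addresses.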
Related papers
- Sample-efficient Bayesian Optimisation Using Known Invariances [56.34916328814857]
We show that vanilla and constrained BO algorithms are inefficient when optimising invariant objectives.
We derive a bound on the maximum information gain of these invariant kernels.
We use our method to design a current drive system for a nuclear fusion reactor, finding a high-performance solution.
arXiv Detail & Related papers (2024-10-22T12:51:46Z)
- Adaptive Preference Scaling for Reinforcement Learning with Human Feedback [103.36048042664768]
Reinforcement learning from human feedback (RLHF) is a prevalent approach to align AI systems with human values.
We propose a novel adaptive preference loss, underpinned by distributionally robust optimization (DRO).
Our method is versatile and can be readily adapted to various preference optimization frameworks.
arXiv Detail & Related papers (2024-06-04T20:33:22Z)
- Frog-Snake prey-predation Relationship Optimization (FSRO): A novel nature-inspired metaheuristic algorithm for feature selection [0.0]
This study proposes the Frog-Snake prey-predation Relationship Optimization (FSRO) algorithm.
It is inspired by the prey-predation relationship between frogs and snakes for application to discrete optimization problems.
Computational experiments on feature selection are conducted using 26 machine learning datasets.
arXiv Detail & Related papers (2024-02-13T06:39:15Z)
- Optimal Static Mutation Strength Distributions for the $(1+\lambda)$ Evolutionary Algorithm on OneMax [1.0965065178451106]
We show that, for large enough population sizes, such optimal distributions may be surprisingly complicated and counter-intuitive.
arXiv Detail & Related papers (2021-02-09T16:56:25Z)
- Particle Swarm Optimization: Fundamental Study and its Application to Optimization and to Jetty Scheduling Problems [0.0]
The advantages of evolutionary algorithms with respect to traditional methods have been greatly discussed in the literature.
While particle swarms share such advantages, they compare favourably with evolutionary algorithms in their lower computational cost and easier implementation.
This paper does not intend to study their tuning; general-purpose settings are taken from previous studies, and virtually the same algorithm is used to optimize a variety of notably different problems.
arXiv Detail & Related papers (2021-01-25T02:06:30Z)
- Particle Swarm Optimization: Development of a General-Purpose Optimizer [0.0]
The particle swarm optimization (PSO) method is sometimes viewed as another evolutionary algorithm because of their many similarities.
This paper deals with important aspects of the method, among them the influence of parameter tuning on the behaviour of the system, and the design of stopping criteria so that the reliability of the solution found can be estimated and computational cost can be saved (a generic sketch of such a criterion appears after this list).
arXiv Detail & Related papers (2021-01-25T00:35:18Z)
- Divide and Learn: A Divide and Conquer Approach for Predict+Optimize [50.03608569227359]
The predict+optimize problem combines machine learning of problem coefficients with an optimization problem that uses the predicted coefficients.
We show how to directly express the loss of the optimization problem in terms of the predicted coefficients as a piece-wise linear function.
We propose a novel divide-and-conquer algorithm to tackle optimization problems without this restriction and predict their coefficients using the optimization loss.
arXiv Detail & Related papers (2020-12-04T00:26:56Z)
- Adaptive pruning-based optimization of parameterized quantum circuits [62.997667081978825]
Variational hybrid quantum-classical algorithms are powerful tools to maximize the use of Noisy Intermediate-Scale Quantum devices.
We propose a strategy for such ansatze used in variational quantum algorithms, which we call "Parameter-Efficient Circuit Training" (PECT).
Instead of optimizing all of the ansatz parameters at once, PECT launches a sequence of variational algorithms.
arXiv Detail & Related papers (2020-10-01T18:14:11Z)
- EOS: a Parallel, Self-Adaptive, Multi-Population Evolutionary Algorithm for Constrained Global Optimization [68.8204255655161]
EOS is a global optimization algorithm for constrained and unconstrained problems of real-valued variables.
It implements a number of improvements to the well-known Differential Evolution (DE) algorithm.
Results show that EOS is capable of achieving increased performance compared to state-of-the-art single-population self-adaptive DE algorithms.
arXiv Detail & Related papers (2020-07-09T10:19:22Z)
- Using models to improve optimizers for variational quantum algorithms [1.7475326826331605]
Variational quantum algorithms are a leading candidate for early applications on noisy intermediate-scale quantum computers.
These algorithms depend on a classical optimization outer-loop that minimizes some function of a parameterized quantum circuit.
We introduce two optimization methods and numerically compare their performance with common methods in use today.
arXiv Detail & Related papers (2020-05-22T05:23:23Z)
- Global Optimization of Gaussian processes [52.77024349608834]
We propose a reduced-space formulation with Gaussian processes trained on few data points.
The approach also leads to significantly smaller and computationally cheaper sub-problems for lower bounding.
In total, the proposed method reduces the time to convergence by orders of magnitude.
arXiv Detail & Related papers (2020-05-21T20:59:11Z)
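The "Development of a General-Purpose Optimizer" entry above mentions stopping criteria designed so that solution reliability can be estimated and computational cost saved; the sketch referenced there follows. It shows one generic, widely used pattern (stop once the best objective value stagnates), assuming a minimisation problem; it is an illustration only, not the criteria designed in that paper.

```python
from collections import deque

def make_stagnation_stop(window=50, tol=1e-8):
    """Return a callable that signals stopping once the best objective
    value has improved by less than `tol` over the last `window`
    iterations (minimisation assumed). A generic criterion, not the
    paper's own stopping rules."""
    history = deque(maxlen=window)

    def should_stop(best_value):
        history.append(best_value)
        if len(history) < window:       # not enough history to judge yet
            return False
        # Improvement over the window: oldest minus newest best value.
        return history[0] - history[-1] < tol

    return should_stop
```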