Perfectionism Search Algorithm (PSA): An Efficient Meta-Heuristic
Optimization Approach
- URL: http://arxiv.org/abs/2304.11486v2
- Date: Fri, 13 Oct 2023 16:17:56 GMT
- Title: Perfectionism Search Algorithm (PSA): An Efficient Meta-Heuristic
Optimization Approach
- Authors: A. Ghodousian, M. Mollakazemiha, N. Karimian
- Abstract summary: This paper proposes a novel population-based meta-heuristic optimization algorithm, called Perfectionism Search Algorithm (PSA).
The PSA algorithm takes inspiration from one of the most popular models of perfectionism, which was proposed by Hewitt and Flett.
The obtained results confirm the high performance of the proposed algorithm in comparison to the other well-known algorithms.
- Score: 0.0
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: This paper proposes a novel population-based meta-heuristic optimization
algorithm, called Perfectionism Search Algorithm (PSA), which is based on the
psychological aspects of perfectionism. The PSA algorithm takes inspiration
from one of the most popular models of perfectionism, which was proposed by
Hewitt and Flett. During each iteration of the PSA algorithm, new solutions are
generated by mimicking different types and aspects of perfectionistic behavior.
In order to have a complete perspective on the performance of PSA, the
proposed algorithm is tested on a range of nonlinear optimization problems,
using a selection of 35 benchmark functions from the literature. The solutions
generated for these problems were also compared with those of 11 well-known
meta-heuristics that have been applied to many complex and practical
engineering optimization problems. The obtained results confirm the high
performance of the proposed algorithm in comparison to the other well-known
algorithms.
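
The abstract describes the overall structure of PSA (a population of candidate solutions updated each iteration by perfectionism-inspired operators) but not the update equations themselves. The sketch below is therefore only a generic population-based metaheuristic skeleton in the same spirit; the best-guided Gaussian move stands in for the paper's perfectionistic operators, and all function and parameter names are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def sphere(x):
    """Classic benchmark objective (one of the usual test functions)."""
    return float(np.sum(x ** 2))

def population_metaheuristic(objective, dim=10, pop_size=30, iters=200,
                             lower=-5.0, upper=5.0, seed=0):
    """Generic population-based loop; the PSA-specific operators are not
    given in the abstract, so a simple best-guided Gaussian perturbation
    stands in for them here."""
    rng = np.random.default_rng(seed)
    pop = rng.uniform(lower, upper, size=(pop_size, dim))
    fitness = np.array([objective(x) for x in pop])

    for t in range(iters):
        best = pop[np.argmin(fitness)]
        scale = (upper - lower) * (1.0 - t / iters) * 0.1  # shrink step size over time
        for i in range(pop_size):
            # Placeholder move: pull toward the current best plus noise.
            candidate = pop[i] + rng.uniform(0, 1, dim) * (best - pop[i]) \
                        + rng.normal(0, scale, dim)
            candidate = np.clip(candidate, lower, upper)
            f = objective(candidate)
            if f < fitness[i]:          # greedy replacement
                pop[i], fitness[i] = candidate, f
    return pop[np.argmin(fitness)], fitness.min()

best_x, best_f = population_metaheuristic(sphere)
print(best_f)
```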
Related papers
- Learning Joint Models of Prediction and Optimization [56.04498536842065]
The Predict-Then-Optimize framework uses machine learning models to predict unknown parameters of an optimization problem from features before solving.
This paper proposes an alternative method, in which optimal solutions are learned directly from the observable features by joint predictive models.
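
As a rough illustration of the contrast drawn in this summary, the sketch below solves a toy top-k selection problem two ways: (a) predict the unknown item values from features and then call the solver, and (b) regress the optimal solutions directly from the features. The toy problem, the least-squares models, and all names are assumptions for illustration, not the paper's method.

```python
import numpy as np

rng = np.random.default_rng(0)

def solve(c, k=3):
    """Toy optimization: pick the k items with the largest (predicted) values."""
    x = np.zeros_like(c)
    x[np.argsort(c)[-k:]] = 1.0
    return x

# Synthetic data: features z determine unknown item values c = A z + noise.
n_items, n_feat, n_samples = 8, 5, 200
A = rng.normal(size=(n_items, n_feat))
Z = rng.normal(size=(n_samples, n_feat))
C = Z @ A.T + 0.1 * rng.normal(size=(n_samples, n_items))

# (a) Predict-then-optimize: regress c from z, then call the solver.
W_c, *_ = np.linalg.lstsq(Z, C, rcond=None)
x_two_stage = solve(Z[0] @ W_c)

# (b) Joint model: regress the optimal solution x*(c) directly from z.
X_star = np.array([solve(c) for c in C])
W_x, *_ = np.linalg.lstsq(Z, X_star, rcond=None)
x_joint = (Z[0] @ W_x > 0.5).astype(float)   # threshold the learned scores

print(x_two_stage, x_joint)
```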
arXiv Detail & Related papers (2024-09-07T19:52:14Z)
- Benchmarking Algorithms for Submodular Optimization Problems Using IOHProfiler [22.08617448389877]
This paper introduces a setup for benchmarking algorithms for submodular optimization problems.
The focus is on the development of iterative search algorithms with the implementation provided and integrated into IOHprofiler.
We present a range of submodular optimization problems that have been integrated into IOHprofiler and show how the setup can be used for analyzing and comparing iterative search algorithms in various settings.
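
The actual setup relies on the IOHprofiler library, whose API is not reproduced here; the sketch below is only a plain-Python stand-in showing the kind of iterative search algorithm and submodular objective (greedy maximum coverage, with an evaluation counter) that such a benchmarking setup would run.

```python
import random

def coverage(selected, sets_):
    """Classic submodular objective: number of elements covered by the chosen sets."""
    covered = set()
    for i in selected:
        covered |= sets_[i]
    return len(covered)

def greedy_max_coverage(sets_, budget):
    """Iterative greedy search: add the set with the largest marginal gain each step."""
    chosen, evals = [], 0
    for _ in range(budget):
        best_i, best_gain = None, 0
        for i in range(len(sets_)):
            if i in chosen:
                continue
            gain = coverage(chosen + [i], sets_) - coverage(chosen, sets_)
            evals += 1
            if gain > best_gain:
                best_i, best_gain = i, gain
        if best_i is None:     # no further improvement possible
            break
        chosen.append(best_i)
    return chosen, coverage(chosen, sets_), evals

random.seed(1)
universe = range(50)
sets_ = [set(random.sample(universe, 8)) for _ in range(20)]
print(greedy_max_coverage(sets_, budget=5))
```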
arXiv Detail & Related papers (2023-02-02T23:36:23Z)
- Socio-cognitive Optimization of Time-delay Control Problems using Evolutionary Metaheuristics [89.24951036534168]
Metaheuristics are general-purpose optimization algorithms intended for difficult problems that cannot be solved by classic approaches.
In this paper we aim at constructing a novel socio-cognitive metaheuristic based on castes, and apply several versions of this algorithm to the optimization of a time-delay system model.
arXiv Detail & Related papers (2022-10-23T22:21:10Z)
- Efficient Non-Parametric Optimizer Search for Diverse Tasks [93.64739408827604]
We present the first efficient, scalable, and general framework that can directly search on the tasks of interest.
Inspired by the innate tree structure of the underlying math expressions, we re-arrange the spaces into a super-tree.
We adopt an adaptation of the Monte Carlo method to tree search, equipped with rejection sampling and equivalent-form detection.
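
The sketch below is not the paper's Monte Carlo tree search; it only illustrates the equivalent-form detection idea on a toy space of optimizer-update expressions, rejecting candidates whose canonical form (commutative arguments sorted) has already been seen. All symbols and operators are assumptions for illustration.

```python
import random

OPS = ["add", "mul", "neg"]
LEAVES = ["g", "m", "lr"]          # gradient, momentum buffer, learning rate

def random_expr(depth, rng):
    """Sample a small expression tree describing a parameter update."""
    if depth == 0 or rng.random() < 0.3:
        return rng.choice(LEAVES)
    op = rng.choice(OPS)
    if op == "neg":
        return ("neg", random_expr(depth - 1, rng))
    return (op, random_expr(depth - 1, rng), random_expr(depth - 1, rng))

def canonical(expr):
    """Canonical string: sort the children of commutative ops so that
    e.g. add(g, m) and add(m, g) are detected as the same update rule."""
    if isinstance(expr, str):
        return expr
    if expr[0] == "neg":
        return f"neg({canonical(expr[1])})"
    a, b = sorted((canonical(expr[1]), canonical(expr[2])))
    return f"{expr[0]}({a},{b})"

rng = random.Random(0)
seen, accepted = set(), []
while len(accepted) < 10:
    e = random_expr(depth=3, rng=rng)
    key = canonical(e)
    if key in seen:        # rejection of equivalent forms
        continue
    seen.add(key)
    accepted.append(e)
print(len(accepted), "distinct update-rule candidates")
```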
arXiv Detail & Related papers (2022-09-27T17:51:31Z)
- A survey on multi-objective hyperparameter optimization algorithms for Machine Learning [62.997667081978825]
This article presents a systematic survey of the literature published between 2014 and 2020 on multi-objective HPO algorithms.
We distinguish between metaheuristic-based algorithms, metamodel-based algorithms, and approaches using a mixture of both.
We also discuss the quality metrics used to compare multi-objective HPO procedures and present future research directions.
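
The summary mentions quality metrics for comparing multi-objective HPO procedures without naming them; hypervolume is one commonly used metric (an assumption here, not a claim about the survey's contents). A minimal two-objective, minimization-form sketch:

```python
def hypervolume_2d(points, ref):
    """Dominated hypervolume of a 2-objective *minimization* front
    relative to a reference point (both objectives to be minimized)."""
    # Keep only non-dominated points, sorted by the first objective.
    pts = sorted(points)
    front, best_f2 = [], float("inf")
    for f1, f2 in pts:
        if f2 < best_f2:
            front.append((f1, f2))
            best_f2 = f2
    # Sum the extra rectangle each front point adds toward the reference point.
    hv, prev_f2 = 0.0, ref[1]
    for f1, f2 in front:
        hv += (ref[0] - f1) * (prev_f2 - f2)
        prev_f2 = f2
    return hv

# Example: compare two hypothetical HPO runs on (validation error, inference time).
run_a = [(0.10, 9.0), (0.12, 5.0), (0.20, 2.0)]
run_b = [(0.11, 8.0), (0.18, 3.0)]
print(hypervolume_2d(run_a, ref=(0.5, 10.0)), hypervolume_2d(run_b, ref=(0.5, 10.0)))
```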
arXiv Detail & Related papers (2021-11-23T10:22:30Z)
- High dimensional Bayesian Optimization Algorithm for Complex System in Time Series [1.9371782627708491]
This paper presents a novel high dimensional Bayesian optimization algorithm.
Based on the time-dependent or dimension-dependent characteristics of the model, the proposed algorithm can reduce the dimension evenly.
To increase the final accuracy of the optimal solution, the proposed algorithm adds a local search based on a series of Adam-based steps at the final stage.
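
The paper's exact final-stage procedure is not spelled out in this summary, so the sketch below only shows the general idea of polishing a candidate with Adam-style steps, using a finite-difference gradient on a standard test function; the starting point, objective, and hyperparameters are illustrative assumptions.

```python
import numpy as np

def rosenbrock(x):
    return float(np.sum(100.0 * (x[1:] - x[:-1] ** 2) ** 2 + (1.0 - x[:-1]) ** 2))

def numerical_grad(f, x, eps=1e-6):
    """Central-difference gradient for a black-box objective."""
    g = np.zeros_like(x)
    for i in range(len(x)):
        e = np.zeros_like(x)
        e[i] = eps
        g[i] = (f(x + e) - f(x - e)) / (2 * eps)
    return g

def adam_refine(f, x0, steps=500, lr=0.05, b1=0.9, b2=0.999, eps=1e-8):
    """Local Adam-style polish of a candidate returned by the global search stage."""
    x = x0.astype(float).copy()
    m, v = np.zeros_like(x), np.zeros_like(x)
    for t in range(1, steps + 1):
        g = numerical_grad(f, x)
        m = b1 * m + (1 - b1) * g
        v = b2 * v + (1 - b2) * g ** 2
        m_hat = m / (1 - b1 ** t)
        v_hat = v / (1 - b2 ** t)
        x -= lr * m_hat / (np.sqrt(v_hat) + eps)
    return x, f(x)

x0 = np.array([-1.5, 2.0])          # pretend this came from the global search stage
print(adam_refine(rosenbrock, x0))
```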
arXiv Detail & Related papers (2021-08-04T21:21:17Z)
- Dynamic Cat Swarm Optimization Algorithm for Backboard Wiring Problem [0.9990687944474739]
This paper presents a powerful swarm intelligence meta-heuristic optimization algorithm called Dynamic Cat Swarm Optimization.
The proposed algorithm introduces a new method to provide a proper balance between the exploration and exploitation phases by modifying the selection scheme and the seeking mode of the algorithm.
Optimization results show the effectiveness of the proposed algorithm, which ranks first compared to several well-known algorithms available in the literature.
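
For orientation, the sketch below is a minimal, generic cat-swarm-style loop with the two classic modes (tracing toward the best solution, and seeking via mutate-and-select); it does not implement the dynamic selection scheme proposed in the paper, and all parameter values are illustrative.

```python
import numpy as np

def sphere(x):
    return float(np.sum(x ** 2))

def cat_swarm(objective, dim=5, n_cats=20, iters=100, mr=0.3,
              smp=5, srd=0.2, c1=2.0, lower=-5.0, upper=5.0, seed=0):
    """Minimal CSO-style loop: a fraction `mr` of cats trace toward the best
    solution, the rest perform seeking (local mutation-and-select)."""
    rng = np.random.default_rng(seed)
    cats = rng.uniform(lower, upper, (n_cats, dim))
    vel = np.zeros((n_cats, dim))
    fit = np.array([objective(c) for c in cats])

    for _ in range(iters):
        best = cats[np.argmin(fit)].copy()
        tracing = rng.random(n_cats) < mr
        for i in range(n_cats):
            if tracing[i]:
                # Tracing mode (exploration): velocity pulled toward the global best.
                vel[i] += rng.random(dim) * c1 * (best - cats[i])
                cats[i] = np.clip(cats[i] + vel[i], lower, upper)
            else:
                # Seeking mode (exploitation): mutate copies, keep the best one.
                copies = cats[i] + rng.uniform(-srd, srd, (smp, dim)) * (upper - lower)
                copies = np.clip(copies, lower, upper)
                scores = np.array([objective(c) for c in copies])
                if scores.min() < fit[i]:
                    cats[i] = copies[np.argmin(scores)]
            fit[i] = objective(cats[i])
    return cats[np.argmin(fit)], fit.min()

print(cat_swarm(sphere))
```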
arXiv Detail & Related papers (2021-04-27T19:41:27Z)
- PAMELI: A Meta-Algorithm for Computationally Expensive Multi-Objective Optimization Problems [0.0]
The proposed algorithm is based on solving a set of surrogate problems defined by models of the real one.
Our algorithm also performs a meta-search for optimal surrogate models and navigation strategies for the optimization landscape.
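
PAMELI itself is multi-objective and also meta-searches over surrogate models and navigation strategies; the sketch below only illustrates the single-objective core of the idea, fitting a simple quadratic surrogate to past evaluations, optimizing the cheap surrogate, and spending real evaluations only at its optimum. The objective and model choice are assumptions for illustration.

```python
import numpy as np

def expensive(x):
    """Stand-in for the costly objective."""
    return float(np.sum((x - 0.3) ** 2) + 0.1 * np.sin(5 * x).sum())

def quad_features(X):
    """Simple quadratic surrogate features: [1, x, x^2] per dimension."""
    return np.hstack([np.ones((len(X), 1)), X, X ** 2])

def surrogate_loop(dim=3, init=10, rounds=15, seed=0):
    rng = np.random.default_rng(seed)
    X = rng.uniform(-1, 1, (init, dim))
    y = np.array([expensive(x) for x in X])
    for _ in range(rounds):
        # Fit the surrogate model to all evaluations gathered so far.
        w, *_ = np.linalg.lstsq(quad_features(X), y, rcond=None)
        # Optimize the cheap surrogate by dense random sampling.
        cand = rng.uniform(-1, 1, (2000, dim))
        best = cand[np.argmin(quad_features(cand) @ w)]
        # Evaluate the real (expensive) objective only at the surrogate optimum.
        X = np.vstack([X, best])
        y = np.append(y, expensive(best))
    return X[np.argmin(y)], y.min()

print(surrogate_loop())
```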
arXiv Detail & Related papers (2021-03-19T11:18:03Z)
- Adaptivity of Stochastic Gradient Methods for Nonconvex Optimization [71.03797261151605]
Adaptivity is an important yet under-studied property in modern optimization theory.
Our algorithm is proven to achieve the best-available convergence for non-PL objectives while simultaneously outperforming existing algorithms for PL objectives.
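
PL here refers to the Polyak-Lojasiewicz condition; for readers unfamiliar with the term, the standard form of the inequality (with the usual constant convention) is:

```latex
% PL (Polyak-Lojasiewicz) condition: for some mu > 0 and minimum value f^*,
% the gradient norm at every point lower-bounds the suboptimality gap:
\[
  \tfrac{1}{2}\,\lVert \nabla f(x) \rVert^{2} \;\ge\; \mu\,\bigl(f(x) - f^{*}\bigr)
  \qquad \text{for all } x .
\]
```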
arXiv Detail & Related papers (2020-02-13T05:42:27Z)
- Extreme Algorithm Selection With Dyadic Feature Representation [78.13985819417974]
We propose the setting of extreme algorithm selection (XAS) where we consider fixed sets of thousands of candidate algorithms.
We assess the applicability of state-of-the-art AS techniques to the XAS setting and propose approaches leveraging a dyadic feature representation.
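
One common way to realize a dyadic feature representation (an illustrative sketch, not necessarily the paper's exact construction) is to describe each instance-algorithm pair by concatenating instance features, algorithm features, and their interactions, then fit a single performance model over all pairs:

```python
import numpy as np

rng = np.random.default_rng(0)
n_inst, n_alg, d_inst, d_alg = 200, 50, 6, 4

inst_feat = rng.normal(size=(n_inst, d_inst))
alg_feat = rng.normal(size=(n_alg, d_alg))

# Synthetic ground-truth runtimes depending on the interaction of both feature sets.
W_true = rng.normal(size=(d_inst, d_alg))
runtime = inst_feat @ W_true @ alg_feat.T + 0.1 * rng.normal(size=(n_inst, n_alg))

def dyadic(i, j):
    """Dyadic features for one (instance, algorithm) pair."""
    return np.concatenate([inst_feat[i], alg_feat[j],
                           np.outer(inst_feat[i], alg_feat[j]).ravel()])

# One training row per pair, one shared performance model across all algorithms.
pairs = np.array([dyadic(i, j) for i in range(n_inst) for j in range(n_alg)])
targets = runtime.ravel()
w, *_ = np.linalg.lstsq(pairs, targets, rcond=None)

def select_algorithm(i):
    """Pick the algorithm with the lowest predicted runtime for instance i."""
    preds = np.array([dyadic(i, j) for j in range(n_alg)]) @ w
    return int(np.argmin(preds))

print(select_algorithm(0), int(np.argmin(runtime[0])))
```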
arXiv Detail & Related papers (2020-01-29T09:40:58Z)
This list is automatically generated from the titles and abstracts of the papers on this site.