Benchmarking Differential Evolution on a Quantum Simulator
- URL: http://arxiv.org/abs/2311.03128v1
- Date: Mon, 6 Nov 2023 14:27:00 GMT
- Title: Benchmarking Differential Evolution on a Quantum Simulator
- Authors: Parthasarathy Srinivasan
- Abstract summary: Differential Evolution (DE) can be used to compute the minima of functions such as the Rastrigin function and the Rosenbrock function.
This work studies the result of applying the DE method to these functions with candidate individuals generated on classical Turing-modeled computation.
- Score: 0.0
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: The use of Evolutionary Algorithms (EA) for solving
Mathematical/Computational Optimization Problems is inspired by the biological
processes of Evolution. Some of the primitives involved in the Evolutionary
process/paradigm are selection of 'Fit' individuals (from a population sample)
for retention, cloning, mutation, discarding, breeding, crossover, etc. In the
Evolutionary Algorithm abstraction, the individuals are deemed to be solution
candidates to an Optimization problem, and additional solutions (or solution sets)
are built by applying analogies of the above primitives (cloning, mutation, etc.)
and evaluating a 'Fitness' function/criterion. One such algorithm is
Differential Evolution (DE), which can be used to compute the minima of
functions such as the Rastrigin function and the Rosenbrock function. This work
studies the result of applying the DE method to these functions with candidate
individuals generated on classical Turing-modeled computation and compares the
results with those obtained on state-of-the-art Quantum computation. The study
benchmarks the convergence of these functions while varying the initialization
parameters and reports timing, convergence, and resource-utilization results.
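To make the benchmarking setup concrete, below is a minimal, self-contained sketch of a classical DE/rand/1/bin loop applied to the Rastrigin and Rosenbrock functions named in the abstract, reporting the best value found and the wall-clock time. The dimensionality, bounds, population size, mutation factor F, crossover rate CR, and generation budget are illustrative assumptions rather than the paper's settings, and candidate individuals here are generated with a classical pseudo-random generator, not a quantum simulator.

```python
# Minimal classical DE/rand/1/bin sketch (not the paper's implementation).
# It minimizes the Rastrigin and Rosenbrock test functions and reports
# timing and the best objective value; all hyperparameters are illustrative.
import time
import numpy as np

def rastrigin(x):
    # Rastrigin function: global minimum 0 at x = 0.
    return 10 * len(x) + np.sum(x**2 - 10 * np.cos(2 * np.pi * x))

def rosenbrock(x):
    # Rosenbrock function: global minimum 0 at x = (1, ..., 1).
    return np.sum(100 * (x[1:] - x[:-1]**2)**2 + (1 - x[:-1])**2)

def differential_evolution(f, bounds, pop_size=40, F=0.8, CR=0.9,
                           max_gens=500, seed=0):
    rng = np.random.default_rng(seed)
    dim = len(bounds)
    lo, hi = np.array(bounds).T
    # Initial population of candidate individuals (classical sampling).
    pop = rng.uniform(lo, hi, size=(pop_size, dim))
    fitness = np.array([f(x) for x in pop])
    for _ in range(max_gens):
        for i in range(pop_size):
            # Mutation: combine three distinct individuals (DE/rand/1).
            a, b, c = pop[rng.choice([j for j in range(pop_size) if j != i],
                                     size=3, replace=False)]
            mutant = np.clip(a + F * (b - c), lo, hi)
            # Binomial crossover between target and mutant.
            cross = rng.random(dim) < CR
            cross[rng.integers(dim)] = True
            trial = np.where(cross, mutant, pop[i])
            # Greedy selection: keep the fitter of target and trial.
            f_trial = f(trial)
            if f_trial < fitness[i]:
                pop[i], fitness[i] = trial, f_trial
    best = fitness.argmin()
    return pop[best], fitness[best]

if __name__ == "__main__":
    dim = 5
    for name, func, box in [("Rastrigin", rastrigin, (-5.12, 5.12)),
                            ("Rosenbrock", rosenbrock, (-5.0, 10.0))]:
        t0 = time.perf_counter()
        x_best, f_best = differential_evolution(func, [box] * dim)
        print(f"{name}: f_min={f_best:.3e}, "
              f"time={time.perf_counter() - t0:.2f}s")
```

In the comparison the abstract describes, only the source of the candidate individuals would change (classical pseudo-random sampling versus generation on a quantum simulator); the mutation, crossover, and selection primitives above stay the same.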
Related papers
- Fast Genetic Algorithm for feature selection -- A qualitative approximation approach [5.279268784803583]
We propose a two-stage surrogate-assisted evolutionary approach to address the computational issues arising from using a Genetic Algorithm (GA) for feature selection.
We show that CHCQX converges faster to feature subset solutions of significantly higher accuracy, particularly for large datasets with over 100K instances.
arXiv Detail & Related papers (2024-04-05T10:15:24Z)
- Frog-Snake prey-predation Relationship Optimization (FSRO): A novel nature-inspired metaheuristic algorithm for feature selection [0.0]
This study proposes the Frog-Snake prey-predation Relationship Optimization (FSRO) algorithm.
It is inspired by the prey-predation relationship between frogs and snakes for application to discrete optimization problems.
Computational experiments on feature selection are conducted with the proposed algorithm on 26 types of machine learning datasets.
arXiv Detail & Related papers (2024-02-13T06:39:15Z)
- Comparison of Single- and Multi-Objective Optimization Quality for Evolutionary Equation Discovery [77.34726150561087]
Evolutionary differential equation discovery has proved to be a tool for obtaining equations with fewer a priori assumptions.
The proposed comparison approach is shown on classical model examples -- the Burgers equation, the wave equation, and the Korteweg-de Vries equation.
arXiv Detail & Related papers (2023-06-29T15:37:19Z) - Adaptive LASSO estimation for functional hidden dynamic geostatistical
model [69.10717733870575]
We propose a novel model selection algorithm based on a penalized maximum likelihood estimator (PMLE) for functional hidden dynamic geostatistical models (f-HD).
The algorithm is based on iterative optimisation and uses an adaptive least absolute shrinkage and selection operator (GMSOLAS) penalty function, wherein the weights are obtained from the unpenalised f-HD maximum-likelihood estimators.
arXiv Detail & Related papers (2022-08-10T19:17:45Z)
- Improving RNA Secondary Structure Design using Deep Reinforcement Learning [69.63971634605797]
We propose a new benchmark of applying reinforcement learning to RNA sequence design, in which the objective function is defined to be the free energy in the sequence's secondary structure.
We present results of the ablation analysis performed for these algorithms, as well as graphs indicating each algorithm's performance across batches.
arXiv Detail & Related papers (2021-11-05T02:54:06Z)
- Fractal Structure and Generalization Properties of Stochastic Optimization Algorithms [71.62575565990502]
We prove that the generalization error of an optimization algorithm can be bounded based on the 'complexity' of the fractal structure that underlies its invariant measure.
We further specialize our results to specific problems (e.g., linear/logistic regression, one-hidden-layer neural networks) and algorithms.
arXiv Detail & Related papers (2021-06-09T08:05:36Z)
- AdaLead: A simple and robust adaptive greedy search algorithm for sequence design [55.41644538483948]
We develop an easy-to-implement, scalable, and robust evolutionary greedy algorithm (AdaLead).
AdaLead is a remarkably strong benchmark that out-competes more complex state-of-the-art approaches in a variety of biologically motivated sequence design challenges.
arXiv Detail & Related papers (2020-10-05T16:40:38Z)
- Genetic optimization algorithms applied toward mission computability models [0.3655021726150368]
Genetic algorithms are computation-based and low-cost to compute.
We apply our genetic optimization algorithms to a mission-critical and constraints-aware problem.
arXiv Detail & Related papers (2020-05-27T00:45:24Z)
- Obtaining Basic Algebra Formulas with Genetic Programming and Functional Rewriting [0.0]
We use functional programming rewriting to boost inductive genetic programming.
Parents are selected following a tournament selection mechanism and the next population is obtained following a steady-state strategy.
We compare the performance of our technique on a set of problems that are hard for classical genetic programming.
arXiv Detail & Related papers (2020-05-03T23:32:36Z)
- Devolutionary genetic algorithms with application to the minimum labeling Steiner tree problem [0.0]
This paper characterizes and discusses devolutionary genetic algorithms and evaluates their performances in solving the minimum labeling Steiner tree (MLST) problem.
We define devolutionary algorithms as the process of reaching a feasible solution by devolving a population of super-optimal unfeasible solutions over time.
We show how classical evolutionary concepts, such as crossover, mutation, and fitness, can be adapted to aim at reaching an optimal or close-to-optimal solution.
arXiv Detail & Related papers (2020-04-18T13:27:28Z)
- The data-driven physical-based equations discovery using evolutionary approach [77.34726150561087]
We describe an algorithm for discovering mathematical equations from given observational data.
The algorithm combines genetic programming with sparse regression.
It can be used for discovery of governing analytical equations as well as partial differential equations (PDE).
arXiv Detail & Related papers (2020-04-03T17:21:57Z)