A Fresh Approach to Evaluate Performance in Distributed Parallel Genetic
Algorithms
- URL: http://arxiv.org/abs/2106.09922v1
- Date: Fri, 18 Jun 2021 05:07:14 GMT
- Title: A Fresh Approach to Evaluate Performance in Distributed Parallel Genetic
Algorithms
- Authors: Tomohiro Harada and Enrique Alba and Gabriel Luque
- Abstract summary: This work proposes a novel approach to evaluate and analyze the behavior of multi-population parallel genetic algorithms (PGAs).
In particular, we deeply study their numerical and computational behavior by proposing a mathematical model representing the observed performance curves.
The conclusions based on the real figures and the numerical models fitting them represent a fresh way of understanding their speed-up, running time, and numerical effort.
- Score: 5.375634674639956
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: This work proposes a novel approach to evaluate and analyze the behavior of
multi-population parallel genetic algorithms (PGAs) when running on a cluster
of multi-core processors. In particular, we deeply study their numerical and
computational behavior by proposing a mathematical model representing the
observed performance curves. In them, we discuss the emerging mathematical
descriptions of PGA performance instead of, e.g., individual isolated results
subject to visual inspection, for a better understanding of the effects of the
number of cores used (scalability), their migration policy (the migration gap,
in this paper), and the features of the solved problem (type of encoding and
problem size). The conclusions based on the real figures and the numerical
models fitting them represent a fresh way of understanding their speed-up,
running time, and numerical effort, allowing a comparison based on a few
meaningful numeric parameters. This represents a set of conclusions beyond the
usual textual lessons found in past works on PGAs. It can be used as an
estimation tool for the future performance of the algorithms and a way of
finding out their limitations.
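The abstract describes fitting mathematical models to observed performance curves (speed-up versus number of cores) so that algorithms can be compared through a few fitted parameters. As a minimal illustrative sketch of this idea, the snippet below fits an Amdahl-style speed-up model to hypothetical measurements via a grid search; the model, the data, and all names are assumptions for illustration, not the authors' actual model or results.

```python
# Illustrative sketch: fit a simple speed-up model to measured points,
# in the spirit of the paper's curve-fitting approach. The Amdahl-style
# model (serial fraction f) and the data below are hypothetical.

def amdahl_speedup(m, f):
    """Predicted speed-up on m cores with serial fraction f."""
    return 1.0 / (f + (1.0 - f) / m)

def fit_serial_fraction(cores, speedups, steps=10000):
    """Least-squares grid search for f in (0, 1)."""
    best_f, best_err = 0.5, float("inf")
    for i in range(1, steps):
        f = i / steps
        err = sum((amdahl_speedup(m, f) - s) ** 2
                  for m, s in zip(cores, speedups))
        if err < best_err:
            best_f, best_err = f, err
    return best_f

# Hypothetical measurements: cores used vs. observed speed-up.
cores = [1, 2, 4, 8, 16, 32]
speedups = [1.0, 1.9, 3.5, 5.9, 9.1, 12.4]

f = fit_serial_fraction(cores, speedups)
print(f"estimated serial fraction: {f:.3f}")
print(f"predicted speed-up on 64 cores: {amdahl_speedup(64, f):.1f}")
```

Once fitted, the single parameter `f` summarizes the whole curve and can extrapolate performance to untested core counts, which is the kind of estimation-tool use the abstract points to.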
Related papers
- Polynomial Chaos Expanded Gaussian Process [2.287415292857564]
In complex and unknown processes, global models are initially generated over the entire experimental space.
This study addresses the need for models that effectively represent both global and local experimental spaces.
arXiv Detail & Related papers (2024-05-02T07:11:05Z)
- Sparse Variational Student-t Processes [8.46450148172407]
Student-t Processes are used to model heavy-tailed distributions and datasets with outliers.
We propose a sparse representation framework to allow Student-t Processes to be more flexible for real-world datasets.
We evaluate two proposed approaches on various synthetic and real-world datasets from UCI and Kaggle.
arXiv Detail & Related papers (2023-12-09T12:55:20Z)
- The Languini Kitchen: Enabling Language Modelling Research at Different Scales of Compute [66.84421705029624]
We introduce an experimental protocol that enables model comparisons based on equivalent compute, measured in accelerator hours.
We pre-process an existing large, diverse, and high-quality dataset of books that surpasses existing academic benchmarks in quality, diversity, and document length.
This work also provides two baseline models: a feed-forward model derived from the GPT-2 architecture and a recurrent model in the form of a novel LSTM with ten-fold throughput.
arXiv Detail & Related papers (2023-09-20T10:31:17Z)
- Posterior Contraction Rates for Matérn Gaussian Processes on Riemannian Manifolds [51.68005047958965]
We show that intrinsic Gaussian processes can achieve better performance in practice.
Our work shows that finer-grained analyses are needed to distinguish between different levels of data-efficiency.
arXiv Detail & Related papers (2023-09-19T20:30:58Z)
- Generalizing Backpropagation for Gradient-Based Interpretability [103.2998254573497]
We show that the gradient of a model is a special case of a more general formulation using semirings.
This observation allows us to generalize the backpropagation algorithm to efficiently compute other interpretable statistics.
arXiv Detail & Related papers (2023-07-06T15:19:53Z)
- A Computational Model for Logical Analysis of Data [0.0]
LAD constitutes an interesting rule-based learning alternative to classic statistical learning techniques.
We propose several models for representing the data set of observations, according to the information we have on it.
Analytic Combinatorics allows us to express the desired probabilities as ratios of generating functions coefficients.
arXiv Detail & Related papers (2022-07-12T16:47:59Z)
- Benchmarking Node Outlier Detection on Graphs [90.29966986023403]
Graph outlier detection is an emerging but crucial machine learning task with numerous applications.
We present the first comprehensive unsupervised node outlier detection benchmark for graphs called UNOD.
arXiv Detail & Related papers (2022-06-21T01:46:38Z)
- Fast and Provably Convergent Algorithms for Gromov-Wasserstein in Graph Learning [37.89640056739607]
Two proposed algorithms, called Bregman Alternating Projected Gradient (BAPG) and hybrid Bregman Proximal Gradient (hBPG), are proven to be (linearly) convergent.
We provide comprehensive experiments to validate the effectiveness of our methods on a host of tasks, including graph alignment, graph partition, and shape matching.
arXiv Detail & Related papers (2022-05-17T06:26:54Z)
- Quantum Algorithms for Data Representation and Analysis [68.754953879193]
We provide quantum procedures that speed-up the solution of eigenproblems for data representation in machine learning.
The power and practical use of these subroutines is shown through new quantum algorithms, sublinear in the input matrix's size, for principal component analysis, correspondence analysis, and latent semantic analysis.
Results show that the run-time parameters that do not depend on the input's size are reasonable and that the error on the computed model is small, allowing for competitive classification performances.
arXiv Detail & Related papers (2021-04-19T00:41:43Z)
- Learning outside the Black-Box: The pursuit of interpretable models [78.32475359554395]
This paper proposes an algorithm that produces a continuous global interpretation of any given continuous black-box function.
Our interpretation represents a leap forward from the previous state of the art.
arXiv Detail & Related papers (2020-11-17T12:39:44Z)
- MAGMA: Inference and Prediction with Multi-Task Gaussian Processes [4.368185344922342]
A novel multi-task Gaussian process (GP) framework is proposed, by using a common mean process for sharing information across tasks.
Our overall algorithm is called Magma (standing for Multi tAsk Gaussian processes with common MeAn).
arXiv Detail & Related papers (2020-07-21T11:43:54Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences.