EvoTorch: Scalable Evolutionary Computation in Python
- URL: http://arxiv.org/abs/2302.12600v3
- Date: Sun, 21 May 2023 16:21:03 GMT
- Title: EvoTorch: Scalable Evolutionary Computation in Python
- Authors: Nihat Engin Toklu, Timothy Atkinson, Vojtěch Micka, Paweł Liskowski, Rupesh Kumar Srivastava
- Abstract summary: EvoTorch is an evolutionary computation library designed to work with high-dimensional optimization problems.
EvoTorch is based on and seamlessly works with the PyTorch library, and therefore, allows the users to define their optimization problems using a well-known API.
- Score: 1.8514314381314885
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Evolutionary computation is an important component within various fields such
as artificial intelligence research, reinforcement learning, robotics,
industrial automation and/or optimization, engineering design, etc. Considering
the increasing computational demands and the dimensionalities of modern
optimization problems, the requirement for scalable, re-usable, and practical
evolutionary algorithm implementations has been growing. To address this
requirement, we present EvoTorch: an evolutionary computation library designed
to work with high-dimensional optimization problems, with GPU support and with
high parallelization capabilities. EvoTorch is based on and seamlessly works
with the PyTorch library, and therefore, allows the users to define their
optimization problems using a well-known API.
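The abstract's claim that users define optimization problems through a familiar API reflects the common workflow in evolutionary-computation libraries: supply an objective function, then let a search algorithm iterate over candidate solutions. As a minimal, dependency-free illustration of that workflow (a generic (1+λ) evolution strategy sketch, not EvoTorch's actual API; the function names here are hypothetical):

```python
import random

def sphere(x):
    """Toy high-dimensional objective: sum of squares (to be minimized)."""
    return sum(v * v for v in x)

def simple_es(fitness, dim, generations=200, popsize=20, sigma=0.3, seed=0):
    """Minimal (1+lambda) evolution strategy: mutate the current best
    solution with Gaussian noise and keep any improving offspring."""
    rng = random.Random(seed)
    best = [rng.uniform(-1, 1) for _ in range(dim)]
    best_f = fitness(best)
    for _ in range(generations):
        for _ in range(popsize):
            child = [v + rng.gauss(0, sigma) for v in best]
            f = fitness(child)
            if f < best_f:
                best, best_f = child, f
    return best, best_f

best, best_f = simple_es(sphere, dim=10)
print(best_f)
```

In a GPU-backed library like EvoTorch, the inner loop over offspring would instead be a batched tensor operation evaluating the whole population at once, which is where the scalability comes from.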
Related papers
- GPU-accelerated Evolutionary Multiobjective Optimization Using Tensorized RVEA [13.319536515278191]
We introduce a tensorized Reference Vector Guided Evolutionary Algorithm (TensorRVEA) to harness the advancements of GPU acceleration.
In numerical benchmark tests involving large-scale populations and problem dimensions, TensorRVEA consistently demonstrates high computational performance, achieving speedups of over 1000x.
arXiv Detail & Related papers (2024-04-01T15:04:24Z)
- Guided Evolution with Binary Discriminators for ML Program Search [64.44893463120584]
We propose guiding evolution with a binary discriminator, trained online to distinguish which program is better given a pair of programs.
We demonstrate that our method can speed up evolution across a set of diverse problems, including a 3.7x speedup on the symbolic search for ML optimizers and a 4x speedup for RL loss functions.
arXiv Detail & Related papers (2024-02-08T16:59:24Z)
- RWKV: Reinventing RNNs for the Transformer Era [54.716108899349614]
We propose a novel model architecture that combines the efficient parallelizable training of transformers with the efficient inference of RNNs.
We scale our models as large as 14 billion parameters, by far the largest dense RNN ever trained, and find RWKV performs on par with similarly sized Transformers.
arXiv Detail & Related papers (2023-05-22T13:57:41Z)
- Optimization of a Hydrodynamic Computational Reservoir through Evolution [58.720142291102135]
We interface with a model of a hydrodynamic system, under development by a startup, as a computational reservoir.
We optimized the readout times and how inputs are mapped to the wave amplitude or frequency using an evolutionary search algorithm.
Applying evolutionary methods to this reservoir system substantially improved separability on an XNOR task, in comparison to implementations with hand-selected parameters.
arXiv Detail & Related papers (2023-04-20T19:15:02Z)
- EvoX: A Distributed GPU-accelerated Framework for Scalable Evolutionary Computation [40.71953374838183]
EvoX is a computing framework tailored for automated, distributed, and heterogeneous execution of EC algorithms.
At the core of EvoX lies a unique programming model to streamline the development of parallelizable EC algorithms.
EvoX offers comprehensive support for a diverse set of benchmark problems, ranging from dozens of numerical test functions to hundreds of reinforcement learning tasks.
arXiv Detail & Related papers (2023-01-29T15:00:16Z)
- evosax: JAX-based Evolution Strategies [0.0]
We release evosax: a JAX-based library of evolutionary optimization algorithms.
evosax implements 30 evolutionary optimization algorithms, including finite-difference-based strategies, estimation-of-distribution evolution strategies, and various genetic algorithms.
It is designed in a modular fashion and allows for flexible usage via a simple ask-evaluate-tell API.
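The ask-evaluate-tell pattern mentioned above separates candidate generation (ask), user-controlled fitness evaluation, and distribution update (tell). A bare-bones sketch of the pattern in plain Python (illustrative only; not evosax's real JAX-based API, and the class name is hypothetical):

```python
import random

class AskTellES:
    """Minimal ask-evaluate-tell evolution strategy. ask() proposes
    candidates, the caller evaluates them, and tell() updates the
    search state. Illustrative sketch only."""

    def __init__(self, dim, popsize=16, sigma=0.2, seed=0):
        self.rng = random.Random(seed)
        # Start away from the optimum so the loop has work to do.
        self.mean = [1.0] * dim
        self.popsize = popsize
        self.sigma = sigma

    def ask(self):
        # Sample a population around the current mean.
        return [[m + self.rng.gauss(0, self.sigma) for m in self.mean]
                for _ in range(self.popsize)]

    def tell(self, population, fitnesses):
        # Move the mean to the best candidate (a crude mu=1 update).
        best = min(range(len(population)), key=lambda i: fitnesses[i])
        self.mean = population[best]

def sphere(x):
    return sum(v * v for v in x)

es = AskTellES(dim=5)
for _ in range(100):
    pop = es.ask()                   # ask: propose candidates
    fits = [sphere(p) for p in pop]  # evaluate: user-defined objective
    es.tell(pop, fits)               # tell: update internal state
print(sphere(es.mean))
```

The appeal of this interface is that the evaluation step stays entirely in user code, so it can be parallelized, vectorized, or dispatched to a GPU without the strategy needing to know.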
arXiv Detail & Related papers (2022-12-08T10:34:42Z)
- NeuroEvo: A Cloud-based Platform for Automated Design and Training of Neural Networks using Evolutionary and Particle Swarm Algorithms [0.0]
This paper introduces a new web platform, NeuroEvo, that allows users to interactively design and train neural network classifiers.
The classification problem and training data are provided by the user and, upon completion of the training process, the best classifier is made available to download and implement in Python, Java, and JavaScript.
arXiv Detail & Related papers (2022-10-01T14:10:43Z)
- Optimizing Memory Placement using Evolutionary Graph Reinforcement Learning [56.83172249278467]
We introduce Evolutionary Graph Reinforcement Learning (EGRL), a method designed for large search spaces.
We train and validate our approach directly on the Intel NNP-I chip for inference.
We additionally achieve a 28-78% speed-up compared to the native NNP-I compiler on all three workloads.
arXiv Detail & Related papers (2020-07-14T18:50:12Z)
- EOS: a Parallel, Self-Adaptive, Multi-Population Evolutionary Algorithm for Constrained Global Optimization [68.8204255655161]
EOS is a global optimization algorithm for constrained and unconstrained problems of real-valued variables.
It implements a number of improvements to the well-known Differential Evolution (DE) algorithm.
Results prove that EOS is capable of achieving increased performance compared to state-of-the-art single-population self-adaptive DE algorithms.
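EOS builds on Differential Evolution, which evolves a population by combining difference vectors between members. A textbook DE/rand/1/bin loop (a generic sketch for illustration, not EOS's self-adaptive multi-population variant):

```python
import random

def differential_evolution(fitness, dim, popsize=30, F=0.8, CR=0.9,
                           generations=150, seed=0):
    """Textbook DE/rand/1/bin: for each target vector, build a mutant
    from three distinct random members (a + F*(b - c)), binomially
    cross it with the target, and keep whichever scores better."""
    rng = random.Random(seed)
    pop = [[rng.uniform(-5, 5) for _ in range(dim)] for _ in range(popsize)]
    fits = [fitness(x) for x in pop]
    for _ in range(generations):
        for i in range(popsize):
            a, b, c = rng.sample([j for j in range(popsize) if j != i], 3)
            mutant = [pop[a][k] + F * (pop[b][k] - pop[c][k])
                      for k in range(dim)]
            jrand = rng.randrange(dim)  # ensure at least one mutant gene
            trial = [mutant[k] if (rng.random() < CR or k == jrand)
                     else pop[i][k] for k in range(dim)]
            tf = fitness(trial)
            if tf <= fits[i]:
                pop[i], fits[i] = trial, tf
    return min(fits)

best = differential_evolution(lambda x: sum(v * v for v in x), dim=8)
print(best)
```

Self-adaptive variants such as EOS additionally evolve the control parameters F and CR alongside the solutions, rather than fixing them as above.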
arXiv Detail & Related papers (2020-07-09T10:19:22Z)
- PHOTONAI -- A Python API for Rapid Machine Learning Model Development [2.414341608751139]
PHOTONAI is a high-level Python API designed to simplify and accelerate machine learning model development.
It functions as a unifying framework allowing the user to easily access and combine algorithms from different toolboxes into custom algorithm sequences.
arXiv Detail & Related papers (2020-02-13T10:33:05Z)
- PolyScientist: Automatic Loop Transformations Combined with Microkernels for Optimization of Deep Learning Primitives [55.79741270235602]
We develop a hybrid solution to the development of deep learning kernels.
We use the advanced polyhedral technology to automatically tune the outer loops for performance.
arXiv Detail & Related papers (2020-02-06T08:02:34Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.