A Study of Genetic Algorithms for Hyperparameter Optimization of Neural
Networks in Machine Translation
- URL: http://arxiv.org/abs/2009.08928v1
- Date: Tue, 15 Sep 2020 02:24:16 GMT
- Title: A Study of Genetic Algorithms for Hyperparameter Optimization of Neural
Networks in Machine Translation
- Authors: Keshav Ganapathy
- Abstract summary: We propose an automatic tuning method modeled after Darwin's Survival of the Fittest theory via a Genetic Algorithm.
Research results show that the proposed method, a GA, outperforms a random selection of hyperparameters.
- Score: 0.0
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: With neural networks having demonstrated their versatility and benefits, the
need for their optimal performance is as prevalent as ever. Hyperparameters, a defining
characteristic, can greatly affect a network's performance. Engineers therefore go
through a tuning process to identify and implement optimal hyperparameters. However,
substantial manual effort is required for tuning network architectures, training
configurations, and preprocessing settings such as Byte Pair Encoding (BPE). In this
study, we propose an automatic tuning method modeled after Darwin's Survival of the
Fittest theory via a Genetic Algorithm (GA). Research results show that the proposed
method, a GA, outperforms a random selection of hyperparameters.
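The GA described in the abstract can be sketched in a few lines: maintain a population of hyperparameter configurations, score each one with a fitness function (for machine translation this would be a validation metric such as BLEU), keep the fittest as parents, and produce the next generation via crossover and mutation. The search space, population settings, and toy fitness function below are illustrative assumptions, not the paper's actual configuration.

```python
import random

# Hypothetical search space: the paper tunes NMT settings such as BPE merges,
# but these particular names and ranges are assumptions for illustration.
SEARCH_SPACE = {
    "learning_rate": [1e-4, 3e-4, 1e-3, 3e-3],
    "num_layers": [2, 4, 6],
    "bpe_merges": [8000, 16000, 32000],
}

def random_individual(rng):
    """Sample one hyperparameter configuration uniformly from the space."""
    return {k: rng.choice(v) for k, v in SEARCH_SPACE.items()}

def crossover(a, b, rng):
    """Uniform crossover: each gene is inherited from one of the two parents."""
    return {k: (a if rng.random() < 0.5 else b)[k] for k in SEARCH_SPACE}

def mutate(ind, rng, rate=0.1):
    """With small probability, resample a gene from the search space."""
    return {k: (rng.choice(SEARCH_SPACE[k]) if rng.random() < rate else v)
            for k, v in ind.items()}

def genetic_search(fitness, generations=10, pop_size=8, elite=2, seed=0):
    """Evolve a population; elitism keeps the best configurations alive."""
    rng = random.Random(seed)
    pop = [random_individual(rng) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)  # survival of the fittest
        parents = pop[:elite]
        children = [
            mutate(crossover(rng.choice(parents), rng.choice(parents), rng), rng)
            for _ in range(pop_size - elite)
        ]
        pop = parents + children
    return max(pop, key=fitness)

# Toy stand-in for a BLEU-based fitness: rewards one specific configuration.
def toy_fitness(ind):
    return (-abs(ind["learning_rate"] - 1e-3)
            - abs(ind["num_layers"] - 6)
            - abs(ind["bpe_merges"] - 16000) / 1e6)

best = genetic_search(toy_fitness)
```

In a real tuning run, `toy_fitness` would train a translation model with the candidate configuration and return its validation score, which is what makes the GA expensive but fully automatic compared to manual tuning.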
Related papers
- A Comparative Study of Hyperparameter Tuning Methods [0.0]
Tree-structured Parzen Estimator (TPE), Genetic Search, and Random Search are evaluated across regression and classification tasks.
Random Search excelled in regression tasks, while TPE was more effective for classification tasks.
arXiv Detail & Related papers (2024-08-29T10:35:07Z) - Circuit-centric Genetic Algorithm (CGA) for Analog and Radio-Frequency Circuit Optimization [3.0996501197166975]
This paper presents an automated method for optimizing parameters in analog/high-frequency circuits.
The design target includes a reduction of power consumption and noise figure and an increase in conversion gain.
The concept of the Circuit-centric Genetic Algorithm (CGA) is proposed as a viable approach.
arXiv Detail & Related papers (2023-11-19T02:33:22Z) - Deep Ranking Ensembles for Hyperparameter Optimization [9.453554184019108]
We present a novel method that meta-learns neural network surrogates optimized for ranking the configurations' performances while modeling their uncertainty via ensembling.
In a large-scale experimental protocol comprising 12 baselines, 16 HPO search spaces and 86 datasets/tasks, we demonstrate that our method achieves new state-of-the-art results in HPO.
arXiv Detail & Related papers (2023-03-27T13:52:40Z) - Towards Learning Universal Hyperparameter Optimizers with Transformers [57.35920571605559]
We introduce the OptFormer, the first text-based Transformer HPO framework that provides a universal end-to-end interface for jointly learning policy and function prediction.
Our experiments demonstrate that the OptFormer can imitate at least 7 different HPO algorithms, which can be further improved via its function uncertainty estimates.
arXiv Detail & Related papers (2022-05-26T12:51:32Z) - AUTOMATA: Gradient Based Data Subset Selection for Compute-Efficient
Hyper-parameter Tuning [72.54359545547904]
We propose a gradient-based subset selection framework for hyperparameter tuning.
We show that using gradient-based data subsets for hyperparameter tuning achieves significantly faster turnaround times, with speedups of 3×-30×.
arXiv Detail & Related papers (2022-03-15T19:25:01Z) - Online hyperparameter optimization by real-time recurrent learning [57.01871583756586]
Our framework takes advantage of the analogy between hyperparameter optimization and parameter learning in recurrent neural networks (RNNs).
It adapts a well-studied family of online learning algorithms for RNNs to tune hyperparameters and network parameters simultaneously.
This procedure yields systematically better generalization performance compared to standard methods, at a fraction of wallclock time.
arXiv Detail & Related papers (2021-02-15T19:36:18Z) - A Population-based Hybrid Approach to Hyperparameter Optimization for
Neural Networks [0.0]
HBRKGA is a hybrid approach that combines the Biased Random Key Genetic Algorithm with a Random Walk technique to search the hyperparameter space efficiently.
Results showed that HBRKGA could find hyperparameter configurations that outperformed the baseline methods in six out of eight datasets.
arXiv Detail & Related papers (2020-11-22T17:12:31Z) - How much progress have we made in neural network training? A New
Evaluation Protocol for Benchmarking Optimizers [86.36020260204302]
We propose a new benchmarking protocol to evaluate both end-to-end efficiency and data-addition training efficiency.
A human study is conducted to show that our evaluation protocol matches human tuning behavior better than the random search.
We then apply the proposed benchmarking framework to 7 optimizers and various tasks, including computer vision, natural language processing, reinforcement learning, and graph mining.
arXiv Detail & Related papers (2020-10-19T21:46:39Z) - Genetic-algorithm-optimized neural networks for gravitational wave
classification [0.0]
We propose a new method for hyperparameter optimization based on genetic algorithms (GAs)
We show that the GA can discover high-quality architectures when the initial hyperparameter seed values are far from a good solution.
Using genetic algorithm optimization to refine an existing network should be especially useful if the problem context changes.
arXiv Detail & Related papers (2020-10-09T03:14:20Z) - Adaptive pruning-based optimization of parameterized quantum circuits [62.997667081978825]
Variational hybrid quantum-classical algorithms are powerful tools to maximize the use of Noisy Intermediate-Scale Quantum devices.
We propose a strategy for such ansatze used in variational quantum algorithms, which we call Parameter-Efficient Circuit Training (PECT).
Instead of optimizing all of the ansatz parameters at once, PECT launches a sequence of variational algorithms.
arXiv Detail & Related papers (2020-10-01T18:14:11Z) - An Asymptotically Optimal Multi-Armed Bandit Algorithm and
Hyperparameter Optimization [48.5614138038673]
We propose an efficient and robust bandit-based algorithm called Sub-Sampling (SS) in the scenario of hyperparameter search evaluation.
We also develop a novel hyperparameter optimization algorithm called BOSS.
Empirical studies validate our theoretical arguments of SS and demonstrate the superior performance of BOSS on a number of applications.
arXiv Detail & Related papers (2020-07-11T03:15:21Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences.