HOAX: A Hyperparameter Optimization Algorithm Explorer for Neural
Networks
- URL: http://arxiv.org/abs/2302.00374v1
- Date: Wed, 1 Feb 2023 11:12:35 GMT
- Title: HOAX: A Hyperparameter Optimization Algorithm Explorer for Neural
Networks
- Authors: Albert Thie, Maximilian F. S. J. Menger, Shirin Faraji
- Abstract summary: The bottleneck for trajectory-based methods to study photoinduced processes is still the huge number of electronic structure calculations.
We present an innovative solution, in which the amount of electronic structure calculations is drastically reduced, by employing machine learning algorithms and methods borrowed from the realm of artificial intelligence.
- Score: 0.0
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Computational chemistry has become an important tool to predict and
understand molecular properties and reactions. Even though recent years have
seen a significant growth in new algorithms and computational methods that
speed up quantum chemical calculations, the bottleneck for trajectory-based
methods to study photoinduced processes is still the huge number of electronic
structure calculations. In this work, we present an innovative solution, in
which the amount of electronic structure calculations is drastically reduced,
by employing machine learning algorithms and methods borrowed from the realm of
artificial intelligence. However, applying these algorithms effectively
requires finding optimal hyperparameters, which remains a challenge itself.
Here we present an automated user-friendly framework, HOAX, to perform the
hyperparameter optimization for neural networks, which bypasses the need for a
lengthy manual process. The neural-network-generated potential energy surfaces
(PESs) reduce the computational cost compared to the ab initio-based PESs. We
perform a comparative investigation of the performance of different
hyperparameter optimization algorithms, namely grid search, simulated
annealing, a genetic algorithm, and Bayesian optimization, in finding the
optimal hyperparameters needed to construct a well-performing neural network
that fits the PESs of small organic molecules. Our results show that this
automated toolkit not only provides a straightforward way to perform the
hyperparameter optimization but also yields neural-network-based PESs in
reasonable agreement with the ab initio-based PESs.
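The abstract names four search strategies (grid search, simulated annealing, a genetic algorithm, and Bayesian optimization) for tuning the network that fits a PES. As a rough illustration only, and not the HOAX API, the following Python sketch compares two of them, grid search and simulated annealing, on a toy one-dimensional Morse potential; the search space, cooling schedule, and toy data are all hypothetical assumptions.

```python
# Hypothetical sketch (not the HOAX implementation): grid search vs. simulated
# annealing for tuning a small neural network fitted to a toy 1-D Morse
# potential. Search space, cooling schedule, and data are illustrative.
import itertools
import math
import random

import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)

# Toy "ab initio" data: Morse potential V(r) = D * (1 - exp(-a * (r - r0)))**2
def morse(r, D=4.7, a=1.9, r0=0.74):
    return D * (1.0 - np.exp(-a * (r - r0))) ** 2

r = rng.uniform(0.5, 3.0, size=(400, 1))
V = morse(r).ravel()
r_train, r_test, V_train, V_test = train_test_split(r, V, random_state=0)

# Hypothetical discrete hyperparameter space.
SPACE = {
    "hidden_layer_sizes": [(16,), (32,), (32, 32)],
    "learning_rate_init": [1e-3, 1e-2],
    "activation": ["tanh", "relu"],
}

def loss(params):
    """Train an MLP with the given hyperparameters and return the test MSE."""
    model = MLPRegressor(max_iter=2000, random_state=0, **params)
    model.fit(r_train, V_train)
    return float(np.mean((model.predict(r_test) - V_test) ** 2))

# Grid search: exhaustively evaluate every combination.
grid = [dict(zip(SPACE, combo)) for combo in itertools.product(*SPACE.values())]
best_grid = min(grid, key=loss)

# Simulated annealing: random walk over the space with a cooling temperature.
def anneal(steps=20, t0=1.0, cooling=0.85):
    current = {k: random.choice(v) for k, v in SPACE.items()}
    current_loss, temperature = loss(current), t0
    best, best_loss = current, current_loss
    for _ in range(steps):
        candidate = dict(current)
        key = random.choice(list(SPACE))          # perturb one hyperparameter
        candidate[key] = random.choice(SPACE[key])
        candidate_loss = loss(candidate)
        # Accept downhill moves always, uphill moves with Boltzmann probability.
        if candidate_loss < current_loss or \
           random.random() < math.exp((current_loss - candidate_loss) / temperature):
            current, current_loss = candidate, candidate_loss
        if current_loss < best_loss:
            best, best_loss = current, current_loss
        temperature *= cooling
    return best

print("grid search best:   ", best_grid)
print("simulated annealing:", anneal())
```

In a real workflow the toy Morse data would be replaced by ab initio energies, and genetic-algorithm or Bayesian-optimization counterparts could be plugged in behind the same `loss` interface.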
Related papers
- NN-AE-VQE: Neural network parameter prediction on autoencoded variational quantum eigensolvers [1.7400502482492273]
In recent years, the field of quantum computing has become significantly more mature.
We present an auto-encoded VQE with neural-network predictions: NN-AE-VQE.
We demonstrate these methods on a $H_2$ molecule, achieving chemical accuracy.
arXiv Detail & Related papers (2024-11-23T23:09:22Z) - Gradual Optimization Learning for Conformational Energy Minimization [69.36925478047682]
The Gradual Optimization Learning Framework (GOLF) for energy minimization with neural networks significantly reduces the amount of additional data required.
Our results demonstrate that the neural network trained with GOLF performs on par with the oracle on a benchmark of diverse drug-like molecules.
arXiv Detail & Related papers (2023-11-05T11:48:08Z) - Exploring accurate potential energy surfaces via integrating variational
quantum eigensolver with machine learning [8.19234058079321]
We show in this work that variational quantum algorithms can be integrated with machine learning (ML) techniques.
We encode the molecular geometry information into a deep neural network (DNN) to represent the parameters of the variational quantum eigensolver (VQE).
arXiv Detail & Related papers (2022-06-08T01:43:56Z) - Auto-PINN: Understanding and Optimizing Physics-Informed Neural
Architecture [77.59766598165551]
Physics-informed neural networks (PINNs) are revolutionizing science and engineering practice by bringing the power of deep learning to bear on scientific computation.
Here, we propose Auto-PINN, which applies Neural Architecture Search (NAS) techniques to PINN design.
A comprehensive set of pre-experiments using standard PDE benchmarks allows us to probe the structure-performance relationship in PINNs.
arXiv Detail & Related papers (2022-05-27T03:24:31Z) - Genealogical Population-Based Training for Hyperparameter Optimization [1.0514231683620516]
We experimentally demonstrate that our method cuts the required computational cost by a factor of 2 to 3.
Our method is search-algorithm agnostic, so the inner search routine can be any search algorithm such as TPE, GP, CMA, or random search.
arXiv Detail & Related papers (2021-09-30T08:49:41Z) - Ps and Qs: Quantization-aware pruning for efficient low latency neural
network inference [56.24109486973292]
We study the interplay between pruning and quantization during the training of neural networks for ultra low latency applications.
We find that quantization-aware pruning yields more computationally efficient models than either pruning or quantization alone for our task.
arXiv Detail & Related papers (2021-02-22T19:00:05Z) - Online hyperparameter optimization by real-time recurrent learning [57.01871583756586]
Our framework takes advantage of the analogy between hyperparameter optimization and parameter learning in recurrent neural networks (RNNs).
It adapts a well-studied family of online learning algorithms for RNNs to tune hyperparameters and network parameters simultaneously.
This procedure yields systematically better generalization performance compared to standard methods, at a fraction of wallclock time.
arXiv Detail & Related papers (2021-02-15T19:36:18Z) - Adaptive pruning-based optimization of parameterized quantum circuits [62.997667081978825]
Variational hybrid quantum-classical algorithms are powerful tools to maximize the use of Noisy Intermediate-Scale Quantum devices.
We propose a strategy for such ansätze used in variational quantum algorithms, which we call "Parameter-Efficient Circuit Training" (PECT).
Instead of optimizing all of the ansatz parameters at once, PECT launches a sequence of variational algorithms.
arXiv Detail & Related papers (2020-10-01T18:14:11Z) - A Study of Genetic Algorithms for Hyperparameter Optimization of Neural
Networks in Machine Translation [0.0]
We propose an automatic tuning method modeled after Darwin's survival-of-the-fittest principle, implemented via a Genetic Algorithm.
Research results show that the proposed method, a GA, outperforms a random selection of hyperparameters (see the illustrative GA sketch after this list).
arXiv Detail & Related papers (2020-09-15T02:24:16Z) - Optimizing Memory Placement using Evolutionary Graph Reinforcement
Learning [56.83172249278467]
We introduce Evolutionary Graph Reinforcement Learning (EGRL), a method designed for large search spaces.
We train and validate our approach directly on the Intel NNP-I chip for inference.
We additionally achieve 28-78% speed-up compared to the native NNP-I compiler on all three workloads.
arXiv Detail & Related papers (2020-07-14T18:50:12Z)
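As referenced in the genetic-algorithm entry above, and matching one of the optimizers named in the HOAX abstract, here is a minimal, hypothetical sketch of a genetic algorithm over a discrete hyperparameter space. It is not code from HOAX or any of the listed papers; the search space, GA settings, and the stand-in fitness function are illustrative assumptions.

```python
# Hypothetical sketch of a genetic algorithm over discrete neural-network
# hyperparameters. Search space, GA settings, and the stand-in fitness
# function are illustrative assumptions, not code from the cited papers.
import random

SPACE = {
    "hidden_layer_sizes": [(16,), (32,), (64,), (32, 32)],
    "learning_rate_init": [1e-4, 1e-3, 1e-2],
    "activation": ["tanh", "relu"],
}

def fitness(individual):
    """Stand-in objective. In practice this would train a network on the PES
    data and return, e.g., the negative validation error."""
    return -abs(individual["learning_rate_init"] - 1e-3) \
           - 0.01 * sum(individual["hidden_layer_sizes"])

def random_individual():
    return {k: random.choice(v) for k, v in SPACE.items()}

def crossover(a, b):
    # Uniform crossover: each hyperparameter comes from one parent at random.
    return {k: random.choice((a[k], b[k])) for k in SPACE}

def mutate(individual, rate=0.2):
    # With probability `rate`, re-draw a hyperparameter from its allowed values.
    return {k: (random.choice(SPACE[k]) if random.random() < rate else v)
            for k, v in individual.items()}

def evolve(pop_size=8, generations=10):
    population = [random_individual() for _ in range(pop_size)]
    for _ in range(generations):
        population.sort(key=fitness, reverse=True)
        parents = population[: pop_size // 2]            # truncation selection
        offspring = [mutate(crossover(random.choice(parents),
                                      random.choice(parents)))
                     for _ in range(pop_size - len(parents))]
        population = parents + offspring
    return max(population, key=fitness)

print("best hyperparameters:", evolve())
```

In practice the stand-in fitness would be replaced by, for example, the negative of the `loss` function from the earlier sketch, i.e. the negative validation error of a trained network on the PES data.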
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences.