Adaptive Estimation of the Number of Algorithm Runs in Stochastic Optimization
- URL: http://arxiv.org/abs/2507.01629v1
- Date: Wed, 02 Jul 2025 11:59:00 GMT
- Title: Adaptive Estimation of the Number of Algorithm Runs in Stochastic Optimization
- Authors: Tome Eftimov, Peter Korošec
- Abstract summary: This paper introduces an empirical approach to estimating the required number of runs per problem. It dynamically adjusts the number of runs during execution as an online approach. The results demonstrate 82%-95% accuracy in estimations across different algorithms.
- Score: 3.354345524478023
- License: http://creativecommons.org/licenses/by-nc-nd/4.0/
- Abstract: Determining the number of algorithm runs is a critical aspect of experimental design, as it directly influences the experiment's duration and the reliability of its outcomes. This paper introduces an empirical approach to estimating the required number of runs per problem instance for accurate estimation of the performance of continuous single-objective stochastic optimization algorithms. The method leverages probability theory, incorporating a robustness check to identify significant imbalances in the data distribution relative to the mean, and dynamically adjusts the number of runs during execution as an online approach. The proposed methodology was extensively tested across two algorithm portfolios (104 Differential Evolution configurations and the Nevergrad portfolio) and the COCO benchmark suite, totaling 5,748,000 runs. The results demonstrate 82%-95% accuracy in estimations across different algorithms, allowing a reduction of approximately 50% in the number of runs without compromising optimization outcomes. This online calculation of required runs not only improves benchmarking efficiency, but also contributes to energy reduction, fostering a more environmentally sustainable computing ecosystem.
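The online stopping rule described above can be sketched as a confidence-interval check: keep running until the interval on the mean performance is narrow relative to the mean, with a skewness check guarding against imbalanced samples. This is a minimal illustration, not the authors' exact procedure; `run_once`, `rel_tol`, and the skewness threshold are hypothetical.

```python
import math
import random
import statistics

def adaptive_runs(run_once, min_runs=5, max_runs=100, rel_tol=0.05, z=1.96, skew_limit=2.0):
    """Run the algorithm until the confidence interval on mean performance
    is narrow relative to the mean, with a crude sample-skewness check as a
    robustness guard against imbalanced data (hypothetical thresholds)."""
    results = [run_once() for _ in range(min_runs)]
    while len(results) < max_runs:
        mean = statistics.fmean(results)
        sd = statistics.stdev(results)
        half_width = z * sd / math.sqrt(len(results))
        # Sample skewness: large values signal imbalance around the mean.
        skew = sum((x - mean) ** 3 for x in results) / (len(results) * sd ** 3) if sd > 0 else 0.0
        if mean != 0 and half_width <= rel_tol * abs(mean) and abs(skew) <= skew_limit:
            break
        results.append(run_once())
    return results

# Noisy objective standing in for one optimization run (illustrative only).
random.seed(0)
runs = adaptive_runs(lambda: random.gauss(10.0, 1.0))
```

The loop stops early on well-behaved distributions and falls back to `max_runs` otherwise, which is how an online rule can cut total runs without discarding unreliable samples.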
Related papers
- An Experimental Approach for Running-Time Estimation of Multi-objective Evolutionary Algorithms in Numerical Optimization [16.66619776655723]
We propose an experimental approach for estimating upper bounds on the running time of MOEAs without algorithmic assumptions. We conduct comprehensive experiments on five representative MOEAs using the ZDT and DTLZ benchmark suites. Results demonstrate the effectiveness of our approach in estimating upper bounds on the running time without requiring algorithmic or problem simplifications.
arXiv Detail & Related papers (2025-07-03T07:06:14Z) - Bayesian Optimization by Kernel Regression and Density-based Exploration [12.074000212223446]
We propose the Bayesian Optimization by Kernel regression and density-based Exploration (BOKE) algorithm. BOKE uses kernel regression for efficient function approximation, kernel density for exploration, and integrates them into the confidence bound criteria to guide the optimization process. We demonstrate that BOKE not only performs competitively compared to Gaussian process-based methods but also exhibits superior computational efficiency.
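The BOKE recipe (kernel regression for the surrogate mean, kernel density for exploration, combined in a confidence-bound-style score) might look roughly like this 1-D sketch; the bandwidth, bonus form, and `kappa` are assumptions, not the paper's exact formulation:

```python
import math
import random

def rbf(d2, h=0.5):
    """Gaussian RBF kernel on squared distance (bandwidth h is assumed)."""
    return math.exp(-d2 / (2.0 * h * h))

def boke_step(xs, ys, candidates, kappa=2.0):
    """Pick the next point to evaluate (minimization): kernel-regression
    mean minus an exploration bonus that grows where the kernel density
    of past samples is low."""
    best, best_score = None, float("inf")
    for c in candidates:
        w = [rbf((c - x) ** 2) for x in xs]
        tw = sum(w)
        mu = sum(wi * yi for wi, yi in zip(w, ys)) / tw if tw > 1e-12 else 0.0
        density = tw / len(xs)          # kernel density estimate at c
        score = mu - kappa / (1.0 + density)
        if score < best_score:
            best, best_score = c, score
    return best

random.seed(1)
xs = [random.uniform(-3.0, 3.0) for _ in range(8)]
ys = [(x - 1.0) ** 2 for x in xs]       # toy objective with minimum at x = 1
x_next = boke_step(xs, ys, [i / 10.0 for i in range(-30, 31)])
```

Both the surrogate mean and the density reuse the same kernel weights, which is where the claimed computational efficiency over Gaussian-process surrogates would come from.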
arXiv Detail & Related papers (2025-02-10T06:16:51Z) - Black-box Optimization with Simultaneous Statistical Inference for Optimal Performance [18.13513199455587]
Black-box optimization is often encountered for decision-making in complex systems management. Our goal is to address the dual tasks of optimization and statistical inference for the optimal performance in an online fashion.
arXiv Detail & Related papers (2025-01-14T02:37:09Z) - A Quantum Genetic Algorithm Framework for the MaxCut Problem [49.59986385400411]
The proposed method introduces a Quantum Genetic Algorithm (QGA) using a Grover-based evolutionary framework and divide-and-conquer principles. On complete graphs, the proposed method consistently achieves the true optimal MaxCut values, outperforming the Semidefinite Programming (SDP) approach. On Erdős-Rényi random graphs, the QGA demonstrates competitive performance, achieving median solutions within 92-96% of the SDP results.
arXiv Detail & Related papers (2025-01-02T05:06:16Z) - Evaluating the effectiveness, reliability and efficiency of a multi-objective sequential optimization approach for building performance design [0.8168080812068832]
This paper proposes and evaluates a sequential approach for multi-objective design optimization of building geometry, fabric, HVAC system and controls for building performance. The performance of the sequential approach is benchmarked against a full factorial search and compared to the NSGA-II algorithm. This research indicates that a sequential optimization approach is a highly efficient and robust alternative to the standard NSGA-II algorithm.
arXiv Detail & Related papers (2024-12-13T08:00:00Z) - Testing the Efficacy of Hyperparameter Optimization Algorithms in Short-Term Load Forecasting [0.0]
We use the Panama Electricity dataset to evaluate HPO algorithms' performances on a surrogate forecasting algorithm, XGBoost, in terms of accuracy (i.e., MAPE, $R^2$) and runtime.
Results reveal significant runtime advantages for HPO algorithms over Random Search.
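The two accuracy metrics used in that comparison, MAPE and $R^2$, can be computed directly; the sample values below are made up for illustration:

```python
def mape(y_true, y_pred):
    """Mean absolute percentage error (%); assumes no zero targets."""
    return 100.0 * sum(abs((t - p) / t) for t, p in zip(y_true, y_pred)) / len(y_true)

def r2(y_true, y_pred):
    """Coefficient of determination: 1 - SS_res / SS_tot."""
    mean = sum(y_true) / len(y_true)
    ss_res = sum((t - p) ** 2 for t, p in zip(y_true, y_pred))
    ss_tot = sum((t - mean) ** 2 for t in y_true)
    return 1.0 - ss_res / ss_tot

# Hypothetical hourly load values (MW) and forecasts.
y_true = [100.0, 120.0, 110.0, 130.0]
y_pred = [98.0, 125.0, 108.0, 128.0]
```

MAPE is scale-free (handy across load magnitudes) but undefined at zero targets, while $R^2$ measures variance explained relative to a constant-mean baseline.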
arXiv Detail & Related papers (2024-10-19T09:08:52Z) - Learning Unnormalized Statistical Models via Compositional Optimization [73.30514599338407]
Noise-contrastive estimation (NCE) has been proposed by formulating the objective as the logistic loss of the real data and the artificial noise.
In this paper, we study a direct approach to optimizing the negative log-likelihood of unnormalized models.
arXiv Detail & Related papers (2023-06-13T01:18:16Z) - Evolving Pareto-Optimal Actor-Critic Algorithms for Generalizability and Stability [67.8426046908398]
Generalizability and stability are two key objectives for operating reinforcement learning (RL) agents in the real world.
This paper presents MetaPG, an evolutionary method for automated design of actor-critic loss functions.
arXiv Detail & Related papers (2022-04-08T20:46:16Z) - Amortized Implicit Differentiation for Stochastic Bilevel Optimization [53.12363770169761]
We study a class of algorithms for solving bilevel optimization problems in both deterministic and stochastic settings.
We exploit a warm-start strategy to amortize the estimation of the exact gradient.
By using this framework, our analysis shows these algorithms to match the computational complexity of methods that have access to an unbiased estimate of the gradient.
arXiv Detail & Related papers (2021-11-29T15:10:09Z) - Outlier-Robust Sparse Estimation via Non-Convex Optimization [73.18654719887205]
We explore the connection between high-dimensional robust statistics and non-convex optimization in the presence of sparsity constraints.
We develop novel and simple optimization formulations for these problems.
As a corollary, we obtain that any first-order method that efficiently converges to stationarity yields an efficient algorithm for these tasks.
arXiv Detail & Related papers (2021-09-23T17:38:24Z) - Bilevel Optimization: Convergence Analysis and Enhanced Design [63.64636047748605]
Bilevel optimization is a tool for many machine learning problems.
We propose stoc-BiO, a novel algorithm featuring a sample-efficient hypergradient estimator.
arXiv Detail & Related papers (2020-10-15T18:09:48Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of this list (including all information) and is not responsible for any consequences of its use.