Predicting the utility of search spaces for black-box optimization: a
simple, budget-aware approach
- URL: http://arxiv.org/abs/2112.08250v2
- Date: Thu, 16 Dec 2021 19:40:36 GMT
- Title: Predicting the utility of search spaces for black-box optimization: a
simple, budget-aware approach
- Authors: Setareh Ariafar, Justin Gilmer, Zachary Nado, Jasper Snoek, Rodolphe
Jenatton, George E. Dahl
- Abstract summary: Black box optimization requires specifying a search space to explore for solutions.
Finding a high quality search space can be challenging in many applications.
We present a simple scoring method based on a utility function applied to a probabilistic response surface model.
- Score: 25.07599332807319
- License: http://creativecommons.org/licenses/by-nc-sa/4.0/
- Abstract: Black box optimization requires specifying a search space to explore for
solutions, e.g. a d-dimensional compact space, and this choice is critical for
getting the best results at a reasonable budget. Unfortunately, determining a
high quality search space can be challenging in many applications. For example,
when tuning hyperparameters for machine learning pipelines on a new problem
given a limited budget, one must strike a balance between excluding potentially
promising regions and keeping the search space small enough to be tractable.
The goal of this work is to motivate -- through example applications in tuning
deep neural networks -- the problem of predicting the quality of search spaces
conditioned on budgets, as well as to provide a simple scoring method based on
a utility function applied to a probabilistic response surface model, similar
to Bayesian optimization. We show that the method we present can compute
meaningful budget-conditional scores in a variety of situations. We also
provide experimental evidence that accurate scores can be useful in
constructing and pruning search spaces. Ultimately, we believe scoring search
spaces should become standard practice in the experimental workflow for deep
learning.
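The core idea of the abstract can be illustrated with a minimal sketch: score a candidate search space by the expected best objective value achievable within a given budget, estimated by Monte Carlo against a probabilistic response surface. Here a noisy synthetic 1-D loss stands in for a fitted surrogate model, and names like `score_search_space` are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

def surrogate_sample(x):
    """Stand-in for a draw from a probabilistic response surface model:
    a synthetic 1-D validation loss with observation noise."""
    return (x - 0.3) ** 2 + 0.05 * rng.standard_normal(x.shape)

def score_search_space(low, high, budget, n_sims=2000):
    """Monte Carlo estimate of E[best loss seen] after `budget` trials
    drawn uniformly from the interval [low, high]."""
    xs = rng.uniform(low, high, size=(n_sims, budget))
    losses = surrogate_sample(xs)
    # Expected best-seen loss under this budget; lower means a better space.
    return losses.min(axis=1).mean()

# At a small budget, a tight space around the optimum (0.3) should
# score better than a wide one that spends trials far from it.
tight = score_search_space(0.0, 0.6, budget=5)
wide = score_search_space(-2.0, 2.0, budget=5)
print(tight, wide)
```

The budget-conditioning is explicit: rerunning with a large budget shrinks the gap between the two spaces, which is the qualitative behavior the paper argues a useful score must capture.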
Related papers
- Characterization of Locality in Spin States and Forced Moves for Optimizations [0.36868085124383626]
In optimization problems, local minima in energy landscapes make it difficult to reach the global minimum.
We develop an algorithm that escapes local minima efficiently, although it does not yield exact sampling.
As the proposed algorithm is based on a rejection-free algorithm, the computational cost is low.
arXiv Detail & Related papers (2023-12-05T07:21:00Z)
- Learning Regions of Interest for Bayesian Optimization with Adaptive Level-Set Estimation [84.0621253654014]
We propose a framework, called BALLET, which adaptively filters for a high-confidence region of interest.
We show theoretically that BALLET can efficiently shrink the search space, and can exhibit a tighter regret bound than standard BO.
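The confidence-bound filtering idea can be sketched generically: keep only candidate points whose lower confidence bound does not exceed the smallest upper confidence bound anywhere on the grid. This is an assumed, simplified form of the level-set idea, not BALLET's actual algorithm, and all values below are illustrative.

```python
import numpy as np

# Posterior mean/std of a surrogate over a grid of candidates
# (illustrative values, minimization objective).
xs = np.linspace(-2.0, 2.0, 201)
mu = (xs - 0.5) ** 2              # posterior mean
sigma = 0.1 + 0.05 * np.abs(xs)   # posterior standard deviation
beta = 2.0                        # confidence-bound width

lcb = mu - beta * sigma
ucb = mu + beta * sigma

# A point can still be optimal only if its LCB does not exceed the
# best (smallest) UCB; everything else is filtered out of the region
# of interest.
threshold = ucb.min()
roi = xs[lcb <= threshold]

print(roi.min(), roi.max(), roi.size)
```

The filtered region here is an interval around the posterior minimizer, strictly smaller than the full grid, which is the search-space-shrinking effect the summary describes.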
arXiv Detail & Related papers (2023-07-25T09:45:47Z)
- Efficient Non-Parametric Optimizer Search for Diverse Tasks [93.64739408827604]
We present the first efficient, scalable, and general framework that can directly search on the tasks of interest.
Inspired by the innate tree structure of the underlying math expressions, we re-arrange the spaces into a super-tree.
We adopt an adaptation of the Monte Carlo method for tree search, equipped with rejection sampling and equivalent-form detection.
arXiv Detail & Related papers (2022-09-27T17:51:31Z)
- Bayesian Optimization for auto-tuning GPU kernels [0.0]
Finding optimal parameter configurations for GPU kernels is a non-trivial exercise for large search spaces, even when automated.
We introduce a novel contextual exploration factor as well as new acquisition functions with improved scalability, combined with an informed function selection mechanism.
arXiv Detail & Related papers (2021-11-26T11:26:26Z)
- Fighting the curse of dimensionality: A machine learning approach to finding global optima [77.34726150561087]
This paper shows how to find global optima in structural optimization problems.
By exploiting certain cost functions we either obtain the global optimum at best or obtain superior results at worst when compared to established optimization procedures.
arXiv Detail & Related papers (2021-10-28T09:50:29Z)
- AutoSpace: Neural Architecture Search with Less Human Interference [84.42680793945007]
Current neural architecture search (NAS) algorithms still require expert knowledge and effort to design a search space for network construction.
We propose a novel differentiable evolutionary framework named AutoSpace, which evolves the search space to an optimal one.
With the learned search space, the performance of recent NAS algorithms can be improved significantly compared with using previously manually designed spaces.
arXiv Detail & Related papers (2021-03-22T13:28:56Z)
- Evolving Search Space for Neural Architecture Search [70.71153433676024]
We present a Neural Search-space Evolution (NSE) scheme that amplifies the results from the previous effort by maintaining an optimized search space subset.
We achieve 77.3% top-1 retrain accuracy on ImageNet with 333M FLOPs, yielding state-of-the-art performance.
Under a latency constraint, our result also outperforms the previous best-performing mobile models, with 77.9% top-1 retrain accuracy.
arXiv Detail & Related papers (2020-11-22T01:11:19Z)
- BOP-Elites, a Bayesian Optimisation algorithm for Quality-Diversity search [0.0]
We propose the Bayesian optimisation of Elites (BOP-Elites) algorithm.
By considering user-defined regions of the feature space as 'niches', our task is to find the optimal solution in each niche.
The resulting algorithm is effective both at identifying which parts of the search space belong to each niche in feature space and at finding the optimal solution within each niche.
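The niche bookkeeping behind this kind of quality-diversity search can be sketched with random search standing in for the Bayesian acquisition; the toy objective, the feature descriptor, and the partition below are illustrative assumptions, not BOP-Elites itself.

```python
import numpy as np

rng = np.random.default_rng(1)

def objective(x):
    return -(x - 0.2) ** 2  # toy objective to maximize

def feature(x):
    return x  # feature descriptor; here just the input itself

# User-defined niches: a partition of the feature space into 4 intervals.
niche_edges = np.linspace(-1.0, 1.0, 5)

# Keep the best solution ("elite") seen so far in each niche.
elites = {}
for x in rng.uniform(-1.0, 1.0, size=500):
    niche = int(np.digitize(feature(x), niche_edges[1:-1]))
    value = objective(x)
    if niche not in elites or value > elites[niche][1]:
        elites[niche] = (x, value)

for niche in sorted(elites):
    x, value = elites[niche]
    print(f"niche {niche}: best x = {x:+.3f}, value = {value:.4f}")
```

Every niche ends up with its own elite, and the niche containing the global optimizer (0.2) holds the overall best value; a Bayesian variant would replace the uniform sampling with an acquisition function over a surrogate model.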
arXiv Detail & Related papers (2020-05-08T23:49:13Z)
- Projection & Probability-Driven Black-Box Attack [205.9923346080908]
Existing black-box attacks suffer from the need for excessive queries in the high-dimensional space.
We propose Projection & Probability-driven Black-box Attack (PPBA) to tackle this problem.
Our method requires at most 24% fewer queries with a higher attack success rate compared with state-of-the-art approaches.
arXiv Detail & Related papers (2020-05-08T03:37:50Z)
- On Bayesian Search for the Feasible Space Under Computationally Expensive Constraints [0.0]
We propose a novel acquisition function that combines the probability that a solution lies at the boundary between feasible and infeasible spaces.
Experiments confirmed the efficacy of the proposed function.
arXiv Detail & Related papers (2020-04-23T10:22:32Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.