Neural Process for Black-Box Model Optimization Under Bayesian Framework
- URL: http://arxiv.org/abs/2104.02487v1
- Date: Sat, 3 Apr 2021 23:35:26 GMT
- Title: Neural Process for Black-Box Model Optimization Under Bayesian Framework
- Authors: Zhongkai Shangguan and Lei Lin and Wencheng Wu and Beilei Xu
- Abstract summary: Black-box models are so named because they can only be viewed in terms of inputs and outputs, without knowledge of their internal workings.
One powerful algorithm for solving such problems is Bayesian optimization, which can effectively estimate the model parameters that lead to the best performance.
However, it has been challenging for the Gaussian Process (GP), the standard surrogate, to optimize black-box models that need to query many observations and/or have many parameters.
We propose a general Bayesian optimization algorithm that employs a Neural Process as the surrogate model to perform black-box model optimization.
- Score: 7.455546102930911
- License: http://creativecommons.org/publicdomain/zero/1.0/
- Abstract: There are a large number of optimization problems in physical models where
the relationships between model parameters and outputs are unknown or hard to
track. These models are generally called black-box models because they can
only be viewed in terms of inputs and outputs, without knowledge of their
internal workings. Optimizing black-box model parameters has become
increasingly expensive and time consuming as the models have grown more complex.
Hence, developing effective and efficient black-box model optimization
algorithms has become an important task. One powerful algorithm for solving such
problems is Bayesian optimization, which can effectively estimate the model
parameters that lead to the best performance, and the Gaussian Process (GP) has
been one of the most widely used surrogate models in Bayesian optimization.
However, the time complexity of GP scales cubically with respect to the number
of observed model outputs, and GP does not scale well with large parameter
dimension either. Consequently, it has been challenging for GP to optimize
black-box models that need to query many observations and/or have many
parameters. To overcome the drawbacks of GP, in this study, we propose a
general Bayesian optimization algorithm that employs a Neural Process (NP) as
the surrogate model to perform black-box model optimization, namely, Neural
Process for Bayesian Optimization (NPBO). In order to validate the benefits of
NPBO, we compare NPBO with four benchmark approaches on a power system
parameter optimization problem and a series of seven benchmark Bayesian
optimization problems. The results show that the proposed NPBO performs better
than the other four benchmark approaches on the power system parameter
optimization problem and competitively on the seven benchmark problems.
Related papers
- End-to-End Learning for Fair Multiobjective Optimization Under Uncertainty [55.04219793298687]
The Predict-Then-Optimize (PtO) paradigm in machine learning aims to maximize downstream decision quality.
This paper extends the PtO methodology to optimization problems with nondifferentiable Ordered Weighted Averaging (OWA) objectives.
It shows how optimization of OWA functions can be effectively integrated with parametric prediction for fair and robust optimization under uncertainty.
arXiv Detail & Related papers (2024-02-12T16:33:35Z)
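For concreteness, the OWA objective this entry refers to is a sorted weighted sum; the snippet below shows the textbook definition (not the paper's code) and why it is nondifferentiable: the sort permutation changes wherever outcomes tie.

```python
# The Ordered Weighted Averaging objective in its textbook form (general
# definition, not the paper's code): outcomes are sorted and combined with
# fixed weights, so putting large weights on the smallest outcomes protects
# the worst-off. The sort makes the objective piecewise linear and
# nondifferentiable wherever two outcomes tie.
import numpy as np

def owa(outcomes, weights):
    """OWA_w(y) = sum_i w_i * y_(i), where y_(1) <= ... <= y_(n)."""
    return float(np.sort(outcomes) @ np.asarray(weights))

y = np.array([3.0, 9.0, 6.0])
print(owa(y, [0.6, 0.3, 0.1]))   # 4.5: emphasizes the worst outcome
print(owa(y, [1/3, 1/3, 1/3]))   # 6.0: uniform weights recover the mean
```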
- Poisson Process for Bayesian Optimization [126.51200593377739]
We propose a ranking-based surrogate model based on the Poisson process and introduce an efficient BO framework, namely Poisson Process Bayesian Optimization (PoPBO).
Compared to the classic GP-BO method, our PoPBO has lower costs and better robustness to noise, as verified by extensive experiments.
arXiv Detail & Related papers (2024-02-05T02:54:50Z)
- Polynomial-Model-Based Optimization for Blackbox Objectives [0.0]
Black-box optimization seeks to find optimal parameters for systems such that a pre-defined objective function is minimized.
PMBO is a novel black-box optimizer that finds the minimum by fitting a polynomial surrogate to the objective function.
PMBO is benchmarked against other state-of-the-art algorithms on a given set of artificial, analytical functions.
arXiv Detail & Related papers (2023-09-01T14:11:03Z)
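A generic polynomial-surrogate step in the spirit of this entry (illustrative only; the paper's model construction and sampling strategy differ in detail) looks like:

```python
# Fit a polynomial to the points evaluated so far, then take the
# surrogate's minimizer on the search interval as the next query point.
# Illustrative sketch, not PMBO's actual algorithm.
import numpy as np

def blackbox(x):
    return np.sin(3.0 * x) + 0.5 * x ** 2   # stand-in analytical objective

xs = np.linspace(-2.0, 2.0, 7)              # designs evaluated so far
ys = blackbox(xs)

coeffs = np.polyfit(xs, ys, deg=4)          # degree-4 polynomial surrogate
grid = np.linspace(-2.0, 2.0, 2001)
x_next = grid[np.argmin(np.polyval(coeffs, grid))]
print("next query:", x_next, "true value there:", blackbox(x_next))
```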
- Predictive Modeling through Hyper-Bayesian Optimization [60.586813904500595]
We propose a novel way of integrating model selection and BO for the single goal of reaching the function optima faster.
The algorithm moves back and forth between BO in the model space and BO in the function space, where the goodness of the recommended model is captured.
In addition to improved sample efficiency, the framework outputs information about the black-box function.
arXiv Detail & Related papers (2023-08-01T04:46:58Z)
- Model-based Causal Bayesian Optimization [78.120734120667]
We propose model-based causal Bayesian optimization (MCBO).
MCBO learns a full system model instead of only modeling intervention-reward pairs.
Unlike in standard Bayesian optimization, our acquisition function cannot be evaluated in closed form.
arXiv Detail & Related papers (2022-11-18T14:28:21Z)
- Bayesian Optimization over Permutation Spaces [30.650753803587794]
We propose and evaluate two algorithms for BO over Permutation Spaces (BOPS).
We theoretically analyze the performance of BOPS-T to show that its regret grows sub-linearly.
Our experiments on multiple synthetic and real-world benchmarks show that both BOPS-T and BOPS-H perform better than the state-of-the-art BO algorithm for combinatorial spaces.
arXiv Detail & Related papers (2021-12-02T08:20:50Z)
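For intuition on how a surrogate can operate on permutations at all, the snippet below builds a Mallows-type kernel from Kendall-tau distance, a standard positive-definite choice for permutation spaces in this literature; whether it matches the exact kernel in BOPS is not claimed here.

```python
# How a GP-style surrogate can measure similarity between permutations:
# a Mallows-type kernel built from Kendall-tau distance (number of item
# pairs ordered differently). Illustrative construction, not necessarily
# the kernel used by BOPS.
import itertools
import math

def kendall_distance(p, q):
    """Number of item pairs that p and q order differently."""
    pos_p = {item: i for i, item in enumerate(p)}
    pos_q = {item: i for i, item in enumerate(q)}
    return sum(
        (pos_p[a] < pos_p[b]) != (pos_q[a] < pos_q[b])
        for a, b in itertools.combinations(p, 2)
    )

def mallows_kernel(p, q, lam=0.5):
    """Similar permutations map to values near 1, dissimilar near 0."""
    return math.exp(-lam * kendall_distance(p, q))

p, q = (0, 1, 2, 3), (1, 0, 2, 3)           # q is p with one adjacent swap
print(kendall_distance(p, q), mallows_kernel(p, q))   # 1  0.606...
```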
- Non-smooth Bayesian Optimization in Tuning Problems [5.768843113172494]
Building surrogate models is one common approach when we attempt to learn unknown black-box functions.
We propose a novel additive Gaussian process model called clustered Gaussian process (cGP), where the additive components are induced by clustering.
In the examples we studied, the performance can be improved by as much as 90% across repeated experiments.
arXiv Detail & Related papers (2021-09-15T20:22:09Z)
- Conservative Objective Models for Effective Offline Model-Based Optimization [78.19085445065845]
Computational design problems arise in a number of settings, from synthetic biology to computer architectures.
We propose a method that learns a model of the objective function that lower bounds the actual value of the ground-truth objective on out-of-distribution inputs.
Conservative objective models (COMs) are simple to implement and outperform a number of existing methods on a wide range of model-based optimization (MBO) problems.
arXiv Detail & Related papers (2021-07-14T17:55:28Z)
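The sketch below illustrates the conservative-lower-bound idea this entry describes, assuming a simple gradient-ascent adversary and a toy 1-D dataset; it shows the general principle, not the paper's implementation.

```python
# Toy illustration of the conservatism idea: fit a regressor to offline
# data while pushing its predictions DOWN on inputs that gradient ascent
# on the model itself flags as likely over-estimated, and UP on the data.
import torch
import torch.nn as nn

def adversarial_inputs(model, x_start, steps=5, lr=0.05):
    """Gradient-ascend the model's output to find over-estimated designs."""
    x = x_start.clone().detach().requires_grad_(True)
    for _ in range(steps):
        (grad,) = torch.autograd.grad(model(x).sum(), x)
        x = (x + lr * grad).detach().requires_grad_(True)
    return x.detach()

# Offline dataset from a toy design problem (maximize y = -(x - 0.3)^2),
# observed only on a narrow slice of the input space.
x_data = torch.rand(256, 1) * 0.5
y_data = -(x_data - 0.3) ** 2

model = nn.Sequential(nn.Linear(1, 64), nn.ReLU(), nn.Linear(64, 1))
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
alpha = 0.5                                    # conservatism strength

for step in range(300):
    x_adv = adversarial_inputs(model, x_data)  # likely over-estimated inputs
    mse = ((model(x_data) - y_data) ** 2).mean()
    conservatism = model(x_adv).mean() - model(x_data).mean()
    loss = mse + alpha * conservatism
    opt.zero_grad()
    loss.backward()
    opt.step()
# A design is then selected by maximizing the (conservative) learned model.
```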
- Bayesian Optimization for Selecting Efficient Machine Learning Models [53.202224677485525]
We present a unified Bayesian Optimization framework for jointly optimizing models for both prediction effectiveness and training efficiency.
Experiments on model selection for recommendation tasks indicate that models selected this way significantly improve model training efficiency.
arXiv Detail & Related papers (2020-08-02T02:56:30Z)
- Bayesian Optimization for Policy Search in High-Dimensional Systems via Automatic Domain Selection [1.1240669509034296]
We propose to leverage results from optimal control to scale BO to higher dimensional control tasks.
We show how we can make use of a learned dynamics model in combination with a model-based controller to simplify the BO problem.
We present an experimental evaluation on real hardware, as well as simulated tasks including a 48-dimensional policy for a quadcopter.
arXiv Detail & Related papers (2020-01-21T09:04:15Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the list (including all information) and is not responsible for any consequences of its use.