Enhanced Bayesian Optimization via Preferential Modeling of Abstract
Properties
- URL: http://arxiv.org/abs/2402.17343v1
- Date: Tue, 27 Feb 2024 09:23:13 GMT
- Authors: Arun Kumar A V, Alistair Shilton, Sunil Gupta, Santu Rana, Stewart
Greenhill, Svetha Venkatesh
- Abstract summary: We propose a human-AI collaborative Bayesian framework to incorporate expert preferences about unmeasured abstract properties into surrogate modeling.
We provide an efficient strategy that can also handle any incorrect/misleading expert bias in preferential judgments.
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Experimental (design) optimization is a key driver in designing and
discovering new products and processes. Bayesian Optimization (BO) is an
effective tool for optimizing expensive and black-box experimental design
processes. While Bayesian optimization is a principled data-driven approach to
experimental optimization, it learns everything from scratch and could greatly
benefit from the expertise of its human (domain) experts who often reason about
systems at different abstraction levels using physical properties that are not
necessarily directly measured (or measurable). In this paper, we propose a
human-AI collaborative Bayesian framework to incorporate expert preferences
about unmeasured abstract properties into the surrogate modeling to further
boost the performance of BO. We provide an efficient strategy that can also
handle any incorrect/misleading expert bias in preferential judgments. We
discuss the convergence behavior of our proposed framework. Our experimental
results involving synthetic functions and real-world datasets show the
superiority of our method against the baselines.
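To make the setting concrete, here is a minimal sketch of a standard Bayesian optimization loop (Gaussian-process surrogate plus a UCB acquisition on a candidate grid). This is a generic illustration, not the paper's framework: the names `rbf`, `gp_posterior`, and `bo_maximize` are hypothetical, and the paper's contribution would additionally fit a preference model over unmeasured abstract properties and fold it into the surrogate, which is omitted here.

```python
import numpy as np

def rbf(a, b, ls=0.2):
    # Squared-exponential kernel between two 1-D point sets.
    d = a[:, None] - b[None, :]
    return np.exp(-0.5 * (d / ls) ** 2)

def gp_posterior(X, y, Xs, noise=1e-4):
    # Standard zero-mean GP regression: posterior mean and std at Xs.
    K = rbf(X, X) + noise * np.eye(len(X))
    Ks = rbf(Xs, X)
    mu = Ks @ np.linalg.solve(K, y)
    cov = rbf(Xs, Xs) - Ks @ np.linalg.solve(K, Ks.T)
    return mu, np.sqrt(np.clip(np.diag(cov), 0.0, None))

def bo_maximize(f, n_iters=15, seed=0):
    # Vanilla BO loop on [0, 1]; the surrogate here learns everything
    # from scratch -- the point where expert knowledge could help.
    rng = np.random.default_rng(seed)
    Xs = np.linspace(0.0, 1.0, 201)      # candidate grid
    X = rng.uniform(0.0, 1.0, 3)         # small initial design
    y = np.array([f(x) for x in X])
    for _ in range(n_iters):
        mu, sd = gp_posterior(X, y, Xs)
        x_next = Xs[np.argmax(mu + 2.0 * sd)]   # UCB acquisition
        X = np.append(X, x_next)
        y = np.append(y, f(x_next))
    return X[np.argmax(y)], y.max()

# Toy black-box objective with its maximum at x = 0.3.
x_best, y_best = bo_maximize(lambda x: -(x - 0.3) ** 2)
```

In the paper's setting, the expert's pairwise preferences about abstract properties would inform the surrogate (e.g., through its mean or an auxiliary preference likelihood), so fewer expensive evaluations are wasted on regions the expert already knows to be poor.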
Related papers
- Constrained Multi-objective Bayesian Optimization through Optimistic Constraints Estimation
CMOBO balances learning of the feasible region with multi-objective optimization within the feasible region in a principled manner.
We provide both theoretical justification and empirical evidence, demonstrating the efficacy of our approach on various synthetic benchmarks and real-world applications.
arXiv Detail & Related papers (2024-11-06T03:38:00Z) - Implicitly Guided Design with PropEn: Match your Data to Follow the Gradient
PropEn is inspired by 'matching', which enables implicit guidance without training a discriminator.
We show that training with a matched dataset approximates the gradient of the property of interest while remaining within the data distribution.
arXiv Detail & Related papers (2024-05-28T11:30:19Z) - Human-Algorithm Collaborative Bayesian Optimization for Engineering Systems
We reintroduce the human into the data-driven decision-making loop by outlining an approach for collaborative Bayesian optimization.
Our methodology exploits the hypothesis that humans are more efficient at making discrete choices rather than continuous ones.
We demonstrate our approach across a number of applied and numerical case studies including bioprocess optimization and reactor geometry design.
arXiv Detail & Related papers (2024-04-16T23:17:04Z) - A General Framework for User-Guided Bayesian Optimization
We propose ColaBO, the first Bayesian-principled framework for prior beliefs beyond the typical kernel structure.
We empirically demonstrate ColaBO's ability to substantially accelerate optimization when the prior information is accurate, and to retain approximately default performance when it is misleading.
arXiv Detail & Related papers (2023-11-24T18:27:26Z) - Design Amortization for Bayesian Optimal Experimental Design
We build off of successful variational approaches, which optimize a parameterized variational model with respect to bounds on the expected information gain (EIG).
We present a novel neural architecture that allows experimenters to optimize a single variational model that can estimate the EIG for potentially infinitely many designs.
arXiv Detail & Related papers (2022-10-07T02:12:34Z) - Bayesian Optimization for Selecting Efficient Machine Learning Models
We present a unified Bayesian Optimization framework for jointly optimizing models for both prediction effectiveness and training efficiency.
Experiments on model selection for recommendation tasks indicate that models selected this way significantly improve model training efficiency.
arXiv Detail & Related papers (2020-08-02T02:56:30Z) - Incorporating Expert Prior Knowledge into Experimental Design via
Posterior Sampling
Experimenters often have prior knowledge about the location of the global optimum.
It has been unclear how to incorporate such expert knowledge about the global optimum into Bayesian optimization.
An efficient Bayesian optimization approach is proposed via posterior sampling from the posterior distribution of the global optimum.
arXiv Detail & Related papers (2020-02-26T01:57:36Z) - Scalable Constrained Bayesian Optimization
The global optimization of a high-dimensional black-box function under black-box constraints is a pervasive task in machine learning, control, and the scientific community.
We propose the scalable constrained Bayesian optimization (SCBO) algorithm that overcomes the above challenges and advances the state of the art.
arXiv Detail & Related papers (2020-02-20T01:48:46Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences.