$\{\text{PF}\}^2\text{ES}$: Parallel Feasible Pareto Frontier Entropy Search for Multi-Objective Bayesian Optimization Under Unknown Constraints
- URL: http://arxiv.org/abs/2204.05411v1
- Date: Mon, 11 Apr 2022 21:06:23 GMT
- Title: $\{\text{PF}\}^2\text{ES}$: Parallel Feasible Pareto Frontier Entropy Search for Multi-Objective Bayesian Optimization Under Unknown Constraints
- Authors: Jixiang Qing, Henry B. Moss, Tom Dhaene, Ivo Couckuyt
- Abstract summary: We present a novel information-theoretic acquisition function for multi-objective Bayesian optimization.
$\{\text{PF}\}^2$ES provides a low cost and accurate estimate of the mutual information for the parallel setting.
We benchmark $\{\text{PF}\}^2$ES across synthetic and real-life problems.
- Score: 4.672142224503371
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: We present Parallel Feasible Pareto Frontier Entropy Search
($\{\text{PF}\}^2$ES) -- a novel information-theoretic acquisition function for
multi-objective Bayesian optimization. Although information-theoretic
approaches regularly provide state-of-the-art optimization, they are not yet
widely used in the context of constrained multi-objective optimization. Due to
the complexity of characterizing mutual information between candidate
evaluations and (feasible) Pareto frontiers, existing approaches must employ
severe approximations that significantly hamper their performance. By instead
using a variational lower bound, $\{\text{PF}\}^2$ES provides a low cost and
accurate estimate of the mutual information for the parallel setting (where
multiple evaluations must be chosen for each optimization step). Moreover, we
are able to interpret our proposed acquisition function by exploring direct
links with other popular multi-objective acquisition functions. We benchmark
$\{\text{PF}\}^2$ES across synthetic and real-life problems, demonstrating its
competitive performance for batch optimization on problems including vehicle
and electronic filter design.
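The key construction above is to sidestep the intractable mutual information between a candidate batch and the feasible Pareto frontier by averaging, over frontiers sampled from the GP posterior, the log-probability that the candidate is feasible and lies beyond the sampled frontier. The snippet below is a heavily simplified, single-candidate toy sketch of that flavour of estimator; the frontier sampler, the independent-Gaussian-marginal assumption, and the feasibility probability passed in are illustrative placeholders, not the authors' batch estimator.

```python
import numpy as np

rng = np.random.default_rng(0)

def sample_pareto_front(n_points=20):
    # Placeholder for a feasible Pareto frontier sampled from the GP posterior
    # (in practice obtained by optimizing posterior samples of the objectives).
    t = np.sort(rng.uniform(0.0, 1.0, n_points))
    return np.stack([t, 1.0 - t], axis=1)  # toy 2-objective front (minimization)

def prob_not_dominated(mu, sigma, front, n_mc=256):
    """Monte Carlo estimate of P(candidate's objectives are not dominated by
    `front`), assuming independent Gaussian marginals for the objectives."""
    y = mu + sigma * rng.standard_normal((n_mc, mu.size))
    dominated = np.any(np.all(front[None, :, :] <= y[:, None, :], axis=2), axis=1)
    return np.mean(~dominated)

def pf2es_like_score(mu_obj, sigma_obj, p_feasible, fronts, eps=1e-12):
    """Crude analogue of a variational lower bound on the mutual information:
    average log-probability of being feasible and beyond each sampled frontier."""
    probs = np.array([prob_not_dominated(mu_obj, sigma_obj, f) for f in fronts])
    return float(np.mean(np.log(p_feasible * probs + eps)))

# Score two hypothetical candidates from their posterior summaries.
fronts = [sample_pareto_front() for _ in range(10)]
good = pf2es_like_score(np.array([0.3, 0.3]), np.array([0.1, 0.1]), 0.9, fronts)
bad = pf2es_like_score(np.array([0.9, 0.9]), np.array([0.1, 0.1]), 0.9, fronts)
print(good, bad)  # the candidate likely to improve the front scores higher
```

The actual acquisition handles batches of candidates jointly and derives the feasibility and non-dominance probabilities from the objective and constraint models rather than taking them as inputs; the sketch only illustrates why frontier samples plus a dominance check give a cheap surrogate for the information gain.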
Related papers
- $f$-PO: Generalizing Preference Optimization with $f$-divergence Minimization [91.43730624072226]
$f$-PO is a novel framework that generalizes and extends existing preference optimization approaches via $f$-divergence minimization.
We conduct experiments on state-of-the-art language models using benchmark datasets.
arXiv Detail & Related papers (2024-10-29T02:11:45Z) - Memory-Efficient Gradient Unrolling for Large-Scale Bi-level Optimization [71.35604981129838]
Traditional gradient-based bi-level optimization algorithms are ill-suited to meet the demands of large-scale applications.
We introduce $(\text{FG})^2\text{U}$, which achieves an unbiased approximation of the meta gradient for bi-level optimization.
$(\text{FG})^2\text{U}$ is inherently designed to support parallel computing, enabling it to effectively leverage large-scale distributed computing systems.
arXiv Detail & Related papers (2024-06-20T08:21:52Z) - End-to-End Learning for Fair Multiobjective Optimization Under Uncertainty [55.04219793298687]
The Predict-Then-Optimize (PtO) paradigm in machine learning aims to maximize downstream decision quality.
This paper extends the PtO methodology to optimization problems with nondifferentiable Ordered Weighted Averaging (OWA) objectives.
It shows how optimization of OWA functions can be effectively integrated with parametric prediction for fair and robust optimization under uncertainty.
arXiv Detail & Related papers (2024-02-12T16:33:35Z) - qPOTS: Efficient batch multiobjective Bayesian optimization via Pareto optimal Thompson sampling [0.0]
A sample-efficient approach to solving multiobjective optimization is via Gaussian process (GP) surrogates and multiobjective Bayesian optimization (MOBO).
We propose a Thompson sampling (TS) based approach ($q\texttt{POTS}$).
$q\texttt{POTS}$ solves a cheap multiobjective optimization on the GP posteriors with evolutionary approaches (a simplified sketch of this Thompson-sampling idea appears after this list).
arXiv Detail & Related papers (2023-10-24T12:35:15Z) - Joint Entropy Search for Multi-objective Bayesian Optimization [0.0]
We propose a novel information-theoretic acquisition function for BO called Joint Entropy Search.
We showcase the effectiveness of this new approach on a range of synthetic and real-world problems in terms of the hypervolume and its weighted variants.
arXiv Detail & Related papers (2022-10-06T13:19:08Z) - Optimizer Amalgamation [124.33523126363728]
We are motivated to study a new problem named Optimizer Amalgamation: how can we best combine a pool of "teacher" optimizers into a single "student" optimizer that can have stronger problem-specific performance?
First, we define three differentiable mechanisms to amalgamate a pool of analytical optimizers by gradient descent.
To reduce variance, we also explore methods to stabilize the amalgamation process by perturbing the amalgamation target.
arXiv Detail & Related papers (2022-03-12T16:07:57Z) - Leveraging Trust for Joint Multi-Objective and Multi-Fidelity Optimization [0.0]
This paper investigates a novel approach to Bayesian multi-objective and multi-fidelity (MOMF) optimization.
We suggest the innovative use of a trust metric to support simultaneous optimization of multiple objectives and data sources.
Our methods offer broad applicability in solving simulation problems in fields such as plasma physics and fluid dynamics.
arXiv Detail & Related papers (2021-12-27T20:55:26Z) - Multi-Fidelity Multi-Objective Bayesian Optimization: An Output Space Entropy Search Approach [44.25245545568633]
We study the novel problem of blackbox optimization of multiple objectives via multi-fidelity function evaluations.
Our experiments on several synthetic and real-world benchmark problems show that MF-OSEMO, with both approximations, significantly improves over the state-of-the-art single-fidelity algorithms.
arXiv Detail & Related papers (2020-11-02T06:59:04Z) - Information-Theoretic Multi-Objective Bayesian Optimization with Continuous Approximations [44.25245545568633]
We propose Information-Theoretic Multi-Objective Bayesian Optimization with Continuous Approximations (iMOCA) to solve this problem.
Our experiments on diverse synthetic and real-world benchmarks show that iMOCA significantly improves over existing single-fidelity methods.
arXiv Detail & Related papers (2020-09-12T01:46:03Z)
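As referenced in the $q\texttt{POTS}$ entry above, the following is a minimal toy sketch of Thompson-sampling-based batch selection for multiobjective Bayesian optimization over a discrete candidate set; the Gaussian-marginal posterior summaries and function names are illustrative assumptions, and the actual method optimizes continuous posterior samples with evolutionary algorithms rather than enumerating candidates.

```python
import numpy as np

rng = np.random.default_rng(1)

def pareto_mask(Y):
    """Boolean mask selecting the non-dominated rows of Y (minimization)."""
    mask = np.ones(Y.shape[0], dtype=bool)
    for i in range(Y.shape[0]):
        dominates_i = np.all(Y <= Y[i], axis=1) & np.any(Y < Y[i], axis=1)
        mask[i] = not np.any(dominates_i)
    return mask

def thompson_batch(post_mean, post_std, q=4):
    """Draw one joint posterior sample of the objectives (here: independent
    Gaussians on a discrete candidate set) and pick q points from its Pareto set."""
    Y = post_mean + post_std * rng.standard_normal(post_mean.shape)
    front_idx = np.flatnonzero(pareto_mask(Y))
    return rng.choice(front_idx, size=min(q, front_idx.size), replace=False)

# Toy usage: 100 candidates, 2 objectives, batch of 4.
mean = rng.uniform(0.0, 1.0, size=(100, 2))
std = np.full((100, 2), 0.05)
print(thompson_batch(mean, std, q=4))
```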
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences arising from its use.