Design-Bench: Benchmarks for Data-Driven Offline Model-Based
Optimization
- URL: http://arxiv.org/abs/2202.08450v1
- Date: Thu, 17 Feb 2022 05:33:27 GMT
- Title: Design-Bench: Benchmarks for Data-Driven Offline Model-Based
Optimization
- Authors: Brandon Trabucco, Xinyang Geng, Aviral Kumar, Sergey Levine
- Abstract summary: Black-box model-based optimization problems are ubiquitous in a wide range of domains, such as the design of proteins, DNA sequences, aircraft, and robots.
We present Design-Bench, a benchmark for offline MBO with a unified evaluation protocol and reference implementations of recent methods.
Our benchmark includes a suite of diverse and realistic tasks derived from real-world optimization problems in biology, materials science, and robotics.
- Score: 82.02008764719896
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Black-box model-based optimization (MBO) problems, where the goal is to find
a design input that maximizes an unknown objective function, are ubiquitous in
a wide range of domains, such as the design of proteins, DNA sequences,
aircraft, and robots. Solving model-based optimization problems typically
requires actively querying the unknown objective function on design proposals,
which means physically building the candidate molecule, aircraft, or robot,
testing it, and storing the result. This process can be expensive and time
consuming, and one might instead prefer to optimize for the best design using
only the data one already has. This setting -- called offline MBO -- poses
algorithmic challenges that are substantially different from those of the more
commonly studied online techniques. A number of recent works have demonstrated success with
offline MBO for high-dimensional optimization problems using high-capacity deep
neural networks. However, the lack of standardized benchmarks in this emerging
field is making progress difficult to track. To address this, we present
Design-Bench, a benchmark for offline MBO with a unified evaluation protocol
and reference implementations of recent methods. Our benchmark includes a suite
of diverse and realistic tasks derived from real-world optimization problems in
biology, materials science, and robotics that present distinct challenges for
offline MBO. Our benchmark and reference implementations are released at
github.com/rail-berkeley/design-bench and
github.com/rail-berkeley/design-baselines.
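To make the setting concrete, the sketch below shows the naive offline approach the abstract implies: fit a surrogate to a fixed dataset of (design, score) pairs and ascend its gradient, never querying the true objective during optimization. It is an illustrative, self-contained example, not the Design-Bench API; the synthetic objective, random-feature surrogate, and step sizes are assumptions.

```python
# Naive offline MBO sketch: learn a surrogate from a static dataset, then do
# gradient ascent on the surrogate only. Names and objective are illustrative.
import numpy as np

rng = np.random.default_rng(0)

def true_objective(x):
    # Hidden ground truth; in a real task this is the expensive experiment.
    return -np.sum((x - 0.5) ** 2, axis=-1)

# Static, pre-collected dataset: the only thing an offline method may use.
X = rng.uniform(-1.0, 1.0, size=(512, 8))
y = true_objective(X)

# Surrogate: ridge regression on random Fourier features (differentiable in x).
W = rng.normal(scale=2.0, size=(8, 256))
b = rng.uniform(0.0, 2 * np.pi, size=256)
Phi = np.cos(X @ W + b)
w = np.linalg.solve(Phi.T @ Phi + 1e-3 * np.eye(256), Phi.T @ y)

def surrogate_grad(x):
    # Gradient of phi(x) @ w with respect to the design x.
    return -(np.sin(x @ W + b) * w) @ W.T

# Gradient ascent on the surrogate, starting from the best design in the data.
x_opt = X[np.argmax(y)].copy()
for _ in range(100):
    x_opt += 0.01 * surrogate_grad(x_opt)

# Only at evaluation time is the proposal scored against the true objective.
print("best in dataset:", y.max(), "proposed design:", true_objective(x_opt))
```

A well-known difficulty with this baseline is that the surrogate can be over-optimistic far from the data; the conservative objective models entry among the related papers below targets exactly this failure mode.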
Related papers
- Cliqueformer: Model-Based Optimization with Structured Transformers [102.55764949282906]
We develop a model that learns the structure of an MBO task and empirically leads to improved designs.
We evaluate Cliqueformer on various tasks, ranging from high-dimensional black-box functions to real-world tasks of chemical and genetic design.
arXiv Detail & Related papers (2024-10-17T00:35:47Z)
- Offline Multi-Objective Optimization [23.543056729281695]
Offline optimization aims to maximize a black-box objective function using only a static dataset and has wide applications.
We propose a first benchmark for offline MOO, covering a range of problems from synthetic to real-world tasks.
Empirical results show improvements over the best value of the training set, demonstrating the effectiveness of offline MOO methods.
arXiv Detail & Related papers (2024-06-06T03:35:09Z)
- Reinforced In-Context Black-Box Optimization [64.25546325063272]
RIBBO is a method that uses reinforcement learning to learn a black-box optimization (BBO) algorithm from offline data in an end-to-end fashion.
RIBBO employs expressive sequence models to learn the optimization histories produced by multiple behavior algorithms and tasks.
Central to our method is augmenting the optimization histories with regret-to-go tokens, which represent the performance of an algorithm as the cumulative regret over the future part of the history.
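As an illustration of the regret-to-go idea, the snippet below computes such labels for a single optimization history under the natural reading of the sentence above: the token at step t is the regret, relative to a reference best value y_star, summed over the remaining steps. RIBBO's exact tokenization may differ; the function name and y_star are hypothetical.

```python
# Hypothetical regret-to-go labels for one optimization history: the token at
# step t sums (y_star - y_i) over the remaining steps i = t, ..., T.
import numpy as np

def regret_to_go(ys, y_star):
    """ys: observed objective values along a history; y_star: reference best."""
    regrets = y_star - np.asarray(ys, dtype=float)   # per-step regret
    return regrets[::-1].cumsum()[::-1]              # reverse cumulative sum

history = [0.2, 0.5, 0.4, 0.8]                        # illustrative score history
print(regret_to_go(history, y_star=1.0))              # [2.1 1.3 0.8 0.2]
```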
arXiv Detail & Related papers (2024-02-27T11:32:14Z)
- ROMO: Retrieval-enhanced Offline Model-based Optimization [14.277672372460785]
Data-driven black-box model-based optimization (MBO) problems arise in a number of practical application scenarios.
We propose retrieval-enhanced offline model-based optimization (ROMO).
ROMO is simple to implement and outperforms state-of-the-art approaches in the CoMBO setting.
arXiv Detail & Related papers (2023-10-11T15:04:33Z)
- Large-Batch, Iteration-Efficient Neural Bayesian Design Optimization [37.339567743948955]
We present a Bayesian optimization (BO) framework specifically tailored to large-batch, iteration-limited settings where standard BO struggles.
Our key contribution is a highly scalable, sample-based acquisition function that performs a non-dominated sorting of objectives.
We show that our acquisition function in combination with different Bayesian neural network surrogates is effective in data-intensive environments with a minimal number of iterations.
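The non-dominated sorting mentioned above is standard Pareto ranking. The snippet below is a generic sketch of that sub-step for maximization of all objectives; it is not the paper's acquisition function, and the function name and example values are illustrative.

```python
# Generic Pareto non-dominated sorting (maximizing every objective); this only
# illustrates the sorting sub-step, not the paper's acquisition function.
import numpy as np

def non_dominated_sort(F):
    """F: (n_points, n_objectives) array. Returns a list of index fronts."""
    remaining = set(range(F.shape[0]))
    fronts = []
    while remaining:
        front = [
            i for i in sorted(remaining)
            if not any(np.all(F[j] >= F[i]) and np.any(F[j] > F[i])
                       for j in remaining if j != i)
        ]
        fronts.append(front)
        remaining -= set(front)
    return fronts

F = np.array([[1.0, 2.0], [2.0, 1.0], [0.5, 0.5], [2.0, 2.0]])
print(non_dominated_sort(F))  # [[3], [0, 1], [2]]
```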
arXiv Detail & Related papers (2023-06-01T19:10:57Z)
- Generative Pretraining for Black-Box Optimization [29.64357898080842]
We propose BONET, a generative framework for pretraining a novel black-box optimizer on offline data.
In BONET, we train an autoregressive model on fixed-length trajectories derived from an offline dataset.
We evaluate BONET on Design-Bench, where it ranks best on average, outperforming state-of-the-art baselines.
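As a rough illustration of such trajectories, the sketch below derives fixed-length sequences from an offline dataset by sampling subsets and ordering them from worst to best score, so that each sequence resembles an improving optimization run. This is one plausible construction, not necessarily BONET's; the function name and hyperparameters are assumptions.

```python
# Hypothetical trajectory construction: sample subsets of the offline dataset
# and order each from worst to best score, so sequences resemble improving runs.
import numpy as np

def make_trajectories(X, y, traj_len=32, n_traj=100, seed=0):
    rng = np.random.default_rng(seed)
    trajectories = []
    for _ in range(n_traj):
        idx = rng.choice(len(X), size=traj_len, replace=False)
        order = idx[np.argsort(y[idx])]          # worst-to-best ordering
        trajectories.append((X[order], y[order]))
    return trajectories

X = np.random.default_rng(1).normal(size=(500, 8))   # stand-in offline designs
y = -np.sum(X ** 2, axis=1)                          # stand-in scores
trajs = make_trajectories(X, y)
print(len(trajs), trajs[0][0].shape)                 # 100 (32, 8)
```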
arXiv Detail & Related papers (2022-06-22T00:54:30Z)
- Conservative Objective Models for Effective Offline Model-Based Optimization [78.19085445065845]
Computational design problems arise in a number of settings, from synthetic biology to computer architectures.
We propose a method that learns a model of the objective function that lower bounds the actual value of the ground-truth objective on out-of-distribution inputs.
COMs are simple to implement and outperform a number of existing methods on a wide range of MBO problems.
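Below is a heavily simplified sketch of the conservative idea described above: fit a surrogate to the offline data while pushing its predictions down on designs obtained by gradient ascent on the surrogate itself, so out-of-distribution inputs are not over-estimated. The linear-in-random-features model, the way the out-of-distribution designs are generated, and the penalty weight alpha are all illustrative assumptions, not the authors' implementation.

```python
# Simplified conservative-surrogate sketch: regression loss plus a penalty that
# lowers predictions on designs produced by gradient ascent on the surrogate.
# Feature map, optimizer, and hyperparameters are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)
X = rng.uniform(-1.0, 1.0, size=(256, 4))
y = -np.sum((X - 0.3) ** 2, axis=1)               # stand-in offline dataset

W = rng.normal(scale=2.0, size=(4, 128))
b = rng.uniform(0.0, 2 * np.pi, size=128)
phi = lambda x: np.cos(x @ W + b)                  # fixed random features
theta = np.zeros(128)                              # learnable weights: f(x) = phi(x) @ theta

def grad_wrt_x(x, theta):
    # Gradient of phi(x) @ theta with respect to the design(s) x.
    return -(np.sin(x @ W + b) * theta) @ W.T

alpha, lr = 0.5, 1e-3
for step in range(500):
    # "Adversarial" out-of-distribution designs: ascend the current surrogate.
    x_ood = X[rng.choice(len(X), size=64)].copy()
    for _ in range(5):
        x_ood += 0.05 * grad_wrt_x(x_ood, theta)

    pred = phi(X) @ theta
    grad_fit = 2.0 * phi(X).T @ (pred - y) / len(X)            # regression term
    grad_cons = phi(x_ood).mean(axis=0) - phi(X).mean(axis=0)  # push down OOD predictions
    theta -= lr * (grad_fit + alpha * grad_cons)
```

Candidate designs would then typically be proposed by ascending this conservatively trained surrogate.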
arXiv Detail & Related papers (2021-07-14T17:55:28Z)
- JUMBO: Scalable Multi-task Bayesian Optimization using Offline Data [86.8949732640035]
We propose JUMBO, an MBO algorithm that sidesteps the limitations of standard approaches by leveraging offline data when querying additional data.
We show that it achieves no-regret under conditions analogous to GP-UCB.
Empirically, we demonstrate significant performance improvements over existing approaches on two real-world optimization problems.
arXiv Detail & Related papers (2021-06-02T05:03:38Z)
- Offline Model-Based Optimization via Normalized Maximum Likelihood Estimation [101.22379613810881]
We consider data-driven optimization problems where one must maximize a function given only queries at a fixed set of points.
This problem setting emerges in many domains where function evaluation is a complex and expensive process.
We propose a tractable approximation that allows us to scale our method to high-capacity neural network models.
arXiv Detail & Related papers (2021-02-16T06:04:27Z)