Offline Model-Based Optimization via Normalized Maximum Likelihood Estimation
- URL: http://arxiv.org/abs/2102.07970v1
- Date: Tue, 16 Feb 2021 06:04:27 GMT
- Title: Offline Model-Based Optimization via Normalized Maximum Likelihood Estimation
- Authors: Justin Fu and Sergey Levine
- Abstract summary: We consider data-driven optimization problems where one must maximize a function given only queries at a fixed set of points.
This problem setting emerges in many domains where function evaluation is a complex and expensive process.
We propose a tractable approximation that allows us to scale our method to high-capacity neural network models.
- Score: 101.22379613810881
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: In this work we consider data-driven optimization problems where one must
maximize a function given only queries at a fixed set of points. This problem
setting emerges in many domains where function evaluation is a complex and
expensive process, such as in the design of materials, vehicles, or neural
network architectures. Because the available data typically only covers a small
manifold of the possible space of inputs, a principal challenge is to be able
to construct algorithms that can reason about uncertainty and
out-of-distribution values, since a naive optimizer can easily exploit an
estimated model to return adversarial inputs. We propose to tackle this problem
by leveraging the normalized maximum-likelihood (NML) estimator, which provides
a principled approach to handling uncertainty and out-of-distribution inputs.
While in the standard formulation NML is intractable, we propose a tractable
approximation that allows us to scale our method to high-capacity neural
network models. We demonstrate that our method can effectively optimize
high-dimensional design problems in a variety of disciplines such as chemistry,
biology, and materials engineering.
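To make the NML idea concrete, below is a minimal sketch of conditional NML over a discretized label grid, using a ridge-regression model in place of the paper's high-capacity networks; the function names, the Gaussian likelihood, and all hyperparameters are illustrative stand-ins, not the paper's implementation.

```python
# Conditional NML (CNML) sketch: for each candidate label, refit the model on
# the dataset augmented with (x_query, candidate), score the candidate under
# that refit model, then normalize across candidates. Far from the training
# data, the resulting distribution flattens, signaling uncertainty.
import numpy as np

def fit_ridge(X, y, lam=1e-2):
    """Ridge regression weights: (X^T X + lam I)^-1 X^T y."""
    d = X.shape[1]
    return np.linalg.solve(X.T @ X + lam * np.eye(d), X.T @ y)

def cnml_scores(X, y, x_query, y_grid, noise_std=1.0):
    likelihoods = []
    for y_cand in y_grid:
        X_aug = np.vstack([X, x_query])        # augment data with the query
        y_aug = np.append(y, y_cand)           # ...paired with this candidate
        w = fit_ridge(X_aug, y_aug)            # refit once per candidate
        pred = x_query @ w
        # Gaussian likelihood of the candidate under the refit model.
        likelihoods.append(np.exp(-0.5 * ((y_cand - pred) / noise_std) ** 2))
    likelihoods = np.array(likelihoods)
    return likelihoods / likelihoods.sum()     # normalize over candidates

# Toy usage: a flat output means the query is effectively out-of-distribution.
rng = np.random.default_rng(0)
X = rng.normal(size=(32, 3))
y = X @ np.array([1.0, -2.0, 0.5]) + 0.1 * rng.normal(size=32)
print(cnml_scores(X, y, rng.normal(size=3), np.linspace(-4, 4, 9)))
```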
Related papers
- Functional Graphical Models: Structure Enables Offline Data-Driven Optimization [111.28605744661638]
We show how structure can enable sample-efficient data-driven optimization.
We also present a data-driven optimization algorithm that infers the FGM structure itself.
arXiv Detail & Related papers (2024-01-08T22:33:14Z) - A Learning-Based Optimal Uncertainty Quantification Method and Its
Application to Ballistic Impact Problems [1.713291434132985]
This paper concerns the optimal (supremum and infimum) uncertainty bounds for systems where the input (or prior) measure is only partially/imperfectly known.
We demonstrate the learning-based framework on the uncertainty optimization problem.
We show that the approach can be used to construct maps for the performance certificate and safety in engineering practice.
arXiv Detail & Related papers (2022-12-28T14:30:53Z) - Estimating a potential without the agony of the partition function [5.994412766684842]
Estimating a Gibbs density function given a sample is an important problem in computational statistics and statistical learning.
We propose an alternative approach based on Maximum A-Posteriori (MAP) estimators.
arXiv Detail & Related papers (2022-08-19T16:27:02Z) - RoMA: Robust Model Adaptation for Offline Model-based Optimization [115.02677045518692]
We consider the problem of searching for an input that maximizes a black-box objective function, given a static dataset of input-output queries.
A popular approach to solving this problem is maintaining a proxy model that approximates the true objective function.
Here, the main challenge is how to avoid adversarially optimized inputs during the search.
arXiv Detail & Related papers (2021-10-27T05:37:12Z) - Data-informed Deep Optimization [3.331457049134526]
- Data-informed Deep Optimization [3.331457049134526]
We propose a data-informed deep optimization (DiDo) approach to solve high-dimensional design problems.
We use a deep neural network (DNN) to learn the feasible region and to sample feasible points for fitting the objective function.
Our results indicate that the DiDo approach empowered by DNN is flexible and promising for solving general high-dimensional design problems in practice.
arXiv Detail & Related papers (2021-07-17T02:53:54Z) - Conservative Objective Models for Effective Offline Model-Based
Optimization [78.19085445065845]
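A rough sketch in the spirit of the DiDo recipe above, assuming labeled feasible/infeasible points are available; a k-nearest-neighbor vote stands in for the paper's DNN models of the feasible region and the objective.

```python
# Learn the feasible region from labels, sample candidates, keep those the
# feasibility model accepts, then rank survivors with the objective model.
import numpy as np

rng = np.random.default_rng(2)
X = rng.uniform(-2, 2, size=(400, 2))
feasible = (np.linalg.norm(X, axis=1) < 1.5).astype(float)   # known labels
score = -((X[:, 0] - 1.0) ** 2) - X[:, 1] ** 2               # objective data

def knn_predict(Xtr, ytr, q, k=15):
    """Average the labels of the k nearest training points."""
    idx = np.argsort(np.linalg.norm(Xtr - q, axis=1))[:k]
    return ytr[idx].mean()

cands = rng.uniform(-3, 3, size=(2000, 2))
kept = [c for c in cands if knn_predict(X, feasible, c) > 0.5]
best = max(kept, key=lambda c: knn_predict(X, score, c))
print("proposed feasible candidate:", best)   # lands near the true optimum
```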
- Conservative Objective Models for Effective Offline Model-Based Optimization [78.19085445065845]
Computational design problems arise in a number of settings, from synthetic biology to computer architectures.
We propose a method that learns a model of the objective function that lower bounds the actual value of the ground-truth objective on out-of-distribution inputs.
Conservative objective models (COMs) are simple to implement and outperform a number of existing methods on a wide range of model-based optimization (MBO) problems.
arXiv Detail & Related papers (2021-07-14T17:55:28Z) - Amortized Conditional Normalized Maximum Likelihood: Reliable Out of
Distribution Uncertainty Estimation [99.92568326314667]
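A stripped-down take on the conservative training idea behind COMs, using a quadratic-feature model so all gradients are explicit; the inner ascent, penalty weight, and learning rates are illustrative choices, not the paper's exact objective.

```python
# Alternate between (a) gradient-ascending the current model from a data
# point to find an input where it is over-optimistic, and (b) fitting the
# data while pushing the model's value DOWN at that adversarial input.
import numpy as np

rng = np.random.default_rng(3)
X = rng.normal(scale=0.5, size=(64, 1))
y = -X[:, 0] ** 2 + 0.05 * rng.normal(size=64)       # noisy true objective

def phi(x):                                          # features [x, x^2, 1]
    return np.array([x, x ** 2, 1.0])

w = np.zeros(3)
Phi = np.stack([phi(x) for x in X[:, 0]])
for step in range(2000):
    x_adv = float(X[rng.integers(64), 0])
    for _ in range(10):
        x_adv += 0.1 * (w[0] + 2 * w[1] * x_adv)     # ascend d f_w / dx
    grad = 2 * Phi.T @ (Phi @ w - y) / 64 + 0.5 * phi(x_adv)
    w -= 0.01 * grad

print("fitted w:", w)   # value at the model's own maximizer is pushed down
```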
- Amortized Conditional Normalized Maximum Likelihood: Reliable Out of Distribution Uncertainty Estimation [99.92568326314667]
We propose the amortized conditional normalized maximum likelihood (ACNML) method as a scalable general-purpose approach for uncertainty estimation.
Our algorithm builds on the conditional normalized maximum likelihood (CNML) coding scheme, which has minimax optimal properties according to the minimum description length principle.
We demonstrate that ACNML compares favorably to a number of prior techniques for uncertainty estimation in terms of calibration on out-of-distribution inputs.
arXiv Detail & Related papers (2020-11-05T08:04:34Z) - Model Inversion Networks for Model-Based Optimization [110.24531801773392]
We propose model inversion networks (MINs), which learn an inverse mapping from scores to inputs.
MINs can scale to high-dimensional input spaces and leverage offline logged data for both contextual and non-contextual optimization problems.
We evaluate MINs on tasks from the Bayesian optimization literature, high-dimensional model-based optimization problems over images and protein designs, and contextual bandit optimization from logged data.
arXiv Detail & Related papers (2019-12-31T18:06:49Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences of its use.