Diffusion Models for Black-Box Optimization
- URL: http://arxiv.org/abs/2306.07180v2
- Date: Mon, 21 Aug 2023 16:55:58 GMT
- Title: Diffusion Models for Black-Box Optimization
- Authors: Siddarth Krishnamoorthy, Satvik Mehul Mashkaria, Aditya Grover
- Abstract summary: We propose Denoising Diffusion Optimization Models (DDOM) for offline black-box optimization.
Given an offline dataset, DDOM learns a conditional generative model over the domain of the black-box function conditioned on the function values.
We show DDOM achieves results competitive with state-of-the-art baselines.
- Score: 29.64357898080842
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: The goal of offline black-box optimization (BBO) is to optimize an expensive
black-box function using a fixed dataset of function evaluations. Prior works
consider forward approaches that learn surrogates to the black-box function and
inverse approaches that directly map function values to corresponding points in
the input domain of the black-box function. These approaches are limited by the
quality of the offline dataset and the difficulty in learning one-to-many
mappings in high dimensions, respectively. We propose Denoising Diffusion
Optimization Models (DDOM), a new inverse approach for offline black-box
optimization based on diffusion models. Given an offline dataset, DDOM learns a
conditional generative model over the domain of the black-box function
conditioned on the function values. We investigate several design choices in
DDOM, such as re-weighting the dataset to focus on high function values and the
use of classifier-free guidance at test-time to enable generalization to
function values that can even exceed the dataset maxima. Empirically, we
conduct experiments on the Design-Bench benchmark and show that DDOM achieves
results competitive with state-of-the-art baselines.
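The two design choices named in the abstract, re-weighting the dataset toward high function values and classifier-free guidance at test time, can be sketched in isolation. This is a minimal illustration, not the paper's implementation: the exponential weighting in `reweight` and its `temperature` parameter are assumptions, and `cfg_combine` shows only the standard classifier-free-guidance combination of conditional and unconditional noise predictions.

```python
import math

def reweight(ys, temperature=1.0):
    """Exponential weights that emphasize high function values.

    Illustrative stand-in for DDOM's dataset re-weighting; the
    paper's exact scheme may differ.
    """
    m = max(ys)  # subtract the max before exponentiating, for stability
    w = [math.exp((y - m) / temperature) for y in ys]
    s = sum(w)
    return [x / s for x in w]

def cfg_combine(eps_cond, eps_uncond, guidance_scale):
    """Classifier-free guidance: extrapolate the conditional noise
    prediction away from the unconditional one."""
    return [(1 + guidance_scale) * c - guidance_scale * u
            for c, u in zip(eps_cond, eps_uncond)]
```

With `guidance_scale = 0` the combination reduces to the purely conditional prediction; larger values push samples further toward the conditioning target, which is the mechanism that lets a conditional model extrapolate to function values beyond the dataset maximum.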
Related papers
- Diff-BBO: Diffusion-Based Inverse Modeling for Black-Box Optimization [20.45482366024264]
Black-box optimization (BBO) aims to optimize an objective function by iteratively querying a black-box oracle in a sample-efficient way.
Recent inverse modeling approaches that map objective space to the design space with conditional diffusion models have demonstrated impressive capability in learning the data manifold.
We propose Diff-BBO, an inverse approach leveraging diffusion models for the online BBO problem.
arXiv Detail & Related papers (2024-06-30T06:58:31Z)
- Covariance-Adaptive Sequential Black-box Optimization for Diffusion Targeted Generation [60.41803046775034]
We show how to perform user-preferred targeted generation via diffusion models with only black-box target scores of users.
Experiments on both numerical test problems and target-guided 3D-molecule generation tasks show the superior performance of our method in achieving better target scores.
arXiv Detail & Related papers (2024-06-02T17:26:27Z)
- Predictive Modeling through Hyper-Bayesian Optimization [60.586813904500595]
We propose a novel way of integrating model selection and BO for the single goal of reaching the function optima faster.
The algorithm moves back and forth between BO in the model space and BO in the function space, where the goodness of the recommended model is captured.
In addition to improved sample efficiency, the framework outputs information about the black-box function.
arXiv Detail & Related papers (2023-08-01T04:46:58Z)
- DREAM: Domain-free Reverse Engineering Attributes of Black-box Model [51.37041886352823]
We propose a new problem of Domain-agnostic Reverse Engineering the Attributes of a black-box target model.
We learn a domain-agnostic model to infer the attributes of a target black-box model with unknown training data.
arXiv Detail & Related papers (2023-07-20T16:25:58Z)
- Generative Pretraining for Black-Box Optimization [29.64357898080842]
We propose BONET, a generative framework for pretraining a black-box optimizer using offline datasets.
In BONET, we train an autoregressive model on fixed-length trajectories derived from an offline dataset.
We evaluate BONET on Design-Bench, where it ranks best on average, outperforming state-of-the-art baselines.
arXiv Detail & Related papers (2022-06-22T00:54:30Z)
- How to Robustify Black-Box ML Models? A Zeroth-Order Optimization Perspective [74.47093382436823]
We address the problem of black-box defense: How to robustify a black-box model using just input queries and output feedback?
We propose a general notion of defensive operation that can be applied to black-box models, and design it through the lens of denoised smoothing (DS).
We empirically show that ZO-AE-DS can achieve improved accuracy, certified robustness, and query complexity over existing baselines.
arXiv Detail & Related papers (2022-03-27T03:23:32Z)
- Hierarchical Dynamic Filtering Network for RGB-D Salient Object Detection [91.43066633305662]
A central challenge in RGB-D salient object detection (SOD) is how to better integrate and utilize cross-modal fusion information.
In this paper, we explore these issues from a new perspective.
We implement a more flexible and efficient form of multi-scale cross-modal feature processing.
arXiv Detail & Related papers (2020-07-13T07:59:55Z)
- Stepwise Model Selection for Sequence Prediction via Deep Kernel Learning [100.83444258562263]
We propose a novel Bayesian optimization (BO) algorithm to tackle the challenge of model selection in this setting.
In order to solve the resulting multiple black-box function optimization problem jointly and efficiently, we exploit potential correlations among black-box functions.
We are the first to formulate the problem of stepwise model selection (SMS) for sequence prediction, and to design and demonstrate an efficient joint-learning algorithm for this purpose.
arXiv Detail & Related papers (2020-01-12T09:42:19Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.