JUMBO: Scalable Multi-task Bayesian Optimization using Offline Data
- URL: http://arxiv.org/abs/2106.00942v1
- Date: Wed, 2 Jun 2021 05:03:38 GMT
- Title: JUMBO: Scalable Multi-task Bayesian Optimization using Offline Data
- Authors: Kourosh Hakhamaneshi, Pieter Abbeel, Vladimir Stojanovic, Aditya
Grover
- Abstract summary: We propose JUMBO, an MBO algorithm that sidesteps limitations by querying additional data.
We show that it achieves no-regret under conditions analogous to GP-UCB.
Empirically, we demonstrate significant performance improvements over existing approaches on two real-world optimization problems.
- Score: 86.8949732640035
- License: http://creativecommons.org/licenses/by-nc-sa/4.0/
- Abstract: The goal of Multi-task Bayesian Optimization (MBO) is to minimize the number
of queries required to accurately optimize a target black-box function, given
access to offline evaluations of other auxiliary functions. When offline
datasets are large, the scalability of prior approaches comes at the expense of
expressivity and inference quality. We propose JUMBO, an MBO algorithm that
sidesteps these limitations by querying additional data based on a combination
of acquisition signals derived from training two Gaussian Processes (GP): a
cold-GP operating directly in the input domain and a warm-GP that operates in
the feature space of a deep neural network pretrained using the offline data.
Such a decomposition can dynamically control the reliability of information
derived from the online and offline data and the use of pretrained neural
networks permits scalability to large offline datasets. Theoretically, we
derive regret bounds for JUMBO and show that it achieves no-regret under
conditions analogous to GP-UCB (Srinivas et al., 2010). Empirically, we
demonstrate significant performance improvements over existing approaches on
two real-world optimization problems: hyper-parameter optimization and
automated circuit design.
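The abstract pins down the architecture (a cold-GP on raw inputs, a warm-GP on pretrained neural-network features, and an acquisition rule combining both) but not its exact equations; recall that GP-UCB queries x_t = argmax_x mu(x) + sqrt(beta_t) * sigma(x), and that no-regret means the average regret R_T / T vanishes as T grows. Below is a minimal, illustrative Python sketch of the two-GP idea only; the encoder, the UCB form, and the max-combination of the two signals are assumptions for illustration, not JUMBO's actual acquisition function.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

rng = np.random.default_rng(0)

# Stand-in for a deep network pretrained on the offline auxiliary data:
# a fixed random projection with a tanh nonlinearity.
W = rng.normal(size=(2, 16))
def encoder(X):
    return np.tanh(X @ W)

def ucb(gp, X, beta=2.0):
    # Upper confidence bound: posterior mean plus scaled standard deviation.
    mu, sigma = gp.predict(X, return_std=True)
    return mu + beta * sigma

# Toy target function and a handful of online observations.
f = lambda X: -np.sum((X - 0.3) ** 2, axis=1)
X_obs = rng.uniform(-1.0, 1.0, size=(5, 2))
y_obs = f(X_obs)

# cold-GP: fitted directly in the input domain.
cold_gp = GaussianProcessRegressor(kernel=RBF()).fit(X_obs, y_obs)
# warm-GP: fitted in the feature space of the pretrained encoder.
warm_gp = GaussianProcessRegressor(kernel=RBF()).fit(encoder(X_obs), y_obs)

# Score candidates with both acquisition signals and combine them
# (the max is an illustrative combination, not JUMBO's actual rule).
X_cand = rng.uniform(-1.0, 1.0, size=(256, 2))
score = np.maximum(ucb(cold_gp, X_cand), ucb(warm_gp, encoder(X_cand)))
x_next = X_cand[np.argmax(score)]
print("next query point:", x_next)
```

Here the random-projection encoder stands in for the network pretrained on offline data; the warm-GP is informative where the offline data is, while the cold-GP remains trustworthy everywhere in the input domain.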
Related papers
- Recursive Gaussian Process State Space Model [4.572915072234487]
We propose a new online GPSSM method with adaptive capabilities for both operating domains and GP hyperparameters.
An online selection algorithm for inducing points is developed based on an informative criterion to achieve lightweight learning.
Comprehensive evaluations on both synthetic and real-world datasets demonstrate the superior accuracy, computational efficiency, and adaptability of our method.
arXiv Detail & Related papers (2024-11-22T02:22:59Z)
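The entry above selects inducing points by an informative criterion it does not spell out; a common lightweight choice is greedy selection by approximate posterior variance. A minimal sketch under that assumption (the kernel, criterion, and function names are illustrative, not necessarily the paper's):

```python
import numpy as np

def rbf(A, B, ls=1.0):
    # Squared-exponential kernel matrix between rows of A and rows of B.
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-0.5 * d2 / ls ** 2)

def select_inducing(X, m, noise=1e-3):
    """Greedily pick m inducing points that most reduce GP predictive variance."""
    idx = [0]  # seed with any point; the RBF prior variance is the same everywhere
    while len(idx) < m:
        Z = X[idx]
        Kzz = rbf(Z, Z) + noise * np.eye(len(idx))
        Kxz = rbf(X, Z)
        # Nystrom approximation of each point's posterior variance given Z.
        var = 1.0 - np.einsum('ij,jk,ik->i', Kxz, np.linalg.inv(Kzz), Kxz)
        var[idx] = -np.inf  # never re-select an already-chosen point
        idx.append(int(np.argmax(var)))
    return np.array(idx)

X_stream = np.random.default_rng(0).normal(size=(200, 2))
print(select_inducing(X_stream, m=10))
```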
- Online Parallel Multi-Task Relationship Learning via Alternating Direction Method of Multipliers [37.859185005986056]
Online multi-task learning (OMTL) enhances streaming data processing by leveraging the inherent relations among multiple tasks.
This study proposes a novel OMTL framework based on the alternating direction method of multipliers (ADMM), an optimization technique well suited to distributed computing environments.
arXiv Detail & Related papers (2024-11-09T10:20:13Z)
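The OMTL-specific updates are not given above, but the ADMM iteration pattern the framework builds on is standard: alternate per-task local solves, a consensus averaging step, and a dual update. A generic consensus-ADMM sketch for learning a model shared across tasks (the least-squares losses and all names are illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(0)
T, n, d = 4, 30, 5                      # tasks, samples per task, features
A = [rng.normal(size=(n, d)) for _ in range(T)]
b = [A_t @ rng.normal(size=d) + 0.1 * rng.normal(size=n) for A_t in A]

rho = 1.0
x = [np.zeros(d) for _ in range(T)]     # per-task local models
z = np.zeros(d)                         # consensus (shared) model
u = [np.zeros(d) for _ in range(T)]     # scaled dual variables

for _ in range(50):
    # x-update: per-task ridge-like solve (independent, hence parallelizable).
    x = [np.linalg.solve(A[t].T @ A[t] + rho * np.eye(d),
                         A[t].T @ b[t] + rho * (z - u[t])) for t in range(T)]
    # z-update: average local models plus duals (the consensus step).
    z = np.mean([x[t] + u[t] for t in range(T)], axis=0)
    # u-update: dual ascent on the consensus constraint x_t = z.
    u = [u[t] + x[t] - z for t in range(T)]

print("consensus model:", z)
```

The per-task x-updates are independent, which is what makes ADMM attractive for distributed and parallel settings.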
- Optimization Proxies using Limited Labeled Data and Training Time -- A Semi-Supervised Bayesian Neural Network Approach [2.943640991628177]
Constrained optimization problems arise in various engineering system operations such as inventory management and electric power grids.
This work introduces a learning scheme using Bayesian Neural Networks (BNNs) to solve constrained optimization problems under limited labeled data and restricted training times.
We show that the proposed learning method outperforms conventional BNN and deep neural network (DNN) architectures.
arXiv Detail & Related papers (2024-10-04T02:10:20Z)
- Learning Optimal Linear Precoding for Cell-Free Massive MIMO with GNN [15.271970287767164]
We develop a graph neural network (GNN) to compute precoding weights within the time budget of 1 to 2 milliseconds required by practical systems.
We show that it achieves near-optimal spectral efficiency in a range of scenarios with different numbers of access points (APs) and user equipments (UEs).
arXiv Detail & Related papers (2024-06-06T19:29:33Z)
- Functional Graphical Models: Structure Enables Offline Data-Driven Optimization [111.28605744661638]
We show how structure can enable sample-efficient data-driven optimization.
We also present a data-driven optimization algorithm that infers the FGM structure itself.
arXiv Detail & Related papers (2024-01-08T22:33:14Z)
- Accelerating Scalable Graph Neural Network Inference with Node-Adaptive Propagation [80.227864832092]
Graph neural networks (GNNs) have exhibited exceptional efficacy in a diverse array of applications.
The sheer size of large-scale graphs presents a significant challenge to real-time inference with GNNs.
We propose an online propagation framework and two novel node-adaptive propagation methods.
arXiv Detail & Related papers (2023-10-17T05:03:00Z)
- A Multi-Head Ensemble Multi-Task Learning Approach for Dynamical Computation Offloading [62.34538208323411]
We propose a multi-head ensemble multi-task learning (MEMTL) approach with a shared backbone and multiple prediction heads (PHs).
MEMTL outperforms benchmark methods in both the inference accuracy and mean square error without requiring additional training data.
arXiv Detail & Related papers (2023-09-02T11:01:16Z)
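The shared-backbone-plus-heads structure mentioned above is a common pattern; a minimal forward-pass sketch follows (layer sizes, the ReLU backbone, and the simple averaging of heads are assumptions, not MEMTL's actual design):

```python
import numpy as np

rng = np.random.default_rng(0)
d_in, d_hid, d_out, n_heads = 8, 16, 3, 4

# Shared backbone: a single hidden layer with ReLU.
W1 = 0.1 * rng.normal(size=(d_in, d_hid))
# Multiple prediction heads on top of the shared representation.
heads = [0.1 * rng.normal(size=(d_hid, d_out)) for _ in range(n_heads)]

def forward(x):
    h = np.maximum(x @ W1, 0.0)              # shared features
    preds = np.stack([h @ Wh for Wh in heads])
    return preds.mean(axis=0)                # ensemble by averaging (assumed rule)

x = rng.normal(size=(2, d_in))               # a batch of two offloading states
print(forward(x).shape)                      # (2, 3): one prediction per input
```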
- Efficient Graph Neural Network Inference at Large Scale [54.89457550773165]
Graph neural networks (GNNs) have demonstrated excellent performance in a wide range of applications.
Existing scalable GNNs leverage linear propagation to preprocess the features and accelerate the training and inference procedure.
We propose a novel adaptive propagation order approach that generates the personalized propagation order for each node based on its topological information.
arXiv Detail & Related papers (2022-11-01T14:38:18Z)
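A toy picture of the personalized propagation order described above: give each node its own number of feature-propagation steps. Here the per-node depth is set from node degree, which is only an assumed stand-in for the paper's actual topological criterion:

```python
import numpy as np

# Toy graph: adjacency with self-loops, row-normalized into a propagation matrix.
A = np.array([[0, 1, 1, 0],
              [1, 0, 1, 0],
              [1, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
A_hat = A + np.eye(4)
P = A_hat / A_hat.sum(axis=1, keepdims=True)

X = np.random.default_rng(0).normal(size=(4, 3))  # node features
k = np.clip(A.sum(axis=1).astype(int), 1, 3)      # assumed rule: deeper for high-degree nodes

# Propagate features, freezing each node's output once it reaches its own depth k[i].
H, out, done = X.copy(), np.zeros_like(X), np.zeros(4, dtype=bool)
for step in range(1, int(k.max()) + 1):
    H = P @ H
    newly = (k == step) & ~done
    out[newly] = H[newly]
    done |= newly
print(out)
```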
- MOPS-Net: A Matrix Optimization-driven Network for Task-Oriented 3D Point Cloud Downsampling [86.42733428762513]
MOPS-Net is a novel interpretable deep learning-based method that approaches task-oriented 3D point cloud downsampling through matrix optimization.
We show that MOPS-Net can achieve favorable performance against state-of-the-art deep learning-based methods over various tasks.
arXiv Detail & Related papers (2020-05-01T14:01:53Z)
This list is automatically generated from the titles and abstracts of the papers in this site.