HiBBO: HiPPO-based Space Consistency for High-dimensional Bayesian Optimisation
- URL: http://arxiv.org/abs/2510.08965v1
- Date: Fri, 10 Oct 2025 03:22:10 GMT
- Title: HiBBO: HiPPO-based Space Consistency for High-dimensional Bayesian Optimisation
- Authors: Junyu Xuan, Wenlong Chen, Yingzhen Li
- Abstract summary: HiBBO is a novel BO framework that introduces space consistency into the latent space construction in VAEs using HiPPO. Experiments on high-dimensional benchmark tasks demonstrate that HiBBO outperforms existing VAE-BO methods in convergence speed and solution quality.
- Score: 23.518990631999884
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Bayesian Optimisation (BO) is a powerful tool for optimising expensive black-box functions, but its effectiveness diminishes in high-dimensional spaces due to sparse data and poor surrogate model scalability. While Variational Autoencoder (VAE) based approaches address this by learning low-dimensional latent representations, the reconstruction-based objective function often brings a functional distribution mismatch between the latent space and the original space, leading to suboptimal optimisation performance. In this paper we first analyse why a reconstruction-only loss may lead to distribution mismatch, and then propose HiBBO, a novel BO framework that introduces space consistency into the latent space construction in VAE using HiPPO (a method for long-term sequence modelling) to reduce the functional distribution mismatch between the latent space and the original space. Experiments on high-dimensional benchmark tasks demonstrate that HiBBO outperforms existing VAE-BO methods in convergence speed and solution quality. Our work bridges the gap between high-dimensional sequence representation learning and efficient Bayesian Optimisation, enabling broader applications in neural architecture search, materials science, and beyond.
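As a rough illustration of the core idea only (not the authors' implementation, which learns the latent space jointly with a VAE), the sketch below compresses each high-dimensional input with a fixed Legendre-polynomial fit in the spirit of HiPPO-LegS, then measures a toy "space consistency" score as the correlation between pairwise distances before and after compression. The function names and the distance-correlation score are illustrative assumptions, not from the paper.

```python
import numpy as np
from numpy.polynomial import legendre

def hippo_style_compress(x, order=8):
    """Fit the vector x, viewed as a 1-D sequence over a rescaled index
    grid, with the first `order` Legendre polynomials and return the
    coefficients (a fixed-basis, HiPPO-LegS-flavoured compression)."""
    t = np.linspace(-1.0, 1.0, len(x))
    return legendre.legfit(t, x, deg=order - 1)

def distance_consistency(X, Z):
    """Toy 'space consistency' score: Pearson correlation between
    pairwise distances in the original space X and in the compressed
    space Z (1.0 means the geometry is perfectly preserved)."""
    def pdists(M):
        diff = M[:, None, :] - M[None, :, :]
        return np.sqrt((diff ** 2).sum(-1))[np.triu_indices(len(M), k=1)]
    return float(np.corrcoef(pdists(X), pdists(Z))[0, 1])

rng = np.random.default_rng(0)
X = rng.normal(size=(32, 100))                      # 32 candidate points in a 100-d space
Z = np.stack([hippo_style_compress(x) for x in X])  # 32 compressed codes of length 8
score = distance_consistency(X, Z)                  # lies in [-1, 1]
```

In HiBBO itself the consistency objective is part of the VAE training loss rather than a post-hoc diagnostic, but the same intuition applies: the latent codes should preserve the functional structure of the original space, not just reconstruct the inputs.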
Related papers
- Nonlinear Dimensionality Reduction Techniques for Bayesian Optimization [0.9303501974597549]
We investigate nonlinear dimensionality reduction techniques that reduce the problem to a sequence of low-dimensional Latent-Space BO (LSBO) problems. We propose some changes to their implementation, originally designed for tasks such as molecule generation, and reformulate the algorithm for broader optimisation purposes. We then couple LSBO with Sequential Domain Reduction (SDR) directly in the latent space (SDR-LSBO), yielding an algorithm that narrows the latent search domains as evidence accumulates.
arXiv Detail & Related papers (2025-10-17T08:45:38Z) - HiLAB: A Hybrid Inverse-Design Framework [0.0]
HiLAB is a new paradigm for inverse design of nanophotonic structures. It addresses multi-functional device design by generating diverse freeform configurations at reduced simulation costs.
arXiv Detail & Related papers (2025-05-23T05:34:56Z) - Adaptive Linear Embedding for Nonstationary High-Dimensional Optimization [0.0]
Self-Adaptive embedding REMBO (SA-REMBO) is a novel framework that generalizes Random EMbedding Bayesian Optimization (REMBO) to support multiple random Gaussian embeddings. An index variable governs the embedding choice and is jointly modeled with the latent variables via a product kernel in the surrogate. We empirically demonstrate the advantage of our method across synthetic and real-world high-dimensional benchmarks where traditional REMBO and other low-rank BO methods fail.
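The random-embedding trick that REMBO is built on (and that SA-REMBO generalises to a pool of embeddings) can be sketched in a few lines: optimise over a low-dimensional z and lift it into the original D-dimensional box through a fixed Gaussian matrix. All names below are illustrative assumptions, not from either paper's code.

```python
import numpy as np

def random_embedding(D, d, rng):
    """Draw a fixed D x d Gaussian embedding matrix A (REMBO-style)."""
    return rng.normal(size=(D, d))

def lift(z, A, low=-1.0, high=1.0):
    """Map a low-dimensional candidate z to x = A z in R^D,
    clipped to the original box constraints as REMBO does."""
    return np.clip(A @ z, low, high)

rng = np.random.default_rng(1)
A = random_embedding(D=100, d=4, rng=rng)  # the search happens in R^4
z = rng.normal(size=4)                     # candidate proposed by BO in the latent box
x = lift(z, A)                             # point at which the black box is evaluated
```

Per the abstract, SA-REMBO would maintain several such matrices and let an index variable, modelled through a product kernel in the surrogate, govern which embedding each observation came from.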
arXiv Detail & Related papers (2025-05-16T14:18:19Z) - Latent Bayesian Optimization via Autoregressive Normalizing Flows [17.063294409131238]
We propose Normalizing Flow-based Bayesian Optimization (NF-BO) to solve the value discrepancy problem. Our method demonstrates superior performance in molecule generation tasks, significantly outperforming both traditional and recent LBO approaches.
arXiv Detail & Related papers (2025-04-21T06:36:09Z) - Modeling All Response Surfaces in One for Conditional Search Spaces [69.90317997694218]
This paper proposes a novel approach to model the response surfaces of all subspaces in one. We introduce an attention-based deep feature extractor, capable of projecting configurations with different structures from various subspaces into a unified feature space.
arXiv Detail & Related papers (2025-01-08T03:56:06Z) - Efficient High-Resolution Visual Representation Learning with State Space Model for Human Pose Estimation [60.80423207808076]
Capturing long-range dependencies while preserving high-resolution visual representations is crucial for dense prediction tasks such as human pose estimation. We propose the Dynamic Visual State Space (DVSS) block, which augments visual state space models with multi-scale convolutional operations. On top of it, we build HRVMamba, a novel model for efficient high-resolution representation learning.
arXiv Detail & Related papers (2024-10-04T06:19:29Z) - Diffusion-BBO: Diffusion-Based Inverse Modeling for Online Black-Box Optimization [20.45482366024264]
Online black-box optimization (BBO) aims to optimize an objective function by iteratively querying a black-box oracle in a sample-efficient way. We propose Diffusion-BBO, a sample-efficient online BBO framework leveraging the conditional diffusion model as the inverse surrogate model.
arXiv Detail & Related papers (2024-06-30T06:58:31Z) - Latent Energy-Based Odyssey: Black-Box Optimization via Expanded Exploration in the Energy-Based Latent Space [65.44449711359724]
The high-dimensional and highly multimodal input design spaces of black-box functions pose inherent challenges for existing methods.
We consider finding a latent space that serves as a compressed yet accurate representation of the design-value joint space.
We propose Noise-intensified Telescoping density-Ratio Estimation scheme for variational learning of an accurate latent space model.
arXiv Detail & Related papers (2024-05-27T00:11:53Z) - Diffusion Model for Data-Driven Black-Box Optimization [54.25693582870226]
We focus on diffusion models, a powerful generative AI technology, and investigate their potential for black-box optimization.
We study two practical types of labels: 1) noisy measurements of a real-valued reward function and 2) human preference based on pairwise comparisons.
Our proposed method reformulates the design optimization problem into a conditional sampling problem, which allows us to leverage the power of diffusion models.
arXiv Detail & Related papers (2024-03-20T00:41:12Z) - Generative Modeling with Phase Stochastic Bridges [49.4474628881673]
Diffusion models (DMs) represent state-of-the-art generative models for continuous inputs.
We introduce a novel generative modeling framework grounded in phase space dynamics.
Our framework demonstrates the capability to generate realistic data points at an early stage of dynamics propagation.
arXiv Detail & Related papers (2023-10-11T18:38:28Z) - Predictive Modeling through Hyper-Bayesian Optimization [60.586813904500595]
We propose a novel way of integrating model selection and BO for the single goal of reaching the function optima faster.
The algorithm moves back and forth between BO in the model space and BO in the function space, where the goodness of the recommended model is captured.
In addition to improved sample efficiency, the framework outputs information about the black-box function.
arXiv Detail & Related papers (2023-08-01T04:46:58Z) - Score-Guided Intermediate Layer Optimization: Fast Langevin Mixing for Inverse Problems [97.64313409741614]
We prove fast mixing and characterize the stationary distribution of the Langevin Algorithm for inverting random weighted DNN generators.
We propose to do posterior sampling in the latent space of a pre-trained generative model.
arXiv Detail & Related papers (2022-06-18T03:47:37Z) - High-Dimensional Bayesian Optimization with Sparse Axis-Aligned Subspaces [14.03847432040056]
We argue that a surrogate model defined on sparse axis-aligned subspaces offers an attractive compromise between flexibility and parsimony.
We demonstrate that our approach, which relies on Hamiltonian Monte Carlo for inference, can rapidly identify sparse subspaces relevant to modeling the unknown objective function.
arXiv Detail & Related papers (2021-02-27T23:06:24Z)