Intrinsic Bayesian Optimisation on Complex Constrained Domain
- URL: http://arxiv.org/abs/2301.12581v1
- Date: Sun, 29 Jan 2023 23:28:08 GMT
- Title: Intrinsic Bayesian Optimisation on Complex Constrained Domain
- Authors: Yuan Liu, Mu Niu, Claire Miller
- Abstract summary: Motivated by the success of Bayesian optimisation algorithms in the Euclidean space, we propose a novel approach to construct Intrinsic Bayesian optimisation (In-BO) on manifolds.
Data may be collected in a spatial domain but restricted to a complex or intricately structured region corresponding to a geographic feature, such as lakes.
The efficiency of In-BO is demonstrated through simulation studies on a U-shaped domain, a Bitten torus, and a real dataset from the Aral Sea.
- Score: 4.164327213986953
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Motivated by the success of Bayesian optimisation algorithms in the Euclidean
space, we propose a novel approach to construct Intrinsic Bayesian optimisation
(In-BO) on manifolds with a primary focus on complex constrained domains or
irregular-shaped spaces arising as submanifolds of R2, R3 and beyond. Data may
be collected in a spatial domain but restricted to a complex or intricately
structured region corresponding to a geographic feature, such as lakes.
Traditional Bayesian Optimisation (Tra-BO) defined with a Radial basis function
(RBF) kernel cannot accommodate these complex constrained conditions. The In-BO
uses the Sparse Intrinsic Gaussian Processes (SIn-GP) surrogate model to take
into account the geometric structure of the manifold. SIn-GPs are constructed
using the heat kernel of the manifold which is estimated as the transition
density of the Brownian Motion on manifolds. The efficiency of In-BO is
demonstrated through simulation studies on a U-shaped domain, a Bitten torus,
and a real dataset from the Aral Sea. Its performance is compared to that of
traditional BO, which is defined in Euclidean space.
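The heat-kernel construction described in the abstract can be sketched concretely: simulate Brownian motion confined to the constrained domain and estimate its transition density. The sketch below is a rough illustration only, not the authors' implementation; the U-shaped domain, the rejection-based boundary handling, the bandwidth, and the path counts are all hypothetical choices.

```python
import numpy as np

def in_domain(x):
    """Hypothetical U-shaped domain: the unit square minus a notch."""
    inside_square = bool(np.all((x >= 0.0) & (x <= 1.0)))
    in_notch = bool(0.25 <= x[0] <= 0.75 and x[1] >= 0.4)
    return inside_square and not in_notch

def simulate_bm(x0, t, n_steps, n_paths, rng):
    """Euler simulation of Brownian motion confined to the domain:
    proposed steps that would exit the domain are rejected (a crude
    surrogate for reflection at the boundary)."""
    dt = t / n_steps
    paths = np.tile(np.asarray(x0, dtype=float), (n_paths, 1))
    for _ in range(n_steps):
        proposal = paths + rng.normal(scale=np.sqrt(dt), size=paths.shape)
        ok = np.array([in_domain(p) for p in proposal])
        paths[ok] = proposal[ok]
    return paths

def heat_kernel_estimate(x, y, t, n_paths=2000, n_steps=50, bw=0.05, rng=None):
    """Monte-Carlo estimate of the heat-kernel value k_t(x, y): the
    transition density of the confined Brownian motion, smoothed with a
    Gaussian kernel density estimate of bandwidth bw."""
    if rng is None:
        rng = np.random.default_rng(0)
    endpoints = simulate_bm(x, t, n_steps, n_paths, rng)
    d2 = np.sum((endpoints - np.asarray(y, dtype=float)) ** 2, axis=1)
    return float(np.mean(np.exp(-d2 / (2 * bw ** 2)) / (2 * np.pi * bw ** 2)))
```

Because paths cannot cross the notch, the estimated kernel between points on opposite arms of the U decays with the within-domain distance rather than the straight-line Euclidean distance; this is the geometric structure the SIn-GP surrogate is meant to capture.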
Related papers
- Enforcing Latent Euclidean Geometry in Single-Cell VAEs for Manifold Interpolation [79.27003481818413]
We introduce FlatVI, a training framework that regularises the latent manifold of discrete-likelihood variational autoencoders towards Euclidean geometry. By encouraging straight lines in the latent space to approximate geodesics on the decoded single-cell manifold, FlatVI enhances compatibility with downstream approaches.
arXiv Detail & Related papers (2025-07-15T23:08:14Z)
- Normalizing Diffusion Kernels with Optimal Transport [4.081238502499229]
We introduce a class of smoothing operators that inherit desirable properties from Laplacians. This construction enables Laplacian-like smoothing and processing of irregular data. We show that the resulting operators approximate heat diffusion but also retain spectral information from the Laplacian itself.
arXiv Detail & Related papers (2025-07-08T16:42:09Z)
- Return of the Latent Space COWBOYS: Re-thinking the use of VAEs for Bayesian Optimisation of Structured Spaces [13.38402522324075]
We propose a decoupled approach that trains a generative model and a Gaussian Process (GP) surrogate separately, then combines them via a simple yet principled Bayesian update rule. We show that our decoupled approach improves our ability to identify high-potential candidates in molecular optimisation problems under constrained evaluation budgets.
arXiv Detail & Related papers (2025-07-05T05:53:04Z)
- GMapLatent: Geometric Mapping in Latent Space [51.317738404571514]
Cross-domain generative models based on encoder-decoder AI architectures have attracted much attention in generating realistic images.
We introduce a canonical latent space representation based on geometric mapping to align the cross-domain latent spaces in a rigorous and precise manner.
Experiments on gray-scale and color images validate the efficiency, efficacy and applicability of GMapLatent.
arXiv Detail & Related papers (2025-03-30T12:02:36Z)
- Modeling All Response Surfaces in One for Conditional Search Spaces [69.90317997694218]
This paper proposes a novel approach to model the response surfaces of all subspaces in one.
We introduce an attention-based deep feature extractor, capable of projecting configurations with different structures from various subspaces into a unified feature space.
arXiv Detail & Related papers (2025-01-08T03:56:06Z)
- Bridging Geometric States via Geometric Diffusion Bridge [79.60212414973002]
We introduce the Geometric Diffusion Bridge (GDB), a novel generative modeling framework that accurately bridges initial and target geometric states.
GDB employs an equivariant diffusion bridge derived from a modified version of Doob's $h$-transform for connecting geometric states.
We show that GDB surpasses existing state-of-the-art approaches, opening up a new pathway for accurately bridging geometric states.
arXiv Detail & Related papers (2024-10-31T17:59:53Z)
- Entanglement Renormalization for Quantum Field Theories with Discrete Wavelet Transforms [0.0]
We propose an adaptation of Entanglement Renormalization for quantum field theories using discrete wavelet transforms.
We describe two concrete implementations of our wMERA algorithm for free scalar and fermionic theories in (1+1) spacetime dimensions.
arXiv Detail & Related papers (2024-04-17T20:01:51Z)
- Learning Regions of Interest for Bayesian Optimization with Adaptive Level-Set Estimation [84.0621253654014]
We propose a framework, called BALLET, which adaptively filters for a high-confidence region of interest.
We show theoretically that BALLET can efficiently shrink the search space, and can exhibit a tighter regret bound than standard BO.
arXiv Detail & Related papers (2023-07-25T09:45:47Z)
- Normalizing flows for lattice gauge theory in arbitrary space-time dimension [135.04925500053622]
Applications of normalizing flows to the sampling of field configurations in lattice gauge theory have so far been explored almost exclusively in two space-time dimensions.
We discuss masked autoregressive flows with tractable and unbiased Jacobian determinants, a key ingredient for scalable and exact flow-based sampling algorithms.
For concreteness, results from a proof-of-principle application to SU(3) gauge theory in four space-time dimensions are reported.
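The tractable-Jacobian property mentioned above comes from autoregressive structure: when each output coordinate depends only on the preceding inputs, the Jacobian is triangular and its log-determinant is a simple sum. A minimal sketch in plain NumPy, with hypothetical shift and log-scale conditioners rather than the paper's gauge-equivariant architecture:

```python
import numpy as np

def affine_autoregressive_forward(x, shift_fn, log_scale_fn):
    """One affine autoregressive flow layer: z_i = x_i * exp(s_i) + t_i,
    where s_i = log_scale_fn(x[:i]) and t_i = shift_fn(x[:i]) depend only
    on preceding inputs. The Jacobian dz/dx is lower triangular, so its
    log-determinant is just sum_i s_i -- tractable and exact."""
    z = np.empty_like(x)
    log_det = 0.0
    for i in range(len(x)):
        s = log_scale_fn(x[:i])
        t = shift_fn(x[:i])
        z[i] = x[i] * np.exp(s) + t
        log_det += s
    return z, log_det
```

The exact log-determinant is what makes flow-based sampling unbiased: it enters the importance weight or Metropolis correction without any numerical linear algebra.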
arXiv Detail & Related papers (2023-05-03T19:54:04Z)
- Machine Learning and Polymer Self-Consistent Field Theory in Two Spatial Dimensions [0.491574468325115]
A computational framework that leverages data from self-consistent field theory simulations with deep learning is presented.
A generative adversarial network (GAN) is introduced to efficiently and accurately predict saddle-point local average monomer density fields.
This GAN approach yields important savings of both memory and computational cost.
arXiv Detail & Related papers (2022-12-16T04:30:16Z)
- Optimal Scaling for Locally Balanced Proposals in Discrete Spaces [65.14092237705476]
We show that efficiency of Metropolis-Hastings (M-H) algorithms in discrete spaces can be characterized by an acceptance rate that is independent of the target distribution.
Knowledge of the optimal acceptance rate allows one to automatically tune the neighborhood size of a proposal distribution in a discrete space, directly analogous to step-size control in continuous spaces.
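The tuning recipe described above can be sketched as follows: run a Metropolis-Hastings sampler on the integers and pick the neighbourhood radius whose empirical acceptance rate is closest to a target. The plain random-walk sampler, the radius grid, and the target value below are simplifications for illustration (0.574 is the optimal rate reported for locally balanced proposals, but the toy sampler here is not locally balanced):

```python
import numpy as np

def mh_discrete(logp, x0, n_iter, radius, rng):
    """Random-walk Metropolis-Hastings on the integers with a symmetric
    uniform proposal over steps of size 1..radius in either direction."""
    x = x0
    accepts = 0
    samples = []
    for _ in range(n_iter):
        step = int(rng.integers(1, radius + 1)) * int(rng.choice([-1, 1]))
        y = x + step
        if np.log(rng.uniform()) < logp(y) - logp(x):
            x = y
            accepts += 1
        samples.append(x)
    return np.array(samples), accepts / n_iter

def tune_radius(logp, x0, target=0.574, radii=(1, 2, 4, 8, 16), n_iter=2000, seed=0):
    """Pick the neighbourhood radius whose empirical acceptance rate is
    closest to the target -- the discrete analogue of step-size control
    in continuous spaces. The radius grid is an illustrative choice."""
    def rate(r):
        _, acc = mh_discrete(logp, x0, n_iter, r, np.random.default_rng(seed))
        return acc
    return min(radii, key=lambda r: abs(rate(r) - target))
```

Larger radii propose bolder moves and accept less often, so matching the acceptance rate to a fixed target trades off step size against rejection, exactly as step-size control does for continuous samplers.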
arXiv Detail & Related papers (2022-09-16T22:09:53Z)
- A Unifying and Canonical Description of Measure-Preserving Diffusions [60.59592461429012]
A complete recipe of measure-preserving diffusions in Euclidean space was recently derived unifying several MCMC algorithms into a single framework.
We develop a geometric theory that improves and generalises this construction to any manifold.
arXiv Detail & Related papers (2021-05-06T17:36:55Z)
- Distributed Variational Bayesian Algorithms Over Sensor Networks [6.572330981878818]
We propose two novel distributed VB algorithms for general Bayesian inference problem.
The proposed algorithms have excellent performance, which are almost as good as the corresponding centralized VB algorithm relying on all data available in a fusion center.
arXiv Detail & Related papers (2020-11-27T08:12:18Z)
- High-Dimensional Bayesian Optimization via Nested Riemannian Manifolds [0.0]
We propose to exploit the geometry of non-Euclidean search spaces, which often arise in a variety of domains, to learn structure-preserving mappings.
Our approach features geometry-aware Gaussian processes that jointly learn a nested-manifold embedding and a representation of the objective function in the latent space.
arXiv Detail & Related papers (2020-10-21T11:24:11Z)
- Local optimization on pure Gaussian state manifolds [63.76263875368856]
We exploit insights into the geometry of bosonic and fermionic Gaussian states to develop an efficient local optimization algorithm.
The method is based on notions of gradient descent attuned to the local geometry.
We use the presented methods to collect numerical and analytical evidence for the conjecture that Gaussian purifications are sufficient to compute the entanglement of purification of arbitrary mixed Gaussian states.
arXiv Detail & Related papers (2020-09-24T18:00:36Z)
- Intrinsic Gaussian Processes on Manifolds and Their Accelerations by Symmetry [9.773237080061815]
Existing methods primarily focus on low dimensional constrained domains for heat kernel estimation.
Our research proposes an intrinsic approach for constructing GPs on general manifolds.
Our methodology estimates the heat kernel by simulating Brownian motion sample paths using the exponential map.
arXiv Detail & Related papers (2020-06-25T09:17:40Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences arising from its use.