Hierarchical shrinkage Gaussian processes: applications to computer code
emulation and dynamical system recovery
- URL: http://arxiv.org/abs/2302.00755v1
- Date: Wed, 1 Feb 2023 21:00:45 GMT
- Authors: Tao Tang, Simon Mak, David Dunson
- Abstract summary: We propose a new hierarchical shrinkage GP (HierGP), which incorporates such structure via cumulative shrinkage priors within a GP framework.
We show that the HierGP implicitly embeds the well-known principles of effect sparsity, heredity and hierarchy for analysis of experiments.
We propose efficient posterior sampling algorithms for model training and prediction, and prove desirable consistency properties for the HierGP.
- Score: 5.694170341269015
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: In many areas of science and engineering, computer simulations are widely
used as proxies for physical experiments, which can be infeasible or unethical.
Such simulations can often be computationally expensive, and an emulator can be
trained to efficiently predict the desired response surface. A widely-used
emulator is the Gaussian process (GP), which provides a flexible framework for
efficient prediction and uncertainty quantification. Standard GPs, however, do
not capture structured sparsity on the underlying response surface, which is
present in many applications, particularly in the physical sciences. We thus
propose a new hierarchical shrinkage GP (HierGP), which incorporates such
structure via cumulative shrinkage priors within a GP framework. We show that
the HierGP implicitly embeds the well-known principles of effect sparsity,
heredity and hierarchy for analysis of experiments, which allows our model to
identify structured sparse features from the response surface with limited
data. We propose efficient posterior sampling algorithms for model training and
prediction, and prove desirable consistency properties for the HierGP. Finally,
we demonstrate the improved performance of HierGP over existing models, in a
suite of numerical experiments and an application to dynamical system recovery.
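As background for the abstract above, a standard GP emulator (the baseline that the HierGP extends with cumulative shrinkage priors) can be sketched in a few lines. The RBF kernel, hyperparameter values, and toy "simulator" below are illustrative assumptions, not the paper's HierGP construction:

```python
import numpy as np

def rbf_kernel(X1, X2, lengthscale=0.3, variance=1.0):
    # squared-exponential (RBF) kernel, a common default for emulation
    d2 = np.sum(X1**2, 1)[:, None] + np.sum(X2**2, 1)[None, :] - 2 * X1 @ X2.T
    return variance * np.exp(-0.5 * d2 / lengthscale**2)

def gp_posterior(X, y, Xs, noise=1e-6):
    # standard GP regression equations via a Cholesky factorization
    K = rbf_kernel(X, X) + noise * np.eye(len(X))
    Ks = rbf_kernel(X, Xs)
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))
    mean = Ks.T @ alpha
    v = np.linalg.solve(L, Ks)
    var = np.diag(rbf_kernel(Xs, Xs) - v.T @ v)
    return mean, var

# treat a cheap analytic function as a stand-in for an expensive simulator
X = np.linspace(0.0, 1.0, 8)[:, None]
y = np.sin(3.0 * X).ravel()
mean, var = gp_posterior(X, y, np.array([[0.5]]))
```

The posterior mean interpolates the simulator runs while the posterior variance provides the uncertainty quantification mentioned in the abstract; structured sparsity on the response surface is what the shrinkage priors add on top of this baseline.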
Related papers
- A Kronecker product accelerated efficient sparse Gaussian Process (E-SGP) for flow emulation [2.563626165548781]
This paper introduces an efficient sparse Gaussian process (E-SGP) for the surrogate modelling of fluid mechanics.
It is a further development of the approximated sparse GP algorithm, combining the concepts of the efficient GP (E-GP) and the variational energy free sparse Gaussian process (VEF-SGP).
arXiv Detail & Related papers (2023-12-13T11:29:40Z)
- Weighted Ensembles for Active Learning with Adaptivity [60.84896785303314]
This paper presents an ensemble of GP models with weights adapted to the labeled data collected incrementally.
Building on this novel EGP model, a suite of acquisition functions emerges based on the uncertainty and disagreement rules.
An adaptively weighted ensemble of EGP-based acquisition functions is also introduced to further robustify performance.
arXiv Detail & Related papers (2022-06-10T11:48:49Z)
- Surrogate modeling for Bayesian optimization beyond a single Gaussian process [62.294228304646516]
We propose a novel Bayesian surrogate model to balance exploration with exploitation of the search space.
To endow function sampling with scalability, random feature-based kernel approximation is leveraged per GP model.
To further establish convergence of the proposed EGP-TS to the global optimum, analysis is conducted based on the notion of Bayesian regret.
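The random feature-based kernel approximation mentioned above is commonly implemented with random Fourier features, which replace the kernel with a low-rank inner product of feature maps. The sketch below (RBF kernel, fixed seeds, NumPy) is an illustrative assumption rather than this paper's exact construction:

```python
import numpy as np

def rff_features(X, n_features=500, lengthscale=1.0, seed=0):
    # random Fourier features: z(x) . z(x') approximates the RBF kernel k(x, x')
    rng = np.random.default_rng(seed)
    d = X.shape[1]
    W = rng.normal(0.0, 1.0 / lengthscale, size=(d, n_features))
    b = rng.uniform(0.0, 2.0 * np.pi, size=n_features)
    return np.sqrt(2.0 / n_features) * np.cos(X @ W + b)

rng = np.random.default_rng(1)
X = rng.normal(size=(5, 2))
Z = rff_features(X)
K_approx = Z @ Z.T  # low-rank kernel approximation
# exact RBF kernel for comparison
d2 = np.sum(X**2, 1)[:, None] + np.sum(X**2, 1)[None, :] - 2 * X @ X.T
K_exact = np.exp(-0.5 * d2)
```

Because the features are finite-dimensional, posterior sampling and function draws scale linearly in the number of data points, which is what makes Thompson-sampling-style function draws tractable per GP model.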
arXiv Detail & Related papers (2022-05-27T16:43:10Z)
- Non-Gaussian Gaussian Processes for Few-Shot Regression [71.33730039795921]
We propose an invertible ODE-based mapping that operates on each component of the random variable vectors and shares the parameters across all of them.
NGGPs outperform the competing state-of-the-art approaches on a diversified set of benchmarks and applications.
arXiv Detail & Related papers (2021-10-26T10:45:25Z)
- Incremental Ensemble Gaussian Processes [53.3291389385672]
We propose an incremental ensemble (IE-) GP framework, where an EGP meta-learner employs an ensemble of GP learners, each having a unique kernel belonging to a prescribed kernel dictionary.
With each GP expert leveraging the random feature-based approximation to perform online prediction and model updates with scalability, the EGP meta-learner capitalizes on data-adaptive weights to synthesize the per-expert predictions.
The novel IE-GP is generalized to accommodate time-varying functions by modeling structured dynamics at the EGP meta-learner and within each GP learner.
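One simple way data-adaptive weights can synthesize per-expert predictions is moment matching of the resulting Gaussian mixture. The softmax-of-log-evidence weighting below is an illustrative assumption, not this paper's exact update rule:

```python
import numpy as np

def ensemble_predict(means, variances, log_weights):
    # normalize per-expert (log) weights with a numerically stable softmax
    w = np.exp(log_weights - np.max(log_weights))
    w /= w.sum()
    # moment-matched mean and variance of the weighted Gaussian mixture
    mean = np.sum(w * means)
    var = np.sum(w * (variances + means**2)) - mean**2
    return mean, var

# two hypothetical GP experts predicting at the same test point
mean, var = ensemble_predict(
    means=np.array([1.0, 1.2]),
    variances=np.array([0.1, 0.2]),
    log_weights=np.array([-1.0, -2.0]),
)
```

The mixture variance exceeds the weighted average of the expert variances whenever the experts disagree, so the ensemble naturally reports extra uncertainty in regions where its members conflict.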
arXiv Detail & Related papers (2021-10-13T15:11:25Z)
- Harnessing Heterogeneity: Learning from Decomposed Feedback in Bayesian Modeling [68.69431580852535]
We introduce a novel GP regression to incorporate the subgroup feedback.
Our modified regression has provably lower variance -- and thus a more accurate posterior -- compared to previous approaches.
We execute our algorithm on two disparate social problems.
arXiv Detail & Related papers (2021-07-07T03:57:22Z)
- Active Learning for Deep Gaussian Process Surrogates [0.3222802562733786]
Deep Gaussian processes (DGPs) are increasingly popular as predictive models in machine learning (ML).
Here we explore DGPs as surrogates for computer simulation experiments whose response surfaces exhibit similar characteristics.
We build up the design sequentially, both limiting expensive evaluations of the simulator code and mitigating the cubic costs of DGP inference.
arXiv Detail & Related papers (2020-12-15T00:09:37Z)
- Physics-informed Gaussian Process for Online Optimization of Particle Accelerators [1.1808503330586468]
We apply a physics-informed Gaussian process to tune a complex system by conducting efficient global search.
The GP is then employed to make inferences from sequential online observations in order to optimize the system.
The ability to inform the machine-learning model with physics may have wide applications in science.
arXiv Detail & Related papers (2020-09-08T08:02:20Z)
- Likelihood-Free Inference with Deep Gaussian Processes [70.74203794847344]
Surrogate models have been successfully used in likelihood-free inference to decrease the number of simulator evaluations.
We propose a Deep Gaussian Process (DGP) surrogate model that can handle more irregularly behaved target distributions.
Our experiments show how DGPs can outperform GPs on objective functions with multimodal distributions and maintain a comparable performance in unimodal cases.
arXiv Detail & Related papers (2020-06-18T14:24:05Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.