Beyond Intuition, a Framework for Applying GPs to Real-World Data
- URL: http://arxiv.org/abs/2307.03093v2
- Date: Mon, 17 Jul 2023 08:40:34 GMT
- Title: Beyond Intuition, a Framework for Applying GPs to Real-World Data
- Authors: Kenza Tazi, Jihao Andreas Lin, Ross Viljoen, Alex Gardner, ST John,
Hong Ge, Richard E. Turner
- Abstract summary: We propose a framework to identify the suitability of GPs to a given problem and how to set up a robust and well-specified GP model.
We apply the framework to a case study of glacier elevation change, yielding more accurate results at test time.
- Score: 21.504659500727985
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Gaussian Processes (GPs) offer an attractive method for regression over
small, structured and correlated datasets. However, their deployment is
hindered by computational costs and limited guidelines on how to apply GPs
beyond simple low-dimensional datasets. We propose a framework to identify the
suitability of GPs to a given problem and how to set up a robust and
well-specified GP model. The guidelines formalise the decisions of experienced
GP practitioners, with an emphasis on kernel design and options for
computational scalability. The framework is then applied to a case study of
glacier elevation change, yielding more accurate results at test time.
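As a rough illustration of the kernel-design and model-specification decisions such a framework covers, the following is a minimal GP regression sketch using scikit-learn. The dataset, kernel composition, and hyperparameter choices are illustrative assumptions, not the paper's glacier case study.

```python
# Illustrative sketch only (assumed data and kernel, not the paper's case study).
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import ConstantKernel, RBF, WhiteKernel

rng = np.random.default_rng(0)
X = rng.uniform(0.0, 10.0, size=(50, 1))                 # small, structured dataset
y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(50)

# Kernel design: smooth trend (RBF) plus observation noise (WhiteKernel);
# hyperparameters are fitted by maximising the log marginal likelihood.
kernel = ConstantKernel(1.0) * RBF(length_scale=1.0) + WhiteKernel(noise_level=0.1)
gp = GaussianProcessRegressor(kernel=kernel, normalize_y=True)
gp.fit(X, y)

X_test = np.linspace(0.0, 10.0, 200)[:, None]
mean, std = gp.predict(X_test, return_std=True)          # predictions with uncertainty
```

Swapping or composing kernels (for example, adding a periodic or linear component) is the kind of design decision the guidelines aim to make systematic rather than ad hoc.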
Related papers
- Interactive Segmentation as Gaussian Process Classification [58.44673380545409]
Click-based interactive segmentation (IS) aims to extract the target objects under user interaction.
Most of the current deep learning (DL)-based methods mainly follow the general pipelines of semantic segmentation.
We propose to formulate the IS task as a Gaussian process (GP)-based pixel-wise binary classification model on each image.
arXiv Detail & Related papers (2023-02-28T14:01:01Z)
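A minimal sketch of the pixel-wise GP binary classification idea described in the entry above, using scikit-learn's GaussianProcessClassifier; the toy image, pixel features, and click locations are assumptions for illustration, not the paper's pipeline.

```python
# Illustrative sketch: interactive segmentation as GP pixel-wise classification.
import numpy as np
from sklearn.gaussian_process import GaussianProcessClassifier
from sklearn.gaussian_process.kernels import RBF

rng = np.random.default_rng(0)
image = rng.random((32, 32))                              # stand-in for a real image
rows, cols = np.mgrid[0:32, 0:32]
features = np.stack([rows.ravel() / 32, cols.ravel() / 32, image.ravel()], axis=1)

# User clicks: a few foreground (1) and background (0) pixels (assumed indices).
click_idx = np.array([100, 200, 300, 700, 800, 900])
click_lbl = np.array([1, 1, 1, 0, 0, 0])

gpc = GaussianProcessClassifier(kernel=RBF(length_scale=0.3))
gpc.fit(features[click_idx], click_lbl)
mask_prob = gpc.predict_proba(features)[:, 1].reshape(32, 32)  # soft segmentation mask
```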
- Revisiting Active Sets for Gaussian Process Decoders [0.0]
We develop a new estimate of the log-marginal likelihood based on recently discovered links to cross-validation.
We demonstrate that the resulting stochastic active sets (SAS) approximation significantly improves the robustness of GP decoder training.
arXiv Detail & Related papers (2022-09-10T10:49:31Z)
- Weighted Ensembles for Active Learning with Adaptivity [60.84896785303314]
This paper presents an ensemble of GP models with weights adapted to the labeled data collected incrementally.
Building on this novel EGP model, a suite of acquisition functions emerges based on the uncertainty and disagreement rules.
An adaptively weighted ensemble of EGP-based acquisition functions is also introduced to further robustify performance.
arXiv Detail & Related papers (2022-06-10T11:48:49Z)
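A minimal sketch of the weighted-ensemble idea above, assuming GP experts with different kernels weighted by their fitted log marginal likelihoods and a simple uncertainty-based query rule; this is an illustration, not the paper's EGP acquisition functions.

```python
# Illustrative sketch: ensemble of GPs with data-adaptive weights for active learning.
import numpy as np
from scipy.special import softmax
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, Matern

rng = np.random.default_rng(0)
X = rng.uniform(0.0, 1.0, size=(15, 1))                   # small labeled set
y = np.sin(6 * X[:, 0]) + 0.1 * rng.standard_normal(15)

experts = [GaussianProcessRegressor(kernel=k) for k in (RBF(0.1), RBF(0.5), Matern(0.2))]
for gp in experts:
    gp.fit(X, y)

# Data-adaptive weights from each expert's log marginal likelihood.
w = softmax([gp.log_marginal_likelihood_value_ for gp in experts])

pool = np.linspace(0.0, 1.0, 200)[:, None]                # unlabeled candidate pool
preds = [gp.predict(pool, return_std=True) for gp in experts]
mean = sum(wi * m for wi, (m, _) in zip(w, preds))
var = sum(wi * (s**2 + m**2) for wi, (m, s) in zip(w, preds)) - mean**2  # mixture variance
x_next = pool[np.argmax(var)]                             # query the most uncertain point
```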
- Scaling Gaussian Process Optimization by Evaluating a Few Unique Candidates Multiple Times [119.41129787351092]
We show that sequential black-box optimization based on GPs can be made efficient by sticking to a candidate solution for multiple evaluation steps.
We modify two well-established GP-Opt algorithms, GP-UCB and GP-EI, to adapt rules from batched GP-Opt.
arXiv Detail & Related papers (2022-01-30T20:42:14Z)
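A minimal sketch of the idea above: a GP-UCB loop that sticks to the selected candidate for several noisy evaluations before refitting the GP, reducing the number of expensive model updates. The objective, kernel, and repeat count are assumptions, not the modified GP-UCB/GP-EI algorithms from the paper.

```python
# Illustrative sketch: GP-UCB with repeated evaluations of each selected candidate.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

def objective(x, rng):                       # noisy black-box function (assumed)
    return -(x - 0.3) ** 2 + 0.05 * rng.standard_normal()

rng = np.random.default_rng(0)
candidates = np.linspace(0.0, 1.0, 200)[:, None]
X, y = [[0.5]], [objective(0.5, rng)]
gp = GaussianProcessRegressor(kernel=RBF(0.2) + WhiteKernel(0.05))

for _ in range(5):                           # outer optimisation rounds
    gp.fit(np.asarray(X), np.asarray(y))
    mean, std = gp.predict(candidates, return_std=True)
    x_next = candidates[np.argmax(mean + 2.0 * std)]     # UCB acquisition
    for _ in range(3):                       # stick to the same candidate for 3 evaluations
        X.append(list(x_next))
        y.append(objective(x_next[0], rng))
```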
- Non-Gaussian Gaussian Processes for Few-Shot Regression [71.33730039795921]
We propose an invertible ODE-based mapping that operates on each component of the random variable vectors and shares the parameters across all of them.
NGGPs outperform the competing state-of-the-art approaches on a diversified set of benchmarks and applications.
arXiv Detail & Related papers (2021-10-26T10:45:25Z)
- Incremental Ensemble Gaussian Processes [53.3291389385672]
We propose an incremental ensemble (IE-) GP framework, where an EGP meta-learner employs an ensemble of GP learners, each having a unique kernel belonging to a prescribed kernel dictionary.
With each GP expert leveraging the random feature-based approximation to perform online prediction and model update with scalability, the EGP meta-learner capitalizes on data-adaptive weights to synthesize the per-expert predictions.
The novel IE-GP is generalized to accommodate time-varying functions by modeling structured dynamics at the EGP meta-learner and within each GP learner.
arXiv Detail & Related papers (2021-10-13T15:11:25Z)
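A minimal sketch of the random feature-based approximation mentioned above, for a single GP expert: random Fourier features for an RBF kernel reduce prediction and model updates to Bayesian linear regression in feature space, which is what makes streaming updates cheap. The feature count and kernel hyperparameters are illustrative assumptions.

```python
# Illustrative sketch: random Fourier features + online Bayesian linear regression.
import numpy as np

rng = np.random.default_rng(0)
D, lengthscale, noise = 100, 1.0, 0.1                    # feature count, kernel params
W = rng.standard_normal((1, D)) / lengthscale            # spectral frequencies (1-D inputs)
b = rng.uniform(0.0, 2 * np.pi, D)

def phi(x):                                              # random Fourier feature map
    return np.sqrt(2.0 / D) * np.cos(x @ W + b)

# Online updates via sufficient statistics: A = I + Phi^T Phi / noise^2, b_vec = Phi^T y / noise^2.
A = np.eye(D)
b_vec = np.zeros(D)
for t in range(500):                                     # streaming observations
    x_t = rng.uniform(-3.0, 3.0, size=(1, 1))
    y_t = np.sin(x_t[0, 0]) + 0.1 * rng.standard_normal()
    f = phi(x_t)[0]
    A += np.outer(f, f) / noise**2
    b_vec += f * y_t / noise**2

x_star = np.array([[0.5]])
f_star = phi(x_star)[0]
A_inv = np.linalg.inv(A)
mean = f_star @ A_inv @ b_vec                            # predictive mean
var = noise**2 + f_star @ A_inv @ f_star                 # predictive variance
```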
- Deep Gaussian Processes for geophysical parameter retrieval [15.400481898772158]
This paper introduces deep Gaussian processes (DGPs) for geophysical parameter retrieval.
Unlike the standard full GP model, the DGP accounts for complicated (modular, hierarchical) processes, and improves prediction accuracy over standard full and sparse GP models.
arXiv Detail & Related papers (2020-12-07T14:44:04Z)
- Sparse Gaussian Process Variational Autoencoders [24.86751422740643]
Existing approaches for performing inference in GP-DGMs do not support sparse GP approximations based on inducing points.
We develop the sparse Gaussian process variational autoencoder (SGP-VAE), characterised by the use of partial inference networks for parameterising sparse GP approximations.
arXiv Detail & Related papers (2020-10-20T10:19:56Z)
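A minimal sketch of the inducing-point sparse GP approximation referred to above (a DTC-style approximation written out in NumPy); the data, inducing locations, and hyperparameters are assumptions, not the paper's SGP-VAE.

```python
# Illustrative sketch: DTC sparse GP regression with M inducing inputs.
import numpy as np

def rbf(A, B, lengthscale=1.0):
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-0.5 * d2 / lengthscale**2)

rng = np.random.default_rng(0)
X = rng.uniform(-3.0, 3.0, size=(500, 1))
y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(500)
Z = np.linspace(-3.0, 3.0, 20)[:, None]                  # M = 20 inducing inputs
noise = 0.1

Kuu = rbf(Z, Z) + 1e-6 * np.eye(len(Z))                  # M x M (jitter for stability)
Kuf = rbf(Z, X)                                          # M x N cross-covariances
Sigma = Kuu + Kuf @ Kuf.T / noise**2                     # M x M system replaces N x N
Sigma_inv = np.linalg.inv(Sigma)
Kuu_inv = np.linalg.inv(Kuu)

X_test = np.linspace(-3.0, 3.0, 100)[:, None]
Ksu = rbf(X_test, Z)                                     # test-vs-inducing covariances
mean = Ksu @ Sigma_inv @ (Kuf @ y) / noise**2            # DTC predictive mean
qss = np.einsum("ij,jk,ik->i", Ksu, Kuu_inv, Ksu)        # Nystrom term Q**
var = 1.0 - qss + np.einsum("ij,jk,ik->i", Ksu, Sigma_inv, Ksu)  # latent predictive variance
```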
- Near-linear Time Gaussian Process Optimization with Adaptive Batching and Resparsification [119.41129787351092]
We introduce BBKB, the first no-regret GP optimization algorithm that provably runs in near-linear time and selects candidates in batches.
We show that the same bound can be used to adaptively delay costly updates to the sparse GP approximation, achieving a near-constant per-step amortized cost.
arXiv Detail & Related papers (2020-02-23T17:43:29Z)