Hierarchical Inducing Point Gaussian Process for Inter-domain
Observations
- URL: http://arxiv.org/abs/2103.00393v1
- Date: Sun, 28 Feb 2021 04:20:58 GMT
- Title: Hierarchical Inducing Point Gaussian Process for Inter-domain
Observations
- Authors: Luhuan Wu, Andrew Miller, Lauren Anderson, Geoff Pleiss, David Blei,
John Cunningham
- Abstract summary: The hierarchical inducing point GP (HIP-GP) is a scalable
inter-domain GP inference method suitable for low-dimensional problems.
- Score: 9.880362989790923
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: We examine the general problem of inter-domain Gaussian Processes (GPs):
problems where the GP realization and the noisy observations of that
realization lie on different domains. When the mapping between those domains is
linear, such as integration or differentiation, inference is still closed form.
However, many of the scaling and approximation techniques that our community
has developed do not apply to this setting. In this work, we introduce the
hierarchical inducing point GP (HIP-GP), a scalable inter-domain GP inference
method that enables us to improve the approximation accuracy by increasing the
number of inducing points into the millions. HIP-GP, which relies on inducing
points with grid structure and a stationary kernel assumption, is suitable for
low-dimensional problems. In developing HIP-GP, we introduce (1) a fast
whitening strategy, and (2) a novel preconditioner for conjugate gradients
which can be helpful in general GP settings.
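As a hedged illustration of the closed-form inter-domain setting described above (a minimal NumPy sketch under an assumed RBF prior, not the HIP-GP method itself): when observations are of a derivative of the GP, differentiation is a linear operator, so the usual GP regression formulas apply with differentiated kernel blocks.

```python
import numpy as np

# Minimal sketch (assumed setup, not the HIP-GP algorithm): infer f from
# noisy observations of its derivative f'. Because differentiation is a
# linear operator, inference stays closed form once the kernel blocks
# are differentiated accordingly.

def rbf(x, xp, ell=1.0):
    # k(x, x') = exp(-(x - x')^2 / (2 ell^2))
    d = x[:, None] - xp[None, :]
    return np.exp(-0.5 * (d / ell) ** 2)

def rbf_df(x, xp, ell=1.0):
    # cov(f(x), f'(x')) = d/dx' k(x, x') = (x - x') / ell^2 * k(x, x')
    d = x[:, None] - xp[None, :]
    return (d / ell**2) * np.exp(-0.5 * (d / ell) ** 2)

def rbf_dfdf(x, xp, ell=1.0):
    # cov(f'(x), f'(x')) = d^2/(dx dx') k = (1/ell^2 - d^2/ell^4) * k
    d = x[:, None] - xp[None, :]
    return (1.0 / ell**2 - d**2 / ell**4) * np.exp(-0.5 * (d / ell) ** 2)

rng = np.random.default_rng(0)
xd = np.linspace(-3.0, 3.0, 40)                       # derivative observation sites
y = np.cos(xd) + 0.05 * rng.standard_normal(xd.size)  # f = sin  =>  f' = cos
noise = 0.05 ** 2

# Standard GP posterior algebra, with the inter-domain cross-covariance
# blocks replacing the usual kernel blocks.
xs = np.linspace(-3.0, 3.0, 100)
K = rbf_dfdf(xd, xd) + noise * np.eye(xd.size)
Ks = rbf_df(xs, xd)
mean = Ks @ np.linalg.solve(K, y)  # posterior mean of f at test points
```

The recovered mean closely tracks sin(x), the antiderivative of the observed signal; HIP-GP addresses how to scale exactly this kind of computation when the number of inducing points grows large.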
Related papers
- Deep Transformed Gaussian Processes [0.0]
Transformed Gaussian Processes (TGPs) are processes specified by transforming samples from the joint distribution of a prior process (typically a GP) using an invertible transformation.
We propose a generalization of TGPs named Deep Transformed Gaussian Processes (DTGPs), which follows the trend of concatenating layers of processes.
Experiments conducted evaluate the proposed DTGPs in multiple regression datasets, achieving good scalability and performance.
arXiv Detail & Related papers (2023-10-27T16:09:39Z)
- Spherical Inducing Features for Orthogonally-Decoupled Gaussian Processes [7.4468224549568705]
Gaussian processes (GPs) are often compared unfavorably to deep neural networks (NNs) for lacking the ability to learn representations.
Recent efforts to bridge the gap between GPs and deep NNs have yielded a new class of inter-domain variational GPs in which the inducing variables correspond to hidden units of a feedforward NN.
arXiv Detail & Related papers (2023-04-27T09:00:02Z)
- Interactive Segmentation as Gaussian Process Classification [58.44673380545409]
Click-based interactive segmentation (IS) aims to extract the target objects under user interaction.
Most of the current deep learning (DL)-based methods mainly follow the general pipelines of semantic segmentation.
We propose to formulate the IS task as a Gaussian process (GP)-based pixel-wise binary classification model on each image.
arXiv Detail & Related papers (2023-02-28T14:01:01Z)
- A mixed-categorical correlation kernel for Gaussian process [0.0]
We present a kernel-based approach that extends continuous exponential kernels to handle mixed-categorical variables.
The proposed kernel leads to a new GP surrogate that generalizes both the continuous relaxation and the Gower distance based GP models.
arXiv Detail & Related papers (2022-11-15T16:13:04Z)
- Scaling Gaussian Process Optimization by Evaluating a Few Unique Candidates Multiple Times [119.41129787351092]
We show that sequential black-box optimization based on GPs can be made efficient by sticking to a candidate solution for multiple evaluation steps.
We modify two well-established GP-Opt algorithms, GP-UCB and GP-EI, to adapt rules from batched GP-Opt.
arXiv Detail & Related papers (2022-01-30T20:42:14Z)
- Non-Gaussian Gaussian Processes for Few-Shot Regression [71.33730039795921]
We propose an invertible ODE-based mapping that operates on each component of the random variable vectors and shares the parameters across all of them.
NGGPs outperform the competing state-of-the-art approaches on a diversified set of benchmarks and applications.
arXiv Detail & Related papers (2021-10-26T10:45:25Z)
- Incremental Ensemble Gaussian Processes [53.3291389385672]
We propose an incremental ensemble (IE-) GP framework, where an EGP meta-learner employs an ensemble of GP learners, each having a unique kernel belonging to a prescribed kernel dictionary.
With each GP expert leveraging a random feature-based approximation to perform online prediction and model updates with scalability, the EGP meta-learner capitalizes on data-adaptive weights to synthesize the per-expert predictions.
The novel IE-GP is generalized to accommodate time-varying functions by modeling structured dynamics at the EGP meta-learner and within each GP learner.
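One hedged way to picture the data-adaptive weighting described above (a toy NumPy sketch with an invented two-kernel dictionary, not the actual IE-GP update): weight each GP expert by the softmax of its log marginal likelihood on the data, then average the per-expert predictions.

```python
import numpy as np

def rbf(x, xp, ell):
    # Squared-exponential kernel with lengthscale ell
    d = x[:, None] - xp[None, :]
    return np.exp(-0.5 * (d / ell) ** 2)

def gp_predict(xtr, ytr, xte, ell, noise=1e-2):
    # Exact GP regression for one expert; returns the predictive mean and
    # the log marginal likelihood of the training data under that expert.
    K = rbf(xtr, xtr, ell) + noise * np.eye(xtr.size)
    alpha = np.linalg.solve(K, ytr)
    _, logdet = np.linalg.slogdet(K)
    lml = -0.5 * (ytr @ alpha + logdet + xtr.size * np.log(2 * np.pi))
    return rbf(xte, xtr, ell) @ alpha, lml

xtr = np.linspace(0.0, 5.0, 30)
ytr = np.sin(xtr)
xte = np.linspace(0.0, 5.0, 50)

ells = (0.1, 1.0)  # invented kernel "dictionary": too wiggly vs. well matched
preds = [gp_predict(xtr, ytr, xte, ell) for ell in ells]
lmls = np.array([lml for _, lml in preds])
w = np.exp(lmls - lmls.max())
w /= w.sum()  # data-adaptive ensemble weights (softmax over evidence)
ensemble_mean = sum(wi * mi for wi, (mi, _) in zip(w, preds))
```

The weights concentrate on the expert whose kernel best explains the data, so the ensemble prediction is dominated by the well-matched lengthscale.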
arXiv Detail & Related papers (2021-10-13T15:11:25Z)
- Learning Structures in Earth Observation Data with Gaussian Processes [67.27044745471207]
This paper reviews the main theoretical GP developments in the field.
New algorithms that respect the signal and noise characteristics, that provide feature rankings automatically, and that allow applicability of associated uncertainty intervals are discussed.
arXiv Detail & Related papers (2020-12-22T10:46:37Z)
- Inter-domain Deep Gaussian Processes [45.28237107466283]
We propose an extension of inter-domain shallow GPs that combines the advantages of inter-domain and deep Gaussian processes (DGPs).
We demonstrate how to leverage existing approximate inference methods to perform simple and scalable approximate inference using inter-domain features in DGPs.
arXiv Detail & Related papers (2020-11-01T04:03:35Z)
- Graph Based Gaussian Processes on Restricted Domains [13.416168979487118]
In nonparametric regression, it is common for the inputs to fall in a restricted subset of Euclidean space.
We propose a new class of Graph Laplacian based GPs (GL-GPs) which learn a covariance that respects the geometry of the input domain.
We provide substantial theoretical support for the GL-GP methodology, and illustrate performance gains in various applications.
arXiv Detail & Related papers (2020-10-14T17:01:29Z)
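One simple instance of a covariance that respects the geometry of a restricted domain, in the spirit of the GL-GP entry above (a hypothetical heat-kernel sketch on a small path graph; the actual GL-GP construction differs in detail):

```python
import numpy as np

# Hypothetical sketch: build a heat-kernel covariance K = expm(-t L) from
# a graph Laplacian L, so that correlation follows graph distance rather
# than ambient Euclidean distance.

n = 20
A = np.zeros((n, n))
for i in range(n - 1):             # path graph: nodes chained in a line
    A[i, i + 1] = A[i + 1, i] = 1.0
L = np.diag(A.sum(axis=1)) - A     # combinatorial graph Laplacian

t = 0.5                            # diffusion time controls smoothness
lam, U = np.linalg.eigh(L)
K = (U * np.exp(-t * lam)) @ U.T   # heat kernel via the eigendecomposition

# K is symmetric positive definite, hence a valid GP covariance, and
# correlation decays with graph distance along the path.
```

Sampling a GP with this covariance yields functions that vary smoothly along the graph, which is the sense in which such priors "respect the geometry of the input domain."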
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.