Latent Map Gaussian Processes for Mixed Variable Metamodeling
- URL: http://arxiv.org/abs/2102.03935v1
- Date: Sun, 7 Feb 2021 22:21:53 GMT
- Title: Latent Map Gaussian Processes for Mixed Variable Metamodeling
- Authors: Nicholas Oune, Ramin Bostanabad
- Abstract summary: We introduce latent map Gaussian processes (LMGPs) that inherit the attractive properties of GPs but are also applicable to mixed data.
We show that LMGPs can handle variable-length inputs and provide insights into how qualitative inputs affect the response or interact with each other.
We also provide a neural network interpretation of LMGPs and study the effect of prior latent representations on their performance.
- Score: 0.0
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Gaussian processes (GPs) are ubiquitously used in sciences and engineering as
metamodels. Standard GPs, however, can only handle numerical or quantitative
variables. In this paper, we introduce latent map Gaussian processes (LMGPs)
that inherit the attractive properties of GPs but are also applicable to mixed
data that have both quantitative and qualitative inputs. The core idea behind
LMGPs is to learn a low-dimensional manifold where all qualitative inputs are
represented by some quantitative features. To learn this manifold, we first
assign a unique prior vector representation to each combination of qualitative
inputs. We then use a linear map to project these priors on a manifold that
characterizes the posterior representations. As the posteriors are
quantitative, they can be straightforwardly used in any standard correlation
function such as the Gaussian. Hence, the optimal map and the corresponding
manifold can be efficiently learned by maximizing the Gaussian likelihood
function. Through a wide range of analytical and real-world examples, we
demonstrate the advantages of LMGPs over state-of-the-art methods in terms of
accuracy and versatility. In particular, we show that LMGPs can handle
variable-length inputs and provide insights into how qualitative inputs affect
the response or interact with each other. We also provide a neural network
interpretation of LMGPs and study the effect of prior latent representations on
their performance.
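To make the construction concrete, here is a minimal sketch of the LMGP idea in PyTorch, assuming one quantitative input, one qualitative input with three levels, one-hot prior representations, and a 2D latent manifold; the toy data and all names are illustrative, not the authors' implementation. A learnable linear map projects the priors into the latent space, the Gaussian correlation acts on the combined quantitative-plus-latent features, and everything is fit by maximizing the Gaussian likelihood.

```python
import torch

torch.manual_seed(0)

# Toy mixed data: one quantitative input x in [0, 1], one qualitative
# input t with 3 levels; the response depends on both.
n, n_levels = 60, 3
x = torch.rand(n, 1)
t = torch.randint(0, n_levels, (n,))
y = torch.sin(6 * x.squeeze()) + 0.5 * t.float() + 0.05 * torch.randn(n)

# Prior representation: a unique one-hot vector per qualitative level
# (one per *combination* of levels in the general case).
priors = torch.eye(n_levels)[t]                  # (n, n_levels)

# Learnable linear map A projects the priors onto a 2D latent manifold.
A = torch.randn(n_levels, 2, requires_grad=True)
log_scale = torch.zeros(1, requires_grad=True)   # kernel length-scale
log_noise = torch.tensor(-3.0, requires_grad=True)

def neg_log_marginal_likelihood():
    z = priors @ A                                # posterior latent positions
    u = torch.cat([x, z], dim=1)                  # quantitative + latent features
    K = torch.exp(-torch.cdist(u, u).pow(2) / torch.exp(log_scale))
    K = K + torch.exp(log_noise) * torch.eye(n)   # Gaussian correlation + noise
    L = torch.linalg.cholesky(K)
    alpha = torch.cholesky_solve(y.unsqueeze(1), L)
    # Constant n/2 * log(2*pi) omitted; it does not affect the optimum.
    return 0.5 * (y.unsqueeze(1).T @ alpha).squeeze() + torch.log(torch.diag(L)).sum()

opt = torch.optim.Adam([A, log_scale, log_noise], lr=0.05)
for step in range(200):
    opt.zero_grad()
    loss = neg_log_marginal_likelihood()
    loss.backward()
    opt.step()

print("learned latent position per level:\n", (torch.eye(n_levels) @ A).detach())
```

The learned latent positions are the source of the interpretability claimed in the abstract: levels that affect the response similarly land close together on the manifold.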
Related papers
- Domain Invariant Learning for Gaussian Processes and Bayesian Exploration [39.83530605880014]
We propose a domain invariant learning algorithm for Gaussian processes (DIL-GP) with a min-max optimization on the likelihood.
Numerical experiments demonstrate the superiority of DIL-GP for predictions on several synthetic and real-world datasets.
arXiv Detail & Related papers (2023-12-18T16:13:34Z)
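As a loose illustration of the min-max idea in the DIL-GP summary above (not the paper's algorithm, whose domain-construction details are not given here), the sketch below fits shared GP hyperparameters by minimizing the worst per-domain negative log-likelihood over an assumed fixed split of the data into two domains.

```python
import torch

torch.manual_seed(0)

# Two synthetic "domains" whose inputs cover different regions.
x1 = torch.rand(30, 1) * 0.5            # domain 1: x in [0, 0.5]
x2 = 0.5 + torch.rand(30, 1) * 0.5      # domain 2: x in [0.5, 1]
y1 = torch.sin(8 * x1).squeeze() + 0.05 * torch.randn(30)
y2 = torch.sin(8 * x2).squeeze() + 0.05 * torch.randn(30)
domains = [(x1, y1), (x2, y2)]

log_scale = torch.zeros(1, requires_grad=True)
log_noise = torch.tensor(-3.0, requires_grad=True)

def gp_nll(x, y):
    K = torch.exp(-torch.cdist(x, x).pow(2) / torch.exp(log_scale))
    K = K + torch.exp(log_noise) * torch.eye(len(y))
    L = torch.linalg.cholesky(K)
    a = torch.cholesky_solve(y.unsqueeze(1), L)
    return 0.5 * (y.unsqueeze(1).T @ a).squeeze() + torch.log(torch.diag(L)).sum()

# Min-max: minimize the WORST domain's negative log-likelihood, so the
# learned hyperparameters cannot overfit any single domain.
opt = torch.optim.Adam([log_scale, log_noise], lr=0.05)
for step in range(150):
    opt.zero_grad()
    worst = torch.stack([gp_nll(x, y) for x, y in domains]).max()
    worst.backward()
    opt.step()
```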
- GP+: A Python Library for Kernel-based learning via Gaussian Processes [0.0]
We introduce GP+, an open-source library for kernel-based learning via Gaussian processes (GPs).
GP+ is built on PyTorch and provides a user-friendly and object-oriented tool for probabilistic learning and inference.
arXiv Detail & Related papers (2023-12-12T19:39:40Z)
- Interactive Segmentation as Gaussian Process Classification [58.44673380545409]
Click-based interactive segmentation (IS) aims to extract the target objects under user interaction.
Most of the current deep learning (DL)-based methods mainly follow the general pipelines of semantic segmentation.
We propose to formulate the IS task as a Gaussian process (GP)-based pixel-wise binary classification model on each image.
arXiv Detail & Related papers (2023-02-28T14:01:01Z)
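A toy sketch of the formulation's flavor: treat user clicks as labeled pixels, place a GP over simple per-pixel features, and threshold the posterior mean to get a mask. This substitutes plain label regression for full GP classification and invents the pixel features; it is not the paper's model.

```python
import numpy as np

# Toy "image": each pixel has (row, col, intensity) features.
H, W = 32, 32
rows, cols = np.meshgrid(np.arange(H), np.arange(W), indexing="ij")
intensity = (rows > H // 2).astype(float)             # two-region image
feats = np.stack([rows / H, cols / W, intensity], axis=-1).reshape(-1, 3)

# User clicks: +1 on the object (bottom half), -1 on background.
clicks = np.array([[24, 10], [28, 20], [5, 8], [8, 25]])
labels = np.array([1.0, 1.0, -1.0, -1.0])
train = feats[clicks[:, 0] * W + clicks[:, 1]]

def rbf(a, b, ls=0.3):
    d2 = ((a[:, None, :] - b[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * ls**2))

# GP posterior mean over all pixels; its sign gives the segmentation mask.
K = rbf(train, train) + 1e-4 * np.eye(len(labels))
mean = rbf(feats, train) @ np.linalg.solve(K, labels)
mask = (mean > 0).reshape(H, W)
print("object pixels:", mask.sum())
```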
- Weighted Ensembles for Active Learning with Adaptivity [60.84896785303314]
This paper presents an ensemble of GP (EGP) models with weights adapted to the labeled data collected incrementally.
Building on this novel EGP model, a suite of acquisition functions emerges based on the uncertainty and disagreement rules.
An adaptively weighted ensemble of EGP-based acquisition functions is also introduced to further robustify performance.
arXiv Detail & Related papers (2022-06-10T11:48:49Z)
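A minimal sketch of the weighted-ensemble idea under stated assumptions: the GP experts differ only in length-scale, their weights come from the marginal likelihood of the labeled data, and the acquisition is a weighted sum of per-expert UCB scores. The paper's actual weighting and acquisition rules may differ.

```python
import numpy as np

rng = np.random.default_rng(0)
x_train = rng.uniform(0, 1, (8, 1))
y_train = np.sin(6 * x_train).squeeze() + 0.05 * rng.normal(size=8)
x_cand = np.linspace(0, 1, 200)[:, None]       # candidate pool

def rbf(a, b, ls):
    return np.exp(-((a - b.T) ** 2) / (2 * ls**2))

def gp_posterior(ls, noise=1e-3):
    K = rbf(x_train, x_train, ls) + noise * np.eye(len(y_train))
    Ks = rbf(x_cand, x_train, ls)
    Kinv_y = np.linalg.solve(K, y_train)
    mu = Ks @ Kinv_y
    var = 1.0 - np.einsum("ij,ji->i", Ks, np.linalg.solve(K, Ks.T))
    # Log marginal likelihood (additive constant omitted).
    log_ml = -0.5 * y_train @ Kinv_y - 0.5 * np.linalg.slogdet(K)[1]
    return mu, np.maximum(var, 1e-12), log_ml

# Ensemble of GP experts differing in length-scale; weights track how
# well each expert explains the labeled data collected so far.
ls_grid = [0.05, 0.1, 0.3]
posts = [gp_posterior(ls) for ls in ls_grid]
log_w = np.array([p[2] for p in posts])
w = np.exp(log_w - log_w.max()); w /= w.sum()

# Adaptively weighted ensemble of per-expert UCB acquisition functions.
ucb = sum(wi * (mu + 2.0 * np.sqrt(var)) for wi, (mu, var, _) in zip(w, posts))
print("next query:", x_cand[np.argmax(ucb)])
```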
- A Sparse Expansion For Deep Gaussian Processes [33.29293167413832]
We propose an efficient scheme for accurate inference and efficient training based on a class of Gaussian processes called Tensor Markov GPs (TMGP).
Our numerical experiments on synthetic models and real datasets show the superior computational efficiency of DTMGP over existing DGP models.
arXiv Detail & Related papers (2021-12-11T00:59:33Z)
- Non-Gaussian Gaussian Processes for Few-Shot Regression [71.33730039795921]
We propose an invertible ODE-based mapping that operates on each component of the random variable vectors and shares the parameters across all of them.
NGGPs outperform the competing state-of-the-art approaches on a diversified set of benchmarks and applications.
arXiv Detail & Related papers (2021-10-26T10:45:25Z)
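To show the change-of-variables mechanics behind the NGGP summary above, the sketch below replaces the paper's ODE-based flow with a one-parameter monotone affine map shared across all output components; only the role of the invertible map is faithful, not its form.

```python
import torch

torch.manual_seed(0)

# Fixed toy GP prior over 20 outputs, with a non-Gaussian response.
x = torch.linspace(0, 1, 20).unsqueeze(1)
K = torch.exp(-torch.cdist(x, x).pow(2) / 0.1) + 1e-3 * torch.eye(20)
y = torch.sin(6 * x).squeeze() ** 3 + 0.05 * torch.randn(20)

# Shared invertible map applied to EVERY output component: a monotone
# affine transform stands in for the paper's ODE-defined flow.
a = torch.tensor(0.0, requires_grad=True)
b = torch.tensor(0.0, requires_grad=True)

def inverse_and_logdet(y):
    u = torch.exp(a) * y + b        # invertible since exp(a) > 0
    logdet = a * len(y)             # log |d u / d y|, summed over components
    return u, logdet

def nll():
    # p(y) = N(u; 0, K) * |du/dy|  =>  -log p(y) = gauss_nll(u) - logdet
    u, logdet = inverse_and_logdet(y)
    L = torch.linalg.cholesky(K)
    alpha = torch.cholesky_solve(u.unsqueeze(1), L)
    gauss = 0.5 * (u.unsqueeze(1).T @ alpha).squeeze() + torch.log(torch.diag(L)).sum()
    return gauss - logdet

opt = torch.optim.Adam([a, b], lr=0.05)
for _ in range(100):
    opt.zero_grad(); loss = nll(); loss.backward(); opt.step()
```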
- Incremental Ensemble Gaussian Processes [53.3291389385672]
We propose an incremental ensemble (IE-) GP framework, where an EGP meta-learner employs an ensemble of GP learners, each having a unique kernel belonging to a prescribed kernel dictionary.
With each GP expert leveraging the random feature-based approximation to perform online prediction and model update with scalability, the EGP meta-learner capitalizes on data-adaptive weights to synthesize the per-expert predictions.
The novel IE-GP is generalized to accommodate time-varying functions by modeling structured dynamics at the EGP meta-learner and within each GP learner.
arXiv Detail & Related papers (2021-10-13T15:11:25Z)
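A compact sketch of the two mechanisms the IE-GP summary names, with assumed details: each expert is a random-Fourier-feature GP updated recursively online, and the ensemble weights accumulate each expert's one-step-ahead predictive log-likelihood.

```python
import numpy as np

rng = np.random.default_rng(0)

# Each GP expert uses random Fourier features for its kernel, so online
# prediction and updates reduce to scalable Bayesian linear regression.
D, noise = 50, 0.01
lengthscales = [0.1, 0.5, 2.0]                 # one RBF kernel per expert
experts = []
for ls in lengthscales:
    W = rng.normal(0, 1.0 / ls, (D, 1))        # spectral frequencies of the RBF
    b = rng.uniform(0, 2 * np.pi, D)
    experts.append({"W": W, "b": b,
                    "P": np.eye(D), "m": np.zeros(D),  # posterior over weights
                    "log_w": 0.0})                     # ensemble log-weight

def phi(e, x):
    return np.sqrt(2.0 / D) * np.cos(e["W"] @ x + e["b"])

for t in range(200):
    x = rng.uniform(0, 1, 1)
    y = np.sin(4 * x[0]) + 0.1 * rng.normal()  # streaming observation
    for e in experts:
        f = phi(e, x)
        mu = e["m"] @ f
        var = f @ e["P"] @ f + noise
        # Data-adaptive weight: each expert's online predictive likelihood.
        e["log_w"] += -0.5 * ((y - mu) ** 2 / var + np.log(2 * np.pi * var))
        # Recursive Bayesian update of the feature-weight posterior.
        g = e["P"] @ f / var
        e["m"] = e["m"] + g * (y - mu)
        e["P"] = e["P"] - np.outer(g, f @ e["P"])

lw = np.array([e["log_w"] for e in experts])
w = np.exp(lw - lw.max()); w /= w.sum()
print("expert weights:", np.round(w, 3))
```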
- Matérn Gaussian Processes on Graphs [67.13902825728718]
We leverage the partial differential equation characterization of Matérn Gaussian processes to study their analog for undirected graphs.
We show that the resulting Gaussian processes inherit various attractive properties of their Euclidean and Riemannian analogs.
This enables graph Matérn Gaussian processes to be employed in mini-batch and non-conjugate settings.
arXiv Detail & Related papers (2020-10-29T13:08:07Z)
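The graph analog admits a short spectral sketch. Assuming the commonly cited form of the graph Matérn kernel, K = (2ν/κ² I + L)^(−ν) up to a normalizing constant, with L the graph Laplacian, one can form the kernel by eigendecomposition and run standard GP regression on node labels; the toy graph and hyperparameters below are assumptions.

```python
import numpy as np

# Small undirected graph: a cycle of 8 nodes.
n = 8
A = np.zeros((n, n))
for i in range(n):
    A[i, (i + 1) % n] = A[(i + 1) % n, i] = 1
L = np.diag(A.sum(1)) - A                      # graph Laplacian

# Graph Matérn kernel via the Laplacian eigendecomposition:
# K = (2*nu/kappa**2 * I + L)**(-nu), the graph analog of the
# PDE characterization of Euclidean Matérn processes.
nu, kappa = 1.5, 1.0
lam, U = np.linalg.eigh(L)
K = U @ np.diag((2 * nu / kappa**2 + lam) ** (-nu)) @ U.T

# GP regression on node labels using the graph kernel.
train, y = np.array([0, 3, 5]), np.array([1.0, -1.0, 0.5])
Ktt = K[np.ix_(train, train)] + 1e-4 * np.eye(3)
mean = K[:, train] @ np.linalg.solve(Ktt, y)
print("posterior mean on all nodes:", np.round(mean, 2))
```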
- Graph Based Gaussian Processes on Restricted Domains [13.416168979487118]
In nonparametric regression, it is common for the inputs to fall in a restricted subset of Euclidean space.
We propose a new class of Graph Laplacian based GPs (GL-GPs) which learn a covariance that respects the geometry of the input domain.
We provide substantial theoretical support for the GL-GP methodology, and illustrate performance gains in various applications.
arXiv Detail & Related papers (2020-10-14T17:01:29Z)
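A hedged sketch of a geometry-respecting covariance in the GL-GP spirit: build a kernel-weighted neighbor graph on the inputs, take its Laplacian, and use the heat kernel exp(−tL) as the GP covariance, so correlation diffuses along the domain rather than jumping across gaps. The bandwidth and diffusion time are assumptions, not the paper's settings.

```python
import numpy as np
from scipy.linalg import expm

rng = np.random.default_rng(0)

# Inputs confined to a curve (a restricted domain in the plane).
theta = np.sort(rng.uniform(0, 2 * np.pi, 40))
X = np.stack([np.cos(theta), np.sin(theta)], axis=1)

# Graph Laplacian from a kernel-weighted neighbor graph on the inputs.
d2 = ((X[:, None] - X[None, :]) ** 2).sum(-1)
W = np.exp(-d2 / 0.1)
np.fill_diagonal(W, 0)
L = np.diag(W.sum(1)) - W

# Heat-kernel covariance exp(-t L): diffusion along the graph, so the
# covariance respects the intrinsic geometry of the input domain.
t = 0.05
K = expm(-t * L)
print("covariance, adjacent vs. far-apart points:",
      round(K[0, 1], 4), round(K[0, 20], 4))
```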
- Infinitely Wide Graph Convolutional Networks: Semi-supervised Learning via Gaussian Processes [144.6048446370369]
Graph convolutional neural networks (GCNs) have recently demonstrated promising results on graph-based semi-supervised classification.
We propose a GP regression model via GCNs (GPGC) for graph-based semi-supervised learning.
We conduct extensive experiments to evaluate GPGC and demonstrate that it outperforms other state-of-the-art methods.
arXiv Detail & Related papers (2020-02-26T10:02:32Z)
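As a simplified illustration of the GCN-to-GP connection (a linear, single-layer case, not the paper's kernel): an infinitely wide linear GCN layer f = ÂXW with Gaussian-distributed W induces the GP kernel K = (ÂX)(ÂX)ᵀ, where Â is the normalized adjacency with self-loops, which can then drive GP regression on the labeled nodes.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy graph: two 5-node clusters joined by one edge.
n = 10
A = np.zeros((n, n))
for c in (range(5), range(5, 10)):
    for i in c:
        for j in c:
            if i != j:
                A[i, j] = 1
A[4, 5] = A[5, 4] = 1
A_hat = A + np.eye(n)                          # add self-loops
d = A_hat.sum(1)
A_hat = A_hat / np.sqrt(np.outer(d, d))        # symmetric normalization

X = rng.normal(size=(n, 3))                    # node features

# Kernel of a wide linear GCN layer: propagate features with A_hat,
# then take inner products (the GP limit of f = A_hat @ X @ W).
H = A_hat @ X
K = H @ H.T

# Semi-supervised GP regression: two labeled nodes, predict the rest.
train, y = np.array([0, 9]), np.array([1.0, -1.0])
Ktt = K[np.ix_(train, train)] + 1e-3 * np.eye(2)
pred = K[:, train] @ np.linalg.solve(Ktt, y)
print("per-node scores:", np.round(pred, 2))
```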
This list is automatically generated from the titles and abstracts of the papers on this site.