A Similarity Measure of Gaussian Process Predictive Distributions
- URL: http://arxiv.org/abs/2101.08061v1
- Date: Wed, 20 Jan 2021 10:52:48 GMT
- Title: A Similarity Measure of Gaussian Process Predictive Distributions
- Authors: Lucía Asencio-Martín, Eduardo C. Garrido-Merchán
- Abstract summary: We are interested in using a model that makes valid assumptions about the objective function whose values we are trying to predict.
We show empirical evidence, in a set of synthetic and benchmark experiments, that GP predictive distributions can be compared.
This similarity metric could be extremely useful for discarding objectives in Bayesian many-objective optimization.
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Some scenarios require computing the predictive distribution of a new
value of an objective function, conditioned on previous observations. We are
interested in using a model that makes valid assumptions about the objective
function whose values we are trying to predict, such as smoothness or
stationarity. Gaussian processes (GPs) are probabilistic models that can be
interpreted as flexible distributions over functions. They encode these
assumptions through covariance functions and, once fitted to previous
observations, make hypotheses about new data through a predictive distribution.
We may face the case where several GPs are used to model different objective
functions. GPs are non-parametric models whose training complexity is cubic in
the number of observations. A measure of how similar one GP predictive
distribution is to another would therefore be useful: it lets us stop using one
of the GPs when both model functions of the same input space. In effect, we
infer that the two objective functions are correlated, so a single GP suffices
to model both, applying a transformation to its predictions when the
correlation is inverse. We show empirical evidence, in a set of synthetic and
benchmark experiments, that GP predictive distributions can be compared and
that one of them is enough to predict two correlated functions on the same
input space. This similarity metric could be extremely useful for discarding
objectives in Bayesian many-objective optimization.
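The comparison the abstract describes can be sketched numerically. The snippet below is a minimal illustration, not the paper's exact metric: it fits independent GPs (squared-exponential kernel, fixed hyperparameters, both assumptions for illustration) to three objectives on the same inputs and compares their predictive distributions with a pointwise symmetric KL divergence between Gaussians, averaged over a test grid. The Cholesky factorization is the O(n^3) step the abstract refers to.

```python
import numpy as np

def rbf(X1, X2, ls=1.0):
    # Squared-exponential kernel; encodes the smoothness assumption.
    d = X1[:, None] - X2[None, :]
    return np.exp(-0.5 * (d / ls) ** 2)

def gp_posterior(X, y, Xs, noise=1e-2):
    # Standard GP regression posterior. The Cholesky factorization of the
    # n x n kernel matrix is the cubic-cost step.
    K = rbf(X, X) + noise * np.eye(len(X))
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))
    Ks = rbf(X, Xs)
    mu = Ks.T @ alpha
    v = np.linalg.solve(L, Ks)
    var = rbf(Xs, Xs).diagonal() - np.sum(v * v, axis=0) + noise
    return mu, var

def sym_kl_gaussians(m1, v1, m2, v2):
    # Symmetric KL between univariate Gaussians at each test point,
    # averaged over the grid: one possible similarity measure.
    kl = lambda ma, va, mb, vb: 0.5 * (np.log(vb / va)
                                       + (va + (ma - mb) ** 2) / vb - 1.0)
    return float(np.mean(kl(m1, v1, m2, v2) + kl(m2, v2, m1, v1)))

rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=25)
f1 = np.sin(X)
f2 = np.sin(X) + 0.05 * rng.normal(size=25)  # strongly correlated objective
f3 = np.cos(2 * X)                           # unrelated objective
Xs = np.linspace(-3, 3, 100)

m1, v1 = gp_posterior(X, f1, Xs)
m2, v2 = gp_posterior(X, f2, Xs)
m3, v3 = gp_posterior(X, f3, Xs)

print(sym_kl_gaussians(m1, v1, m2, v2))  # small: distributions nearly agree
print(sym_kl_gaussians(m1, v1, m3, v3))  # much larger: distributions differ
```

When the divergence is small, one GP can stand in for the other; for an inverse correlation, the transformation the abstract mentions could be as simple as negating the borrowed predictive mean before comparing.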
Related papers
- von Mises Quasi-Processes for Bayesian Circular Regression [57.88921637944379]
We explore a family of expressive and interpretable distributions over circle-valued random functions.
The resulting probability model has connections with continuous spin models in statistical physics.
For posterior inference, we introduce a new Stratonovich-like augmentation that lends itself to fast Markov Chain Monte Carlo sampling.
arXiv Detail & Related papers (2024-06-19T01:57:21Z) - GP+: A Python Library for Kernel-based learning via Gaussian Processes [0.0]
We introduce GP+, an open-source library for kernel-based learning via Gaussian processes (GPs).
GP+ is built on PyTorch and provides a user-friendly and object-oriented tool for probabilistic learning and inference.
arXiv Detail & Related papers (2023-12-12T19:39:40Z) - Structured Radial Basis Function Network: Modelling Diversity for Multiple Hypotheses Prediction [51.82628081279621]
Multi-modal regression is important when forecasting non-stationary processes or processes with a complex mixture of distributions.
A Structured Radial Basis Function Network is presented as an ensemble of multiple hypotheses predictors for regression problems.
It is proved that this structured model can efficiently interpolate the underlying tessellation and approximate the multiple-hypotheses target distribution.
arXiv Detail & Related papers (2023-09-02T01:27:53Z) - Gaussian Process Probes (GPP) for Uncertainty-Aware Probing [61.91898698128994]
We introduce a unified and simple framework for probing and measuring uncertainty about concepts represented by models.
Our experiments show it can (1) probe a model's representations of concepts even with a very small number of examples, (2) accurately measure both epistemic uncertainty (how confident the probe is) and aleatory uncertainty (how fuzzy the concepts are to the model), and (3) detect out-of-distribution data using those uncertainty measures as well as classic methods do.
arXiv Detail & Related papers (2023-05-29T17:00:16Z) - Scalable mixed-domain Gaussian process modeling and model reduction for longitudinal data [5.00301731167245]
We derive a basis function approximation scheme for mixed-domain covariance functions.
We show that we can approximate the exact GP model accurately in a fraction of the runtime.
We also demonstrate a scalable model reduction workflow for obtaining smaller and more interpretable models.
arXiv Detail & Related papers (2021-11-03T04:47:37Z) - Non-Gaussian Gaussian Processes for Few-Shot Regression [71.33730039795921]
We propose an invertible ODE-based mapping that operates on each component of the random variable vectors and shares the parameters across all of them.
NGGPs outperform the competing state-of-the-art approaches on a diversified set of benchmarks and applications.
arXiv Detail & Related papers (2021-10-26T10:45:25Z) - PSD Representations for Effective Probability Models [117.35298398434628]
We show that a recently proposed class of positive semi-definite (PSD) models for non-negative functions is particularly suited to this end.
We characterize both approximation and generalization capabilities of PSD models, showing that they enjoy strong theoretical guarantees.
Our results open the way to applications of PSD models to density estimation, decision theory and inference.
arXiv Detail & Related papers (2021-06-30T15:13:39Z) - Multivariate Probabilistic Regression with Natural Gradient Boosting [63.58097881421937]
We propose a Natural Gradient Boosting (NGBoost) approach based on nonparametrically modeling the conditional parameters of the multivariate predictive distribution.
Our method is robust, works out-of-the-box without extensive tuning, is modular with respect to the assumed target distribution, and performs competitively in comparison to existing approaches.
arXiv Detail & Related papers (2021-06-07T17:44:49Z) - Gaussian Process Regression with Local Explanation [28.90948136731314]
We propose GPR with local explanation, which reveals the feature contributions to the prediction of each sample.
In the proposed model, both the prediction and explanation for each sample are performed using an easy-to-interpret locally linear model.
For a new test sample, the proposed model can predict the values of its target variable and weight vector, as well as their uncertainties.
arXiv Detail & Related papers (2020-07-03T13:22:24Z) - Skew Gaussian Processes for Classification [0.225596179391365]
We propose Skew-Gaussian processes (SkewGPs) as a non-parametric prior over functions.
SkewGPs inherit all good properties of GPs and increase their flexibility by allowing asymmetry in the probabilistic model.
arXiv Detail & Related papers (2020-05-26T19:13:03Z) - Accounting for Input Noise in Gaussian Process Parameter Retrieval [9.563129471152058]
We show how one can account for input noise estimates using a GP model formulation which propagates the error terms using the derivative of the predictive mean function.
We analyze the resulting predictive variance term and show how they more accurately represent the model error in a temperature prediction problem from infrared sounding data.
arXiv Detail & Related papers (2020-05-20T08:23:48Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the information above and is not responsible for any consequences of its use.