Accounting for Input Noise in Gaussian Process Parameter Retrieval
- URL: http://arxiv.org/abs/2005.09907v1
- Date: Wed, 20 May 2020 08:23:48 GMT
- Title: Accounting for Input Noise in Gaussian Process Parameter Retrieval
- Authors: J. Emmanuel Johnson, Valero Laparra, Gustau Camps-Valls
- Abstract summary: We show how one can account for input noise estimates using a GP model formulation which propagates the error terms through the derivative of the predictive mean function.
We analyze the resulting predictive variance term and show how it more accurately represents the model error in a temperature prediction problem from infrared sounding data.
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Gaussian processes (GPs) are a class of kernel methods that have
proven very useful in geoscience and remote sensing applications for parameter
retrieval, model inversion, and emulation. They are widely used because they
are simple, flexible, and provide accurate estimates. GPs are based on a
Bayesian statistical framework which provides a posterior probability function
for each estimation. Therefore, besides the usual prediction (given in this
case by the mean function), GPs come equipped with the possibility to obtain a
predictive variance (i.e., error bars, confidence intervals) for each
prediction. Unfortunately, the GP formulation usually assumes that there is no
noise in the inputs, only in the observations. However, this is often not the
case in earth observation problems where an accurate assessment of the
measuring instrument error is typically available, and where there is huge
interest in characterizing the error propagation through the processing
pipeline. In this letter, we demonstrate how one can account for input noise
estimates using a GP model formulation which propagates the error terms using
the derivative of the predictive mean function. We analyze the resulting
predictive variance term and show how it more accurately represents the model
error in a temperature prediction problem from infrared sounding data.
Related papers
- SMURF-THP: Score Matching-based UnceRtainty quantiFication for Transformer Hawkes Process [76.98721879039559]
We propose SMURF-THP, a score-based method for learning Transformer Hawkes process and quantifying prediction uncertainty.
Specifically, SMURF-THP learns the score function of events' arrival time based on a score-matching objective.
We conduct extensive experiments in both event type prediction and uncertainty quantification of arrival time.
arXiv Detail & Related papers (2023-10-25T03:33:45Z)
- Leveraging Locality and Robustness to Achieve Massively Scalable Gaussian Process Regression [1.3518297878940662]
We introduce a new perspective by exploring robustness properties and limiting behaviour of GP nearest-neighbour (GPnn) prediction.
As the data-size n increases, accuracy of estimated parameters and GP model assumptions become increasingly irrelevant to GPnn predictive accuracy.
We show that this source of inaccuracy can be corrected for, thereby achieving both well-calibrated uncertainty measures and accurate predictions at remarkably low computational cost.
arXiv Detail & Related papers (2023-06-26T14:32:46Z)
- Non-Gaussian Gaussian Processes for Few-Shot Regression [71.33730039795921]
We propose an invertible ODE-based mapping that operates on each component of the random variable vectors and shares the parameters across all of them.
NGGPs outperform the competing state-of-the-art approaches on a diversified set of benchmarks and applications.
arXiv Detail & Related papers (2021-10-26T10:45:25Z)
- Imputation-Free Learning from Incomplete Observations [73.15386629370111]
We introduce the importance-guided stochastic gradient descent (IGSGD) method to train models that infer from inputs containing missing values without imputation.
We employ reinforcement learning (RL) to adjust the gradients used to train the models via back-propagation.
Our imputation-free predictions outperform the traditional two-step imputation-based predictions using state-of-the-art imputation methods.
arXiv Detail & Related papers (2021-07-05T12:44:39Z)
- SLOE: A Faster Method for Statistical Inference in High-Dimensional Logistic Regression [68.66245730450915]
We develop an improved method for debiasing predictions and estimating frequentist uncertainty for practical datasets.
Our main contribution is SLOE, an estimator of the signal strength with convergence guarantees that reduces the computation time of estimation and inference by orders of magnitude.
arXiv Detail & Related papers (2021-03-23T17:48:56Z)
- A Similarity Measure of Gaussian Process Predictive Distributions [0.0]
We are interested in using a model that makes valid assumptions about the objective function whose values we are trying to predict.
We show empirical evidence in a set of synthetic and benchmark experiments that GPs predictive distributions can be compared.
This similarity metric could be extremely useful for discarding objectives in Bayesian many-objective optimization.
arXiv Detail & Related papers (2021-01-20T10:52:48Z)
- Disentangling Derivatives, Uncertainty and Error in Gaussian Process Models [12.229461458053809]
We showcase how the derivative of a GP model can be used to provide an analytical error propagation formulation.
We analyze the predictive variance and the propagated error terms in a temperature prediction problem from infrared sounding data.
arXiv Detail & Related papers (2020-12-09T10:03:13Z)
- Uncertainty quantification using martingales for misspecified Gaussian processes [52.22233158357913]
We address uncertainty quantification for Gaussian processes (GPs) under misspecified priors.
We construct a confidence sequence (CS) for the unknown function using martingale techniques.
Our CS is statistically valid and empirically outperforms standard GP methods.
arXiv Detail & Related papers (2020-06-12T17:58:59Z)
- Estimation of Accurate and Calibrated Uncertainties in Deterministic models [0.8702432681310401]
We devise a method to transform a deterministic prediction into a probabilistic one.
We show that for doing so, one has to compromise between the accuracy and the reliability (calibration) of such a model.
We show several examples both with synthetic data, where the underlying hidden noise can accurately be recovered, and with large real-world datasets.
arXiv Detail & Related papers (2020-03-11T04:02:56Z)
- SLEIPNIR: Deterministic and Provably Accurate Feature Expansion for Gaussian Process Regression with Derivatives [86.01677297601624]
We propose a novel approach for scaling GP regression with derivatives based on quadrature Fourier features.
We prove deterministic, non-asymptotic and exponentially fast decaying error bounds which apply for both the approximated kernel as well as the approximated posterior.
arXiv Detail & Related papers (2020-03-05T14:33:20Z)
- Robust Gaussian Process Regression with a Bias Model [0.6850683267295248]
Most existing approaches replace an outlier-prone Gaussian likelihood with a non-Gaussian likelihood induced from a heavy tail distribution.
The proposed approach models an outlier as a noisy and biased observation of an unknown regression function.
Conditioned on the bias estimates, the robust GP regression can be reduced to a standard GP regression problem.
arXiv Detail & Related papers (2020-01-14T06:21:51Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences.