Robust Gaussian Process Regression Based on Iterative Trimming
- URL: http://arxiv.org/abs/2011.11057v2
- Date: Sun, 13 Jun 2021 13:49:02 GMT
- Title: Robust Gaussian Process Regression Based on Iterative Trimming
- Authors: Zhao-Zhou Li, Lu Li, Zhengyi Shao
- Abstract summary: This paper presents a new robust GP regression algorithm that iteratively trims the most extreme data points.
It can greatly improve the model accuracy for contaminated data even in the presence of extreme or abundant outliers.
As a practical example from astrophysics, we show that this method can precisely determine the main-sequence ridge line in the color-magnitude diagram of star clusters.
- Score: 6.912744078749024
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Gaussian process (GP) regression can be severely biased when the data are
contaminated by outliers. This paper presents a new robust GP regression
algorithm that iteratively trims the most extreme data points. While the new
algorithm retains the attractive properties of the standard GP as a
nonparametric and flexible regression method, it can greatly improve the model
accuracy for contaminated data even in the presence of extreme or abundant
outliers. It is also easier to implement compared with previous robust GP
variants that rely on approximate inference. Applied to a wide range of
experiments with different contamination levels, the proposed method
significantly outperforms the standard GP and the popular robust GP variant
with the Student-t likelihood in most test cases. In addition, as a practical
example from astrophysics, we show that this method can precisely
determine the main-sequence ridge line in the color-magnitude diagram of star
clusters.
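The core loop is straightforward to prototype. Below is a minimal sketch of the iterative-trimming idea using scikit-learn's GaussianProcessRegressor: fit a GP, score every point by its standardized residual, drop a fixed fraction with the largest scores, and refit until the retained subset stops changing. The trimming fraction, the residual score, and the stopping rule are illustrative simplifications, not the paper's exact procedure.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

def iterative_trimming_gp(X, y, trim_frac=0.1, n_iter=5, random_state=0):
    """Repeatedly fit a GP, discarding the points with the largest
    standardized residuals (a simplified iterative-trimming loop)."""
    X = np.asarray(X, dtype=float).reshape(len(y), -1)
    y = np.asarray(y, dtype=float)
    keep = np.ones(len(y), dtype=bool)  # start from the full sample
    kernel = 1.0 * RBF(length_scale=1.0) + WhiteKernel(noise_level=0.1)

    def fit(mask):
        gp = GaussianProcessRegressor(kernel=kernel, normalize_y=True,
                                      random_state=random_state)
        return gp.fit(X[mask], y[mask])

    for _ in range(n_iter):
        gp = fit(keep)
        mu, std = gp.predict(X, return_std=True)
        score = np.abs(y - mu) / np.maximum(std, 1e-12)
        new_keep = score <= np.quantile(score, 1.0 - trim_frac)
        if np.array_equal(new_keep, keep):  # trimmed subset has stabilized
            break
        keep = new_keep
    return fit(keep), keep
```

In practice the trimming fraction should be set at or above the expected contamination level; if it is too small, the surviving outliers can still bias the fit.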
Related papers
- Gaussian Process Regression with Soft Inequality and Monotonicity Constraints [0.0]
We introduce a new GP method that enforces the physical constraints in a probabilistic manner.
This GP model is trained with quantum-inspired Hamiltonian Monte Carlo (QHMC).
arXiv Detail & Related papers (2024-04-03T17:09:25Z)
- Sparse Variational Contaminated Noise Gaussian Process Regression with Applications in Geomagnetic Perturbations Forecasting [4.675221539472143]
We propose a scalable inference algorithm for fitting sparse Gaussian process regression models with contaminated normal noise on large datasets.
We show that our approach yields shorter prediction intervals for similar coverage and accuracy when compared to an artificial dense neural network baseline.
arXiv Detail & Related papers (2024-02-27T15:08:57Z)
- Model-Based Reparameterization Policy Gradient Methods: Theory and Practical Algorithms [88.74308282658133]
Reparameterization (RP) Policy Gradient Methods (PGMs) have been widely adopted for continuous control tasks in robotics and computer graphics.
Recent studies have revealed that, when applied to long-term reinforcement learning problems, model-based RP PGMs may experience chaotic and non-smooth optimization landscapes.
We propose a spectral normalization method to mitigate the exploding variance issue caused by long model unrolls.
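As a rough illustration of the spectral-normalization idea mentioned above (not the authors' exact construction), one can wrap the layers of a learned dynamics model with PyTorch's spectral_norm, which bounds each layer's largest singular value and hence tames the Lipschitz constant of long model unrolls through which reparameterized gradients flow. The DynamicsModel class and its dimensions below are hypothetical.

```python
import torch
import torch.nn as nn
from torch.nn.utils.parametrizations import spectral_norm

class DynamicsModel(nn.Module):
    """Toy learned dynamics model s_{t+1} = s_t + f(s_t, a_t).
    Spectral normalization keeps each linear map 1-Lipschitz, so
    gradients through long rollouts are less prone to exploding."""
    def __init__(self, state_dim=8, action_dim=2, hidden=128):
        super().__init__()
        self.net = nn.Sequential(
            spectral_norm(nn.Linear(state_dim + action_dim, hidden)),
            nn.Tanh(),
            spectral_norm(nn.Linear(hidden, hidden)),
            nn.Tanh(),
            spectral_norm(nn.Linear(hidden, state_dim)),
        )

    def forward(self, state, action):
        # Residual update; gradients flow through this map at every unroll step.
        return state + self.net(torch.cat([state, action], dim=-1))
```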
arXiv Detail & Related papers (2023-10-30T18:43:21Z)
- Implicit Manifold Gaussian Process Regression [49.0787777751317]
Gaussian process regression is widely used to provide well-calibrated uncertainty estimates.
It struggles with high-dimensional data, which often lies on an implicit low-dimensional manifold that standard GPs do not exploit.
In this paper we propose a technique capable of inferring implicit structure directly from data (labeled and unlabeled) in a fully differentiable way.
arXiv Detail & Related papers (2023-10-30T09:52:48Z)
- Scalable Gaussian-process regression and variable selection using Vecchia approximations [3.4163060063961255]
We propose Vecchia-based mini-batch subsampling, which provides unbiased gradient estimators.
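The unbiased mini-batch gradients come from the structure of the Vecchia approximation: the approximate log-likelihood is a sum of per-observation conditional terms, so subsampling those terms yields an unbiased gradient estimate. Below is a minimal NumPy sketch of that factorization, assuming a zero-mean GP with a fixed RBF kernel and a simple nearest-earlier-neighbour conditioning rule; the paper's variable-selection machinery is not shown.

```python
import numpy as np

def rbf(A, B, lengthscale=1.0, variance=1.0):
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return variance * np.exp(-0.5 * d2 / lengthscale**2)

def vecchia_loglik(X, y, m=10, noise=0.1, lengthscale=1.0, variance=1.0):
    """Vecchia approximation: log p(y) ~ sum_i log p(y_i | y_c(i)),
    where c(i) holds at most m previously-ordered nearest neighbours.
    X is an (n, d) array; only small neighbour blocks are ever formed."""
    n, ll = len(y), 0.0
    for i in range(n):
        if i == 0:
            mu, var = 0.0, variance + noise
        else:
            d = np.linalg.norm(X[:i] - X[i], axis=1)
            c = np.argsort(d)[:m]                 # nearest earlier points
            Kcc = rbf(X[c], X[c], lengthscale, variance) + noise * np.eye(len(c))
            kic = rbf(X[i:i + 1], X[c], lengthscale, variance)[0]
            sol = np.linalg.solve(Kcc, kic)
            mu = sol @ y[c]
            var = variance + noise - kic @ sol
        ll += -0.5 * (np.log(2 * np.pi * var) + (y[i] - mu) ** 2 / var)
    return ll
```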
arXiv Detail & Related papers (2022-02-25T21:22:38Z)
- Non-Gaussian Gaussian Processes for Few-Shot Regression [71.33730039795921]
We propose an invertible ODE-based mapping that operates on each component of the random variable vectors and shares the parameters across all of them.
NGGPs outperform the competing state-of-the-art approaches on a diversified set of benchmarks and applications.
arXiv Detail & Related papers (2021-10-26T10:45:25Z)
- Incremental Ensemble Gaussian Processes [53.3291389385672]
We propose an incremental ensemble (IE-) GP framework, where an EGP meta-learner employs an ensemble of GP learners, each having a unique kernel belonging to a prescribed kernel dictionary.
With each GP expert leveraging the random feature-based approximation to perform online prediction and model update with scalability, the EGP meta-learner capitalizes on data-adaptive weights to synthesize the per-expert predictions.
The novel IE-GP is generalized to accommodate time-varying functions by modeling structured dynamics at the EGP meta-learner and within each GP learner.
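The per-expert scalability comes from the random-feature approximation, under which each GP learner reduces to Bayesian linear regression over a fixed feature map and can be updated one observation at a time. A minimal sketch of a single expert, assuming a unit-variance RBF kernel; the ensemble weighting and the kernel dictionary of the IE-GP framework are not shown.

```python
import numpy as np

class RFFExpert:
    """One GP 'expert' via D random Fourier features of an RBF kernel.
    The posterior over feature weights is updated in streaming fashion."""
    def __init__(self, dim, D=200, lengthscale=1.0, noise=0.1, seed=0):
        rng = np.random.default_rng(seed)
        self.W = rng.normal(scale=1.0 / lengthscale, size=(D, dim))
        self.b = rng.uniform(0.0, 2 * np.pi, size=D)
        self.D, self.noise = D, noise
        self.P = np.eye(D)       # posterior precision of the weights
        self.r = np.zeros(D)     # precision-weighted posterior mean

    def _phi(self, x):
        return np.sqrt(2.0 / self.D) * np.cos(self.W @ x + self.b)

    def update(self, x, y):
        phi = self._phi(x)
        self.P += np.outer(phi, phi) / self.noise**2
        self.r += phi * y / self.noise**2

    def predict(self, x):
        phi = self._phi(x)
        mean_w = np.linalg.solve(self.P, self.r)
        var = phi @ np.linalg.solve(self.P, phi) + self.noise**2
        return phi @ mean_w, var
```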
arXiv Detail & Related papers (2021-10-13T15:11:25Z)
- Fast OSCAR and OWL Regression via Safe Screening Rules [97.28167655721766]
Ordered Weighted $L_1$ (OWL) regularized regression is a recent approach to high-dimensional sparse learning.
Proximal gradient methods are used as standard approaches to solve OWL regression.
We propose the first safe screening rule for OWL regression by exploring the order of the primal solution with the unknown order structure.
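For the OWL entry above, the proximal-gradient baseline is easy to state: the OWL/SLOPE proximal operator reduces to sorting the magnitudes followed by an isotonic projection. A minimal sketch, assuming a least-squares data term and nonincreasing, nonnegative weights, using scikit-learn's isotonic_regression; the safe screening rules themselves are not shown.

```python
import numpy as np
from sklearn.isotonic import isotonic_regression

def prox_owl(v, w):
    """Proximal operator of the OWL penalty sum_i w_i |v|_(i),
    with w nonincreasing and nonnegative."""
    order = np.argsort(np.abs(v))[::-1]          # |v| in decreasing order
    u = np.abs(v)[order]
    # project u - w onto the nonincreasing cone, then clip at zero
    z = np.maximum(isotonic_regression(u - w, increasing=False), 0.0)
    out = np.zeros_like(v)
    out[order] = z
    return np.sign(v) * out

def owl_regression(A, b, w, n_iter=500):
    """Proximal gradient for 0.5 * ||Ax - b||^2 + OWL_w(x)."""
    L = np.linalg.norm(A, 2) ** 2                # Lipschitz constant of the gradient
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        grad = A.T @ (A @ x - b)
        x = prox_owl(x - grad / L, w / L)        # step size 1/L scales the weights
    return x
```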
arXiv Detail & Related papers (2020-06-29T23:35:53Z)
- Nonnegativity-Enforced Gaussian Process Regression [0.0]
We propose an approach to enforce the physical constraints in a probabilistic way under the GP regression framework.
This new approach reduces the variance in the resulting GP model.
arXiv Detail & Related papers (2020-04-07T00:43:46Z)
- SLEIPNIR: Deterministic and Provably Accurate Feature Expansion for Gaussian Process Regression with Derivatives [86.01677297601624]
We propose a novel approach for scaling GP regression with derivatives based on quadrature Fourier features.
We prove deterministic, non-asymptotic and exponentially fast decaying error bounds which apply for both the approximated kernel as well as the approximated posterior.
arXiv Detail & Related papers (2020-03-05T14:33:20Z)
- Robust Gaussian Process Regression with a Bias Model [0.6850683267295248]
Most existing approaches replace the outlier-prone Gaussian likelihood with a non-Gaussian likelihood induced from a heavy-tailed distribution.
The proposed approach models an outlier as a noisy and biased observation of an unknown regression function.
Conditioned on the bias estimates, the robust GP regression can be reduced to a standard GP regression problem.
arXiv Detail & Related papers (2020-01-14T06:21:51Z)
This list is automatically generated from the titles and abstracts of the papers on this site.