Non-Gaussian Process Regression
- URL: http://arxiv.org/abs/2209.03117v1
- Date: Wed, 7 Sep 2022 13:08:22 GMT
- Title: Non-Gaussian Process Regression
- Authors: Yaman Kındap and Simon Godsill
- Abstract summary: We extend the GP framework into a new class of time-changed GPs that allow for straightforward modelling of heavy-tailed non-Gaussian behaviours.
We present Markov chain Monte Carlo inference procedures for this model and demonstrate the potential benefits.
- Score: 0.0
- License: http://creativecommons.org/licenses/by-nc-sa/4.0/
- Abstract: Standard GPs offer a flexible modelling tool for well-behaved processes.
However, deviations from Gaussianity are expected to appear in real world
datasets, with structural outliers and shocks routinely observed. In these
cases GPs can fail to model uncertainty adequately and may over-smooth
inferences. Here we extend the GP framework into a new class of time-changed
GPs that allow for straightforward modelling of heavy-tailed non-Gaussian
behaviours, while retaining a tractable conditional GP structure through an
infinite mixture of non-homogeneous GPs representation. The conditional GP
structure is obtained by conditioning the observations on a latent transformed
input space and the random evolution of the latent transformation is modelled
using a Lévy process which allows Bayesian inference in both the posterior
predictive density and the latent transformation function. We present Markov
chain Monte Carlo inference procedures for this model and demonstrate the
potential benefits compared to a standard GP.
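To make the construction concrete, the following is a minimal sketch of a time-changed GP, assuming a gamma subordinator as the non-decreasing Lévy process driving the latent transformation; the kernel, parameter values and function names are illustrative choices, not the paper's.

```python
import numpy as np

def rbf_kernel(s, t, lengthscale=1.0, variance=1.0):
    """Squared-exponential kernel on (possibly transformed) 1-D inputs."""
    d = s[:, None] - t[None, :]
    return variance * np.exp(-0.5 * (d / lengthscale) ** 2)

def gamma_subordinator(t, rate=2.0, rng=None):
    """Sample a gamma process at ordered times t: a non-decreasing Levy
    'time change' tau with independent gamma increments, E[tau(t)] = t."""
    rng = rng or np.random.default_rng()
    dt = np.diff(t, prepend=0.0)
    return np.cumsum(rng.gamma(shape=rate * dt, scale=1.0 / rate))

rng = np.random.default_rng(0)
t = np.linspace(0.0, 5.0, 200)

# Latent transformation of the input axis: conditionally on tau,
# the observations follow an ordinary (non-homogeneous) GP.
tau = gamma_subordinator(t, rng=rng)

# Conditional GP draw, evaluated on the transformed inputs.
K = rbf_kernel(tau, tau) + 1e-8 * np.eye(len(t))
f = rng.multivariate_normal(np.zeros(len(t)), K)
```

Marginalising over tau gives a non-Gaussian process with heavier-tailed behaviour, while everything conditional on tau stays GP-tractable; a full treatment would place MCMC over tau rather than fixing a single draw as above.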
Related papers
- von Mises Quasi-Processes for Bayesian Circular Regression [57.88921637944379]
We explore a family of expressive and interpretable distributions over circle-valued random functions.
The resulting probability model has connections with continuous spin models in statistical physics.
For posterior inference, we introduce a new Stratonovich-like augmentation that lends itself to fast Markov Chain Monte Carlo sampling.
arXiv Detail & Related papers (2024-06-19T01:57:21Z)
- Gaussian Process Regression with Soft Inequality and Monotonicity Constraints [0.0]
We introduce a new GP method that enforces the physical constraints in a probabilistic manner.
This GP model is trained with quantum-inspired Hamiltonian Monte Carlo (QHMC).
arXiv Detail & Related papers (2024-04-03T17:09:25Z)
- Non-Gaussian Gaussian Processes for Few-Shot Regression [71.33730039795921]
We propose an invertible ODE-based mapping that operates on each component of the random variable vectors and shares the parameters across all of them.
NGGPs outperform the competing state-of-the-art approaches on a diversified set of benchmarks and applications.
arXiv Detail & Related papers (2021-10-26T10:45:25Z)
- On the Double Descent of Random Features Models Trained with SGD [78.0918823643911]
We study properties of random features (RF) regression in high dimensions optimized by stochastic gradient descent (SGD).
We derive precise non-asymptotic error bounds of RF regression under both constant and adaptive step-size SGD settings.
We observe the double descent phenomenon both theoretically and empirically (a minimal random-features SGD sketch appears after this list).
arXiv Detail & Related papers (2021-10-13T17:47:39Z)
- Incremental Ensemble Gaussian Processes [53.3291389385672]
We propose an incremental ensemble (IE-) GP framework, where an EGP meta-learner employs an ensemble of GP learners, each having a unique kernel belonging to a prescribed kernel dictionary.
With each GP expert leveraging the random feature-based approximation to perform scalable online prediction and model updates, the EGP meta-learner capitalizes on data-adaptive weights to synthesize the per-expert predictions (see the ensemble-weighting sketch after this list).
The novel IE-GP is generalized to accommodate time-varying functions by modeling structured dynamics at the EGP meta-learner and within each GP learner.
arXiv Detail & Related papers (2021-10-13T15:11:25Z)
- On MCMC for variationally sparse Gaussian processes: A pseudo-marginal approach [0.76146285961466]
Gaussian processes (GPs) are frequently used in machine learning and statistics to construct powerful models.
We propose a pseudo-marginal (PM) scheme that offers exact inference as well as computational gains through doubly stochastic estimators for the likelihood on large datasets (see the pseudo-marginal sketch after this list).
arXiv Detail & Related papers (2021-03-04T20:48:29Z)
- Probabilistic Numeric Convolutional Neural Networks [80.42120128330411]
Continuous input signals like images and time series that are irregularly sampled or have missing values are challenging for existing deep learning methods.
We propose Probabilistic Numeric Convolutional Neural Networks, which represent features as Gaussian processes (GPs).
We then define a convolutional layer as the evolution of a PDE defined on this GP, followed by a nonlinearity.
In experiments we show that our approach yields a $3\times$ reduction of error from the previous state of the art on the SuperPixel-MNIST dataset and competitive performance on the PhysioNet2012 medical time series dataset.
arXiv Detail & Related papers (2020-10-21T10:08:21Z)
- Likelihood-Free Inference with Deep Gaussian Processes [70.74203794847344]
Surrogate models have been successfully used in likelihood-free inference to decrease the number of simulator evaluations.
We propose a Deep Gaussian Process (DGP) surrogate model that can handle more irregularly behaved target distributions.
Our experiments show how DGPs can outperform GPs on objective functions with multimodal distributions and maintain a comparable performance in unimodal cases.
arXiv Detail & Related papers (2020-06-18T14:24:05Z)
- Nonnegativity-Enforced Gaussian Process Regression [0.0]
We propose an approach to enforce the physical constraints in a probabilistic way under the GP regression framework.
This new approach reduces the variance in the resulting GP model.
arXiv Detail & Related papers (2020-04-07T00:43:46Z)
- Transport Gaussian Processes for Regression [0.22843885788439797]
We propose a methodology to construct stochastic processes, which include GPs, warped GPs, Student-t processes and several others.
Our approach is inspired by layer-based models, where each proposed layer changes a specific property of the generated process.
We validate the proposed model through experiments with real-world data (a layered warping sketch appears after this list).
arXiv Detail & Related papers (2020-01-30T17:44:21Z)
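For the double-descent entry above, a minimal random Fourier features regression trained with constant step-size SGD; the data, feature map and step size are illustrative assumptions, not the paper's setup.

```python
import numpy as np

rng = np.random.default_rng(1)

n, d, D = 100, 1, 500                  # samples, input dim, random features
X = rng.uniform(-3, 3, size=(n, d))
y = np.sin(2 * X[:, 0]) + 0.1 * rng.standard_normal(n)

# Random Fourier features approximating an RBF kernel (Rahimi-Recht).
W = rng.standard_normal((d, D))        # frequencies for a unit lengthscale
b = rng.uniform(0, 2 * np.pi, size=D)
Phi = np.sqrt(2.0 / D) * np.cos(X @ W + b)

# Constant step-size SGD on the squared error, one sample at a time.
theta = np.zeros(D)
step = 0.05
for _ in range(200):
    for i in rng.permutation(n):
        resid = Phi[i] @ theta - y[i]
        theta -= step * resid * Phi[i]
```

Sweeping D below and beyond n while tracking held-out error is the experiment that traces out the double-descent curve.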
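For the incremental-ensemble entry, a toy sketch of data-adaptive expert weighting: two fixed Gaussian predictors stand in for GP learners with different kernels, and log-weights accumulate each expert's predictive likelihood online. This is a generic Bayesian model-averaging step under our own toy assumptions, not the paper's exact IE-GP update.

```python
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical experts: (predictive mean, predictive std) stand-ins
# for GP learners built from different kernels.
experts = [(0.0, 0.5), (0.0, 2.0)]
log_w = np.zeros(len(experts))

def log_gauss(y, mu, sigma):
    return -0.5 * ((y - mu) / sigma) ** 2 - np.log(sigma * np.sqrt(2 * np.pi))

for _ in range(100):
    y_t = 0.4 * rng.standard_normal()          # streaming observation
    # Synthesize the ensemble prediction with the current weights...
    w = np.exp(log_w - log_w.max())
    w /= w.sum()
    y_hat = sum(wi * mu for wi, (mu, _) in zip(w, experts))
    # ...then reward each expert by its predictive likelihood of y_t.
    log_w += np.array([log_gauss(y_t, mu, s) for mu, s in experts])
```

Here the weight mass drifts toward the expert whose predictive scale matches the data-generating noise.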
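For the pseudo-marginal entry, a toy pseudo-marginal Metropolis-Hastings chain: an unbiased Monte Carlo estimate of an intractable likelihood replaces the exact one in the acceptance ratio, and reusing the stored estimate for the current state keeps the chain exact. The latent-variable model here is a deliberately simple assumption of ours.

```python
import numpy as np

rng = np.random.default_rng(2)
y = 1.3                                          # single observation

def loglik_estimate(theta, n_mc=32):
    """Unbiased importance-sampling estimate of p(y | theta) for the toy
    model y | theta, z ~ N(theta + z, 1) with latent z ~ N(0, 1)."""
    z = rng.standard_normal(n_mc)
    w = np.exp(-0.5 * (y - theta - z) ** 2) / np.sqrt(2 * np.pi)
    return np.log(np.mean(w))

theta, ll = 0.0, loglik_estimate(0.0)
samples = []
for _ in range(5000):
    prop = theta + 0.5 * rng.standard_normal()   # random-walk proposal
    ll_prop = loglik_estimate(prop)
    # Flat prior; crucially, the current state's *stored* noisy estimate is
    # reused, which is what makes the chain target the exact posterior.
    if np.log(rng.uniform()) < ll_prop - ll:
        theta, ll = prop, ll_prop
    samples.append(theta)
```

The same recycling trick underlies pseudo-marginal schemes for sparse GPs, with the importance sampler replaced by a doubly stochastic likelihood estimator.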
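For the transport-GP entry, a minimal layered construction: draw a latent GP, then push it through elementwise maps, each layer changing one property of the resulting process; the specific layers (an affine map and a sinh warping that fattens the tails) are illustrative choices of ours.

```python
import numpy as np

rng = np.random.default_rng(4)
t = np.linspace(0.0, 1.0, 150)

# Latent GP draw with a squared-exponential kernel.
K = np.exp(-0.5 * ((t[:, None] - t[None, :]) / 0.1) ** 2)
f = rng.multivariate_normal(np.zeros(len(t)), K + 1e-8 * np.eye(len(t)))

# Layered transport: each invertible map changes one marginal property.
layers = [
    lambda x: 2.0 * x + 0.5,  # affine layer: rescales and shifts
    np.sinh,                  # tail layer: inflates large values
]
g = f
for layer in layers:
    g = layer(g)              # g is now a draw from a non-Gaussian process
```

Choosing other monotone layers recovers familiar special cases, e.g. an identity stack gives back a plain GP and an exponential layer gives a log-GP.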
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.