Sample Path Regularity of Gaussian Processes from the Covariance Kernel
- URL: http://arxiv.org/abs/2312.14886v2
- Date: Fri, 16 Feb 2024 15:17:57 GMT
- Title: Sample Path Regularity of Gaussian Processes from the Covariance Kernel
- Authors: Nathaël Da Costa, Marvin Pförtner, Lancelot Da Costa, Philipp Hennig
- Abstract summary: Gaussian processes (GPs) are the most common formalism for defining probability distributions over spaces of functions.
We provide necessary and sufficient conditions on the covariance kernel for the sample paths of the corresponding GP to attain a given regularity.
Our results allow for novel and unusually tight characterisations of the sample path regularities of the GPs commonly used in machine learning applications.
- Score: 25.021782278452005
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Gaussian processes (GPs) are the most common formalism for defining
probability distributions over spaces of functions. While applications of GPs
are myriad, a comprehensive understanding of GP sample paths, i.e. the function
spaces over which they define a probability measure, is lacking. In practice,
GPs are not constructed through a probability measure, but instead through a
mean function and a covariance kernel. In this paper we provide necessary and
sufficient conditions on the covariance kernel for the sample paths of the
corresponding GP to attain a given regularity. We use the framework of Hölder
regularity as it grants particularly straightforward conditions, which simplify
further in the cases of stationary and isotropic GPs. We then demonstrate that
our results allow for novel and unusually tight characterisations of the sample
path regularities of the GPs commonly used in machine learning applications,
such as the Matérn GPs.
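For context, one classical source of such conditions is the Kolmogorov continuity theorem: if a stationary kernel k satisfies k(0) - k(h) <= C|h|^(2*alpha) near h = 0, the sample paths are locally Hölder continuous of every order strictly below alpha (this is only the sufficient half; the paper supplies matching necessary conditions). As a minimal sketch, assuming an off-the-shelf Matérn kernel from scikit-learn rather than anything specific to the paper, the exponent can be estimated empirically from sampled paths:

```python
# Minimal sketch (illustrative, not code from the paper): estimate the
# Hölder exponent of Matérn GP sample paths from mean-squared increments.
# For a stationary kernel with k(0) - k(h) ~ C|h|^(2*alpha) as h -> 0,
# E|X(t+h) - X(t)|^2 ~ 2C|h|^(2*alpha), so the log-log slope is 2*alpha.
import numpy as np
from sklearn.gaussian_process.kernels import Matern

nu = 0.5                                   # Matérn smoothness parameter
t = np.linspace(0.0, 1.0, 2001)[:, None]   # fine grid on [0, 1]
K = Matern(length_scale=0.2, nu=nu)(t)     # covariance matrix on the grid
chol = np.linalg.cholesky(K + 1e-8 * np.eye(len(t)))
paths = chol @ np.random.default_rng(0).standard_normal((len(t), 200))

dt = t[1, 0] - t[0, 0]
lags = np.array([1, 2, 4, 8, 16, 32])
msq = [np.mean((paths[lag:] - paths[:-lag]) ** 2) for lag in lags]
slope = np.polyfit(np.log(lags * dt), np.log(msq), 1)[0]
print(f"estimated Hölder exponent ~ {slope / 2:.2f}; theory: {min(nu, 1.0)}")
```

For nu = 0.5 the estimate should land near 0.5, and varying nu shows the exponent tracking min(nu, 1), consistent with the tight Matérn characterisations the abstract refers to.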
Related papers
- Stationarity without mean reversion in improper Gaussian processes [6.4322891559626125]
We show that it is possible to use improper GP priors with infinite variance to define processes that are stationary but not mean reverting.
By analyzing both synthetic and real data, we demonstrate that these non-positive kernels solve some known pathologies of mean reverting GP regression.
arXiv Detail & Related papers (2023-10-04T15:11:26Z)
- Non-Gaussian Process Regression [0.0]
We extend the GP framework into a new class of time-changed GPs that allow for straightforward modelling of heavy-tailed non-Gaussian behaviours.
We present Markov chain Monte Carlo inference procedures for this model and demonstrate the potential benefits.
arXiv Detail & Related papers (2022-09-07T13:08:22Z)
- Shallow and Deep Nonparametric Convolutions for Gaussian Processes [0.0]
We introduce a nonparametric process convolution formulation for GPs that alleviates weaknesses by using a functional sampling approach.
We propose a composition of these nonparametric convolutions that serves as an alternative to classic deep GP models.
arXiv Detail & Related papers (2022-06-17T19:03:04Z)
- Surrogate modeling for Bayesian optimization beyond a single Gaussian process [62.294228304646516]
We propose a novel Bayesian surrogate model to balance exploration with exploitation of the search space.
To endow function sampling with scalability, random feature-based kernel approximation is leveraged per GP model.
To further establish convergence of the proposed EGP-TS to the global optimum, analysis is conducted based on the notion of Bayesian regret.
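The random-feature approximation mentioned above is, in its generic form (Rahimi and Recht's random Fourier features; the paper's per-model details are not reproduced here), a finite-dimensional feature map whose inner products approximate the kernel. A minimal sketch for an RBF kernel, with all sizes chosen arbitrarily:

```python
# Generic random Fourier features for the RBF kernel; illustrative only.
# z(x) . z(y) approximates exp(-||x - y||^2 / (2 * lengthscale^2)).
import numpy as np

def rff(X, n_features=500, lengthscale=1.0, seed=0):
    rng = np.random.default_rng(seed)
    W = rng.standard_normal((X.shape[1], n_features)) / lengthscale
    b = rng.uniform(0.0, 2.0 * np.pi, n_features)
    return np.sqrt(2.0 / n_features) * np.cos(X @ W + b)

X = np.random.default_rng(1).standard_normal((5, 2))
Z = rff(X)
K_exact = np.exp(-0.5 * ((X[:, None] - X[None]) ** 2).sum(-1))
print(np.abs(Z @ Z.T - K_exact).max())  # error shrinks as n_features grows
```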
arXiv Detail & Related papers (2022-05-27T16:43:10Z)
- Scaling Gaussian Process Optimization by Evaluating a Few Unique Candidates Multiple Times [119.41129787351092]
We show that sequential black-box optimization based on GPs can be made efficient by sticking to a candidate solution for multiple evaluation steps.
We modify two well-established GP-Opt algorithms, GP-UCB and GP-EI, to adapt rules from batched GP-Opt.
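For context, the standard single-candidate GP-UCB step that these algorithms build on picks the maximiser of posterior mean plus scaled standard deviation; the batched adaptation studied in the paper is not reproduced here. A toy sketch, with the objective and the exploration weight chosen arbitrarily:

```python
# Generic GP-UCB loop on a toy 1-D objective; illustrative assumptions
# throughout (objective f, grid, beta), not the paper's batched variant.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor

def f(x):
    return -np.sin(3.0 * x) - x ** 2 + 0.7 * x

rng = np.random.default_rng(0)
X = rng.uniform(-1.0, 2.0, (3, 1))       # a few random initial evaluations
y = f(X[:, 0])
grid = np.linspace(-1.0, 2.0, 400)[:, None]
for _ in range(10):
    gp = GaussianProcessRegressor(alpha=1e-6).fit(X, y)
    mu, sd = gp.predict(grid, return_std=True)
    x_next = grid[np.argmax(mu + 2.0 * sd)]  # UCB rule, sqrt(beta) = 2
    X = np.vstack([X, [x_next]])
    y = np.append(y, f(x_next[0]))
print("best observed value:", y.max())
```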
arXiv Detail & Related papers (2022-01-30T20:42:14Z)
- Non-Gaussian Gaussian Processes for Few-Shot Regression [71.33730039795921]
We propose an invertible ODE-based mapping that operates on each component of the random variable vectors and shares the parameters across all of them.
NGGPs outperform the competing state-of-the-art approaches on a diversified set of benchmarks and applications.
arXiv Detail & Related papers (2021-10-26T10:45:25Z)
- Incremental Ensemble Gaussian Processes [53.3291389385672]
We propose an incremental ensemble (IE-) GP framework, where an EGP meta-learner employs an ensemble of GP learners, each having a unique kernel belonging to a prescribed kernel dictionary.
With each GP expert leveraging a random feature-based approximation to perform scalable online prediction and model updates, the EGP meta-learner capitalizes on data-adaptive weights to synthesize the per-expert predictions.
The novel IE-GP is generalized to accommodate time-varying functions by modeling structured dynamics at the EGP meta-learner and within each GP learner.
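Data-adaptive weighting of this kind can be pictured with a generic Bayesian-model-averaging stand-in: each expert's log-weight accumulates its one-step-ahead predictive log-likelihood. This is a hedged sketch under assumed kernels and toy data, not the paper's IE-GP update (in particular it refits each expert from scratch rather than updating online):

```python
# Illustrative ensemble-of-GP-experts weighting, not the paper's IE-GP:
# each expert's log-weight accumulates its one-step predictive log-density.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, Matern

rng = np.random.default_rng(0)
X = rng.uniform(-3.0, 3.0, (40, 1))
y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(40)

experts = [GaussianProcessRegressor(kernel=k, alpha=1e-2)
           for k in (RBF(1.0), Matern(nu=1.5))]     # assumed dictionary
log_w = np.zeros(len(experts))
for n in range(5, len(X)):                          # stream the data
    for i, gp in enumerate(experts):
        gp.fit(X[:n], y[:n])                        # refit (not online)
        mu, sd = gp.predict(X[n:n + 1], return_std=True)
        log_w[i] += -0.5 * ((y[n] - mu[0]) / sd[0]) ** 2 - np.log(sd[0])
w = np.exp(log_w - log_w.max())
print("expert weights:", w / w.sum())
```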
arXiv Detail & Related papers (2021-10-13T15:11:25Z)
- Likelihood-Free Inference with Deep Gaussian Processes [70.74203794847344]
Surrogate models have been successfully used in likelihood-free inference to decrease the number of simulator evaluations.
We propose a Deep Gaussian Process (DGP) surrogate model that can handle more irregularly behaved target distributions.
Our experiments show how DGPs can outperform GPs on objective functions with multimodal distributions and maintain a comparable performance in unimodal cases.
arXiv Detail & Related papers (2020-06-18T14:24:05Z)
- Uncertainty quantification using martingales for misspecified Gaussian processes [52.22233158357913]
We address uncertainty quantification for Gaussian processes (GPs) under misspecified priors.
We construct a confidence sequence (CS) for the unknown function using martingale techniques.
Our CS is statistically valid and empirically outperforms standard GP methods.
arXiv Detail & Related papers (2020-06-12T17:58:59Z)
- Sequential Gaussian Processes for Online Learning of Nonstationary Functions [9.997259201098602]
We propose a sequential Monte Carlo algorithm to fit infinite mixtures of GPs that capture non-stationary behavior while allowing for online, distributed inference.
Our approach empirically improves performance over state-of-the-art methods for online GP estimation in the presence of non-stationarity in time-series data.
arXiv Detail & Related papers (2019-05-24T02:29:49Z)