An Intuitive Tutorial to Gaussian Process Regression
- URL: http://arxiv.org/abs/2009.10862v5
- Date: Sun, 28 Jan 2024 03:24:49 GMT
- Title: An Intuitive Tutorial to Gaussian Process Regression
- Authors: Jie Wang
- Abstract summary: This tutorial aims to provide an intuitive introduction to Gaussian process regression (GPR)
GPR models have been widely used in machine learning applications due to their representation flexibility and inherent capability to quantify uncertainty over predictions.
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: This tutorial aims to provide an intuitive introduction to Gaussian process
regression (GPR). GPR models have been widely used in machine learning
applications due to their representation flexibility and inherent capability to
quantify uncertainty over predictions. The tutorial starts with explaining the
basic concepts that a Gaussian process is built on, including multivariate
normal distribution, kernels, non-parametric models, and joint and conditional
probability. It then provides a concise description of GPR and an
implementation of a standard GPR algorithm. In addition, the tutorial reviews
packages for implementing state-of-the-art Gaussian process algorithms. This
tutorial is accessible to a broad audience, including those new to machine
learning, ensuring a clear understanding of GPR fundamentals.
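The abstract mentions an implementation of a standard GPR algorithm. A minimal sketch of the textbook posterior computation (zero-mean prior, RBF kernel, Cholesky-based solve) might look as follows; the function names, hyperparameters, and toy data here are illustrative assumptions, not code from the tutorial itself.

```python
import numpy as np

def rbf_kernel(X1, X2, length_scale=1.0, variance=1.0):
    """Squared-exponential (RBF) kernel matrix between two sets of points."""
    sq_dists = (np.sum(X1**2, 1)[:, None]
                + np.sum(X2**2, 1)[None, :]
                - 2.0 * X1 @ X2.T)
    return variance * np.exp(-0.5 * sq_dists / length_scale**2)

def gpr_predict(X_train, y_train, X_test, noise=1e-2):
    """Posterior mean and pointwise variance of a zero-mean GP."""
    K = rbf_kernel(X_train, X_train) + noise**2 * np.eye(len(X_train))
    K_s = rbf_kernel(X_train, X_test)
    K_ss = rbf_kernel(X_test, X_test)
    # Cholesky factorization instead of a direct inverse, for stability
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y_train))
    mean = K_s.T @ alpha                  # posterior mean at test points
    v = np.linalg.solve(L, K_s)
    cov = K_ss - v.T @ v                  # posterior covariance
    return mean, np.diag(cov)

# Toy example: noisy observations of sin(x)
rng = np.random.default_rng(0)
X_train = np.linspace(0, 5, 8).reshape(-1, 1)
y_train = np.sin(X_train).ravel() + 0.01 * rng.standard_normal(8)
X_test = np.linspace(0, 5, 50).reshape(-1, 1)
mean, var = gpr_predict(X_train, y_train, X_test)
```

The predictive variance shrinks near the training inputs and grows away from them, which is the uncertainty-quantification behavior the abstract highlights.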
Related papers
- PixelGaussian: Generalizable 3D Gaussian Reconstruction from Arbitrary Views [116.10577967146762]
PixelGaussian is an efficient framework for learning generalizable 3D Gaussian reconstruction from arbitrary views.
Our method achieves state-of-the-art performance with good generalization to various numbers of views.
arXiv Detail & Related papers (2024-10-24T17:59:58Z)
- Explainable Learning with Gaussian Processes [23.796560256071473]
We take a principled approach to defining attributions under model uncertainty, extending the existing literature.
We show that although GPR is a highly flexible and non-parametric approach, we can derive interpretable, closed-form expressions for the feature attributions.
We also show that, when applicable, the exact expressions for GPR attributions are both more accurate and less computationally expensive than the approximations currently used in practice.
arXiv Detail & Related papers (2024-03-11T18:03:02Z)
- Linear-scaling kernels for protein sequences and small molecules outperform deep learning while providing uncertainty quantitation and improved interpretability [5.623232537411766]
We develop efficient and scalable approaches for fitting GP models and fast convolution kernels.
We implement these improvements by building an open-source Python library called xGPR.
We show that xGPR generally outperforms convolutional neural networks on predicting key properties of proteins and small molecules.
arXiv Detail & Related papers (2023-02-07T07:06:02Z)
- Gaussian Kernel Variance For an Adaptive Learning Method on Signals Over Graphs [10.028519427235326]
Single-Kernel Gradraker (SKG) is an adaptive learning method predicting unknown nodal values in a network.
We focus on SKG with a Gaussian kernel and specify how to find a suitable variance for the kernel.
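The blurb above concerns choosing a suitable variance for a Gaussian (RBF) kernel. One widely used rule of thumb for this, unrelated to the cited paper's specific method, is the median heuristic: set the squared bandwidth to the median squared pairwise distance of the data. A minimal sketch, with an illustrative function name:

```python
import numpy as np

def median_heuristic_variance(X):
    """Median of squared pairwise Euclidean distances between
    distinct data points, used as the RBF kernel's squared bandwidth."""
    sq_dists = np.sum((X[:, None, :] - X[None, :, :])**2, axis=-1)
    # keep only off-diagonal entries (distances between distinct points)
    off_diag = sq_dists[~np.eye(len(X), dtype=bool)]
    return np.median(off_diag)

X = np.random.default_rng(1).standard_normal((100, 2))
sigma2 = median_heuristic_variance(X)
```

This heuristic adapts the kernel width to the scale of the data without any model fitting; methods such as the one in the cited paper instead derive a variance tailored to the learning task.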
arXiv Detail & Related papers (2022-04-26T23:15:03Z)
- Inducing Gaussian Process Networks [80.40892394020797]
We propose inducing Gaussian process networks (IGN), a simple framework for simultaneously learning the feature space as well as the inducing points.
The inducing points, in particular, are learned directly in the feature space, enabling a seamless representation of complex structured domains.
We report on experimental results for real-world data sets showing that IGNs provide significant advances over state-of-the-art methods.
arXiv Detail & Related papers (2022-04-21T05:27:09Z)
- Gaussian Processes and Statistical Decision-making in Non-Euclidean Spaces [96.53463532832939]
We develop techniques for broadening the applicability of Gaussian processes.
We introduce a wide class of efficient approximations built from this viewpoint.
We develop a collection of Gaussian process models over non-Euclidean spaces.
arXiv Detail & Related papers (2022-02-22T01:42:57Z)
- Non-Gaussian Gaussian Processes for Few-Shot Regression [71.33730039795921]
We propose an invertible ODE-based mapping that operates on each component of the random variable vectors and shares the parameters across all of them.
NGGPs outperform the competing state-of-the-art approaches on a diversified set of benchmarks and applications.
arXiv Detail & Related papers (2021-10-26T10:45:25Z)
- Deep Gaussian Processes for Few-Shot Segmentation [66.08463078545306]
Few-shot segmentation is a challenging task, requiring the extraction of a generalizable representation from only a few annotated samples.
We propose a few-shot learner formulation based on Gaussian process (GP) regression.
Our approach sets a new state-of-the-art for 5-shot segmentation, with mIoU scores of 68.1 and 49.8 on PASCAL-5i and COCO-20i, respectively.
arXiv Detail & Related papers (2021-03-30T17:56:32Z)
- Matérn Gaussian Processes on Graphs [67.13902825728718]
We leverage the partial differential equation characterization of Matérn Gaussian processes to study their analog for undirected graphs.
We show that the resulting Gaussian processes inherit various attractive properties of their Euclidean analogs.
This enables graph Matérn Gaussian processes to be employed in mini-batch and non-conjugate settings.
arXiv Detail & Related papers (2020-10-29T13:08:07Z)
- Gaussian Process Regression with Local Explanation [28.90948136731314]
We propose GPR with local explanation, which reveals the feature contributions to the prediction of each sample.
In the proposed model, both the prediction and explanation for each sample are performed using an easy-to-interpret locally linear model.
For a new test sample, the proposed model can predict the values of its target variable and weight vector, as well as their uncertainties.
arXiv Detail & Related papers (2020-07-03T13:22:24Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of this information and is not responsible for any consequences of its use.