Randomly Projected Additive Gaussian Processes for Regression
- URL: http://arxiv.org/abs/1912.12834v1
- Date: Mon, 30 Dec 2019 07:26:18 GMT
- Title: Randomly Projected Additive Gaussian Processes for Regression
- Authors: Ian A. Delbridge, David S. Bindel, Andrew Gordon Wilson
- Abstract summary: We use additive sums of kernels for GP regression, where each kernel operates on a different random projection of its inputs.
We prove this convergence and its rate, and propose a deterministic approach that converges more quickly than purely random projections.
- Score: 37.367935314532154
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Gaussian processes (GPs) provide flexible distributions over functions, with
inductive biases controlled by a kernel. However, in many applications Gaussian
processes can struggle with even moderate input dimensionality. Learning a
low-dimensional projection can help alleviate this curse of dimensionality, but
introduces many trainable hyperparameters, which can be cumbersome, especially
in the small data regime. We use additive sums of kernels for GP regression,
where each kernel operates on a different random projection of its inputs.
Surprisingly, we find that as the number of random projections increases, the
predictive performance of this approach quickly converges to the performance of
a kernel operating on the original full-dimensional inputs, over a wide range
of data sets, even if we are projecting into a single dimension. As a
consequence, many problems can, remarkably, be reduced to one-dimensional input
spaces without learning a transformation. We prove this convergence and its
rate, and additionally propose a deterministic approach that converges more
quickly than purely random projections. Moreover, we demonstrate that our
approach can achieve faster inference and improved predictive accuracy for
high-dimensional inputs compared to kernels in the original input space.
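The construction described in the abstract is simple to sketch: the kernel is an average of base kernels, each applied to a different random projection of the inputs, k(x, x') = (1/J) Σ_j k_RBF(P_j x, P_j x'). Below is a minimal NumPy illustration of this kernel and the resulting GP posterior mean. It is a sketch, not the authors' implementation: the projection count J, target dimension d, lengthscale, noise level, and toy data are all illustrative assumptions, and the paper's deterministic projections and scalable inference machinery are omitted.

```python
# Minimal sketch (not the authors' code) of a randomly projected additive
# GP kernel: k(x, x') = (1/J) * sum_j k_rbf(P_j x, P_j x'),
# where each P_j maps D-dimensional inputs down to d dimensions.
import numpy as np

def rbf_kernel(X1, X2, lengthscale=1.0):
    """Standard RBF kernel matrix between the rows of X1 and X2."""
    sq = np.sum(X1**2, 1)[:, None] + np.sum(X2**2, 1)[None, :] - 2.0 * X1 @ X2.T
    return np.exp(-0.5 * sq / lengthscale**2)

def projected_additive_kernel(X1, X2, projections, lengthscale=1.0):
    """Average of RBF kernels, each acting on a different projection of the inputs."""
    K = np.zeros((X1.shape[0], X2.shape[0]))
    for P in projections:
        K += rbf_kernel(X1 @ P.T, X2 @ P.T, lengthscale)
    return K / len(projections)

rng = np.random.default_rng(0)
D, d, J = 20, 1, 10  # input dim; projection dim (even d = 1); number of projections
projections = [rng.standard_normal((d, D)) / np.sqrt(D) for _ in range(J)]

# Toy regression data and the standard GP posterior mean under this kernel.
X_train = rng.standard_normal((50, D))
y_train = np.sin(X_train @ rng.standard_normal(D))
X_test = rng.standard_normal((5, D))

noise = 1e-2  # observation noise variance (illustrative)
K = projected_additive_kernel(X_train, X_train, projections)
K_star = projected_additive_kernel(X_test, X_train, projections)
pred_mean = K_star @ np.linalg.solve(K + noise * np.eye(len(X_train)), y_train)
```

Per the abstract's convergence result, increasing J drives predictions with this additive kernel toward those of a kernel on the full-dimensional inputs, even with d = 1 as above.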
Related papers
- Gaussian Processes Sampling with Sparse Grids under Additive Schwarz Preconditioner [6.408773096179187]
We propose a scalable algorithm for sampling random realizations of the prior and posterior of GP models.
The proposed algorithm leverages inducing points approximation with sparse grids, as well as additive Schwarz preconditioners.
arXiv Detail & Related papers (2024-08-01T00:19:36Z)
- Score-based Diffusion Models in Function Space [140.792362459734]
Diffusion models have recently emerged as a powerful framework for generative modeling.
We introduce a mathematically rigorous framework called Denoising Diffusion Operators (DDOs) for training diffusion models in function space.
We show that the corresponding discretized algorithm generates accurate samples at a fixed cost independent of the data resolution.
arXiv Detail & Related papers (2023-02-14T23:50:53Z)
- FaDIn: Fast Discretized Inference for Hawkes Processes with General Parametric Kernels [82.53569355337586]
This work offers an efficient solution to temporal point processes inference using general parametric kernels with finite support.
The method's effectiveness is evaluated by modeling the occurrence of stimuli-induced patterns from brain signals recorded with magnetoencephalography (MEG).
Results show that the proposed approach yields better estimates of pattern latency than the state-of-the-art.
arXiv Detail & Related papers (2022-10-10T12:35:02Z)
- Bézier Gaussian Processes for Tall and Wide Data [24.00638575411818]
We introduce a kernel that allows the number of summarising variables to grow exponentially with the number of input features.
We show that our kernel has close similarities to some of the most used kernels in Gaussian process regression.
arXiv Detail & Related papers (2022-09-01T10:22:14Z)
- Gaussian Processes and Statistical Decision-making in Non-Euclidean Spaces [96.53463532832939]
We develop techniques for broadening the applicability of Gaussian processes.
We introduce a wide class of efficient approximations built from this viewpoint.
We develop a collection of Gaussian process models over non-Euclidean spaces.
arXiv Detail & Related papers (2022-02-22T01:42:57Z)
- Scalable Variational Gaussian Processes via Harmonic Kernel Decomposition [54.07797071198249]
We introduce a new scalable variational Gaussian process approximation which provides a high fidelity approximation while retaining general applicability.
We demonstrate that, on a range of regression and classification problems, our approach can exploit input space symmetries such as translations and reflections.
Notably, our approach achieves state-of-the-art results on CIFAR-10 among pure GP models.
arXiv Detail & Related papers (2021-06-10T18:17:57Z)
- Factorized Gaussian Process Variational Autoencoders [6.866104126509981]
Variational autoencoders often assume isotropic Gaussian priors and mean-field posteriors, and hence do not exploit structure in scenarios where we may expect similarity or consistency across latent variables.
We propose a more scalable extension of these models by leveraging the independence of the auxiliary features, which is present in many datasets.
arXiv Detail & Related papers (2020-11-14T10:24:10Z)
- Sparse Spectrum Warped Input Measures for Nonstationary Kernel Learning [29.221457769884648]
We propose a general form of explicit, input-dependent, measure-valued warpings for learning nonstationary kernels.
The proposed learning algorithm warps inputs as conditional Gaussian measures that control the smoothness of a standard stationary kernel.
We demonstrate a remarkable efficiency in the number of parameters of the warping functions in learning problems with both small and large data regimes.
arXiv Detail & Related papers (2020-10-09T01:10:08Z)
- Gaussianization Flows [113.79542218282282]
We propose a new type of normalizing flow model that enables both efficient computation of likelihoods and efficient inversion for sample generation.
These flows are provably expressive; because of this guaranteed expressivity, they can capture multimodal target distributions without compromising the efficiency of sample generation.
arXiv Detail & Related papers (2020-03-04T08:15:06Z)
This list is automatically generated from the titles and abstracts of the papers on this site.