Pointwise uncertainty quantification for sparse variational Gaussian
process regression with a Brownian motion prior
- URL: http://arxiv.org/abs/2310.00097v3
- Date: Tue, 31 Oct 2023 16:54:41 GMT
- Authors: Luke Travis, Kolyan Ray
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: We study pointwise estimation and uncertainty quantification for a sparse
variational Gaussian process method with eigenvector inducing variables. For a
rescaled Brownian motion prior, we derive theoretical guarantees and
limitations for the frequentist size and coverage of pointwise credible sets.
For sufficiently many inducing variables, we precisely characterize the
asymptotic frequentist coverage, deducing when credible sets from this
variational method are conservative and when overconfident/misleading. We
numerically illustrate the applicability of our results and discuss connections
with other common Gaussian process priors.
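To make the setting concrete, the sketch below fits a Gaussian process regression model with a Brownian motion covariance k(s, t) = min(s, t) and forms pointwise 95% credible intervals from the posterior mean and variance. This is a minimal exact-GP illustration of the objects the paper studies, not the paper's sparse variational method with eigenvector inducing variables; all names and parameter values are illustrative.

```python
import numpy as np

def brownian_kernel(s, t, sigma2=1.0):
    # Covariance of (rescaled) Brownian motion: k(s, t) = sigma2 * min(s, t)
    return sigma2 * np.minimum(s[:, None], t[None, :])

rng = np.random.default_rng(0)
n = 200
x = np.sort(rng.uniform(0, 1, n))
noise = 0.1
y = np.sin(2 * np.pi * x) + noise * rng.normal(size=n)

# Posterior under a GP prior with Gaussian observation noise
K = brownian_kernel(x, x)
A = K + noise**2 * np.eye(n)          # Gram matrix plus noise variance
alpha = np.linalg.solve(A, y)

# Pointwise posterior mean and variance at test locations
xs = np.linspace(0.01, 0.99, 50)
Ks = brownian_kernel(xs, x)           # cross-covariance test vs. train
mean = Ks @ alpha
Kss = brownian_kernel(xs, xs)
var = np.maximum(np.diag(Kss - Ks @ np.linalg.solve(A, Ks.T)), 0.0)

# Pointwise 95% credible interval: mean +/- 1.96 * posterior sd
lower = mean - 1.96 * np.sqrt(var)
upper = mean + 1.96 * np.sqrt(var)
```

The frequentist coverage question the abstract raises is whether intervals like `[lower, upper]`, computed from the (here exact, in the paper variational) posterior, contain the true function value at each point with roughly the nominal probability.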
Related papers
- SoftCVI: contrastive variational inference with self-generated soft labels [2.5398014196797614]
We introduce Soft Contrastive Variational Inference (SoftCVI), which allows a family of variational objectives to be derived through a contrastive estimation framework.
SoftCVI objectives often outperform other commonly used variational objectives.
arXiv Detail & Related papers (2024-07-22T14:54:12Z)
- Likelihood Ratio Confidence Sets for Sequential Decision Making [51.66638486226482]
We revisit the likelihood-based inference principle and propose to use likelihood ratios to construct valid confidence sequences.
Our method is especially suitable for problems with well-specified likelihoods.
We show how to provably choose the best sequence of estimators and shed light on connections to online convex optimization.
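In the simplest well-specified case, a likelihood-ratio confidence set can be computed in closed form. The sketch below builds the 95% likelihood-ratio confidence set for the mean of a Gaussian with known variance, where the set { mu : 2 log[L(x̄)/L(mu)] <= chi²₁(0.95) } reduces to an interval around the sample mean; it is a textbook illustration of the principle, not the paper's sequential construction.

```python
import numpy as np

rng = np.random.default_rng(2)
mu_true, sigma, n = 1.5, 1.0, 400
x = rng.normal(mu_true, sigma, size=n)
xbar = x.mean()

# Likelihood ratio statistic for N(mu, sigma^2) with known sigma:
#   2 * log[L(xbar) / L(mu)] = n * (xbar - mu)^2 / sigma^2
# The 95% confidence set { mu : statistic <= 3.841 } is the interval
CHI2_95_1DOF = 3.841                      # 95% quantile of chi-squared, 1 dof
half_width = sigma * np.sqrt(CHI2_95_1DOF / n)
lower, upper = xbar - half_width, xbar + half_width
```

Since sqrt(3.841) is approximately 1.96, this recovers the familiar z-interval x̄ ± 1.96·σ/√n.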
arXiv Detail & Related papers (2023-11-08T00:10:21Z)
- Robust scalable initialization for Bayesian variational inference with multi-modal Laplace approximations [0.0]
Variational mixtures with full-covariance structures suffer from quadratic growth in the number of variational parameters as the number of model parameters increases.
We propose a method for constructing an initial Gaussian model approximation that can be used to warm-start variational inference.
arXiv Detail & Related papers (2023-07-12T19:30:04Z)
- Provable convergence guarantees for black-box variational inference [19.421222110188605]
Black-box variational inference is widely used in situations where there is no proof that its optimization succeeds.
We provide rigorous guarantees that methods similar to those used in practice converge on realistic inference problems.
arXiv Detail & Related papers (2023-06-04T11:31:41Z)
- Sharp Calibrated Gaussian Processes [58.94710279601622]
State-of-the-art approaches for designing calibrated models rely on inflating the Gaussian process posterior variance.
We present a calibration approach that generates predictive quantiles using a computation inspired by the vanilla Gaussian process posterior variance.
Our approach is shown to yield a calibrated model under reasonable assumptions.
arXiv Detail & Related papers (2023-02-23T12:17:36Z)
- Instance-Dependent Generalization Bounds via Optimal Transport [51.71650746285469]
Existing generalization bounds fail to explain crucial factors that drive the generalization of modern neural networks.
We derive instance-dependent generalization bounds that depend on the local Lipschitz regularity of the learned prediction function in the data space.
We empirically analyze our generalization bounds for neural networks, showing that the bound values are meaningful and capture the effect of popular regularization methods during training.
arXiv Detail & Related papers (2022-11-02T16:39:42Z)
- Scalable Variational Gaussian Processes via Harmonic Kernel Decomposition [54.07797071198249]
We introduce a new scalable variational Gaussian process approximation which provides a high fidelity approximation while retaining general applicability.
We demonstrate that, on a range of regression and classification problems, our approach can exploit input space symmetries such as translations and reflections.
Notably, our approach achieves state-of-the-art results on CIFAR-10 among pure GP models.
arXiv Detail & Related papers (2021-06-10T18:17:57Z)
- Sigma-Delta and Distributed Noise-Shaping Quantization Methods for Random Fourier Features [73.25551965751603]
We prove that our quantized RFFs allow a high accuracy approximation of the underlying kernels.
We show that the quantized RFFs can be further compressed, yielding an excellent trade-off between memory use and accuracy.
Testing on several machine learning tasks, we empirically show that our method compares favorably to other state-of-the-art quantization methods in this context.
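The quantized features above build on the standard (unquantized) random Fourier feature construction of Rahimi and Rechet, in which z(x) = sqrt(2/D)·cos(Wx + b) gives an unbiased Monte Carlo approximation of a shift-invariant kernel. The sketch below shows that baseline approximation for the RBF kernel; it is generic background, not the paper's Sigma-Delta or noise-shaping quantization scheme.

```python
import numpy as np

rng = np.random.default_rng(1)
d, D = 3, 2000            # input dimension, number of random features

# RFF map for the RBF kernel k(x, y) = exp(-||x - y||^2 / 2):
# z(x) = sqrt(2/D) * cos(W x + b), with W ~ N(0, I), b ~ Uniform[0, 2*pi]
W = rng.normal(size=(D, d))
b = rng.uniform(0, 2 * np.pi, size=D)

def features(X):
    return np.sqrt(2.0 / D) * np.cos(X @ W.T + b)

X = rng.normal(size=(5, d))
K_exact = np.exp(-0.5 * ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1))
Z = features(X)
K_approx = Z @ Z.T                     # inner products of features approximate k
max_err = np.abs(K_exact - K_approx).max()
```

The quantization methods in the paper replace the real-valued entries of `Z` with low-bit representations while controlling the resulting extra kernel-approximation error.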
arXiv Detail & Related papers (2021-06-04T17:24:47Z)
- Moment-Based Variational Inference for Stochastic Differential Equations [31.494103873662343]
We construct the variational process as a controlled version of the prior process.
We approximate the posterior by a set of moment functions.
In combination with moment closure, the smoothing problem is reduced to a deterministic optimal control problem.
arXiv Detail & Related papers (2021-03-01T13:20:38Z)
- The Variational Method of Moments [65.91730154730905]
The conditional moment problem is a powerful formulation for describing structural causal parameters in terms of observables.
Motivated by a variational minimax reformulation of OWGMM, we define a very general class of estimators for the conditional moment problem.
We provide algorithms for valid statistical inference based on the same kind of variational reformulations.
arXiv Detail & Related papers (2020-12-17T07:21:06Z)
This list is automatically generated from the titles and abstracts of the papers on this site.