Wiener Chaos in Kernel Regression: Towards Untangling Aleatoric and
Epistemic Uncertainty
- URL: http://arxiv.org/abs/2312.07387v1
- Date: Tue, 12 Dec 2023 16:02:35 GMT
- Title: Wiener Chaos in Kernel Regression: Towards Untangling Aleatoric and
Epistemic Uncertainty
- Authors: T. Faulwasser, O. Molodchyk
- Abstract summary: We consider kernel ridge regression with additive i.i.d. non-Gaussian measurement noise.
Considering a polynomial system as a numerical example, we show that our approach allows us to untangle the effects of epistemic and aleatoric uncertainties.
- Score: 0.0
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Gaussian Processes (GPs) are a versatile method that enables different
approaches towards learning for dynamics and control. Gaussianity assumptions
appear in two dimensions in GPs: The positive semi-definite kernel of the
underlying reproducing kernel Hilbert space is used to construct the
covariance of a Gaussian distribution over functions, while measurement noise
(i.e. data corruption) is usually modeled as i.i.d. additive Gaussian. In this
note, we relax the latter Gaussianity assumption, i.e., we consider kernel
ridge regression with additive i.i.d. non-Gaussian measurement noise. To apply
the usual kernel trick, we rely on the representation of the uncertainty via
polynomial chaos expansions, which are series expansions for random variables
of finite variance introduced by Norbert Wiener. We derive and discuss the
analytic $\mathcal{L}^2$ solution to the arising Wiener kernel regression.
Considering a polynomial system as a numerical example, we show that our
approach allows us to untangle the effects of epistemic and aleatoric uncertainties.
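The following minimal sketch, not taken from the paper, illustrates the two ingredients named in the abstract: kernel ridge regression under additive i.i.d. non-Gaussian (here uniform) noise, and a Hermite polynomial chaos expansion of that noise variable. The kernel, noise law, lengthscale `ell`, regularization `lam`, and truncation order are all illustrative assumptions.

```python
# Minimal sketch (not the paper's code): kernel ridge regression with
# additive i.i.d. uniform noise, plus a Hermite polynomial chaos expansion
# (PCE) of the noise variable. All constants here are illustrative.
import numpy as np
from numpy.polynomial.hermite_e import hermeval
from scipy.stats import norm
from math import factorial

rng = np.random.default_rng(0)

def rbf_kernel(a, b, ell=0.5):
    return np.exp(-(a[:, None] - b[None, :]) ** 2 / (2 * ell**2))

# --- kernel ridge regression under non-Gaussian measurement noise ---
n, lam = 50, 1e-2
x = rng.uniform(-1, 1, n)
eps = rng.uniform(-0.3, 0.3, n)              # i.i.d. non-Gaussian noise
y = np.sin(3 * x) + eps

alpha = np.linalg.solve(rbf_kernel(x, x) + lam * n * np.eye(n), y)
x_test = np.linspace(-1, 1, 5)
print("KRR predictions:", rbf_kernel(x_test, x) @ alpha)

# --- Hermite PCE of the noise: eps = g(Xi) with Xi ~ N(0, 1) ---
# Coefficients c_k = E[g(Xi) He_k(Xi)] / k!   (E[He_j He_k] = k! delta_jk).
xi = rng.standard_normal(200_000)
g = norm.cdf(xi) * 0.6 - 0.3                 # uniform(-0.3, 0.3) via inverse CDF
coeffs = [np.mean(g * hermeval(xi, [0] * k + [1])) / factorial(k)
          for k in range(6)]
print("PCE coefficients:", np.round(coeffs, 4))
```

The Monte Carlo estimates `coeffs` are exactly the series coefficients that a truncated Wiener-type expansion of the noise retains.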
Related papers
- von Mises Quasi-Processes for Bayesian Circular Regression [57.88921637944379]
We explore a family of expressive and interpretable distributions over circle-valued random functions.
The resulting probability model has connections with continuous spin models in statistical physics.
For posterior inference, we introduce a new Stratonovich-like augmentation that lends itself to fast Markov Chain Monte Carlo sampling.
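As background on the circular setting only (the quasi-process prior and the Stratonovich-like MCMC augmentation are not reproduced), here is a minimal sketch of circle-valued responses with von Mises noise:

```python
# Minimal sketch of circle-valued regression data with von Mises noise;
# the paper's quasi-process prior and MCMC scheme are not reproduced.
import numpy as np

rng = np.random.default_rng(1)
x = np.linspace(0.0, 1.0, 200)
mu = (2.0 * np.pi * x) % (2.0 * np.pi)   # true circle-valued mean function
theta = rng.vonmises(mu, 8.0)            # angular observations, kappa = 8

# Circular statistics work through the embedding angle -> (cos, sin).
resid = theta - mu
print("circular mean of residuals:",
      np.arctan2(np.sin(resid).mean(), np.cos(resid).mean()))
```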
arXiv Detail & Related papers (2024-06-19T01:57:21Z)
- High-Dimensional Kernel Methods under Covariate Shift: Data-Dependent Implicit Regularization [83.06112052443233]
This paper studies kernel ridge regression in high dimensions under covariate shifts.
By a bias-variance decomposition, we theoretically demonstrate that the re-weighting strategy reduces the variance.
For the bias, we analyze regularization at an arbitrary or well-chosen scale, showing that the bias can behave very differently across regularization scales.
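A hedged sketch of the re-weighting strategy, with Gaussian train/test covariate densities and an RBF kernel chosen purely for illustration:

```python
# Hedged sketch of re-weighted kernel ridge regression under covariate
# shift; the Gaussian train/test densities and kernel are assumptions.
import numpy as np

rng = np.random.default_rng(2)

def rbf(a, b, ell=0.4):
    return np.exp(-(a[:, None] - b[None, :]) ** 2 / (2 * ell**2))

def gauss_pdf(z, m, s):
    return np.exp(-(z - m) ** 2 / (2 * s**2)) / (s * np.sqrt(2 * np.pi))

n, lam = 200, 1e-2
x = rng.normal(0.0, 1.0, n)                   # training covariates
y = np.sin(2 * x) + 0.1 * rng.standard_normal(n)

w = gauss_pdf(x, 1.0, 1.0) / gauss_pdf(x, 0.0, 1.0)   # density ratio weights

# Weighted KRR: minimize sum_i w_i (y_i - f(x_i))^2 + lam * ||f||^2,
# whose coefficients solve (W K + lam I) alpha = W y.
alpha = np.linalg.solve(np.diag(w) @ rbf(x, x) + lam * np.eye(n), w * y)

x_test = rng.normal(1.0, 1.0, 500)            # shifted test covariates
mse = np.mean((rbf(x_test, x) @ alpha - np.sin(2 * x_test)) ** 2)
print("test MSE under covariate shift:", mse)
```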
arXiv Detail & Related papers (2024-06-05T12:03:27Z)
- Learning Mixtures of Gaussians Using Diffusion Models [9.118706387430883]
We give a new algorithm for learning mixtures of $k$ Gaussians to within small total variation (TV) error.
Our approach is analytic and relies on the framework of diffusion models.
arXiv Detail & Related papers (2024-04-29T17:00:20Z)
- Variable Hyperparameterized Gaussian Kernel using Displaced Squeezed Vacuum State [2.1408617023874443]
A multimode coherent state can generate the Gaussian kernel only with a constant value of the hyperparameter.
This constant hyperparameter limits the application of the Gaussian kernel to complex learning problems.
We realize a variable hyperparameterized kernel with a multimode displaced squeezed vacuum state.
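As a classical stand-in (the paper's quantum-optical construction is not reproduced), the Gibbs kernel below is a standard positive semi-definite way to let the Gaussian kernel's bandwidth vary over the input space:

```python
# A classical variable-bandwidth Gaussian kernel (the Gibbs kernel), used
# as a stand-in; the paper's squeezed-vacuum construction is not reproduced.
import numpy as np

def rbf_fixed(a, b, sigma=1.0):
    # Standard Gaussian kernel: one global bandwidth for the whole space.
    return np.exp(-(a - b) ** 2 / (2.0 * sigma**2))

def gibbs(a, b, sigma):
    # Positive semi-definite kernel whose bandwidth sigma(x) varies with x.
    sa, sb = sigma(a), sigma(b)
    s2 = sa**2 + sb**2
    return np.sqrt(2.0 * sa * sb / s2) * np.exp(-(a - b) ** 2 / s2)

bandwidth = lambda x: 0.2 + np.abs(x)        # illustrative bandwidth profile
print(rbf_fixed(0.0, 1.0), gibbs(0.0, 1.0, bandwidth))
```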
arXiv Detail & Related papers (2024-03-18T08:25:56Z)
- Generalization in Kernel Regression Under Realistic Assumptions [41.345620270267446]
We provide rigorous bounds for common kernels under any amount of regularization or noise, for any input dimension, and for any number of samples.
Our results imply benign overfitting in high input dimensions, nearly tempered overfitting in fixed dimensions, and explicit convergence rates for regularized regression.
As a by-product, we obtain time-dependent bounds for neural networks trained in the kernel regime.
arXiv Detail & Related papers (2023-12-26T10:55:20Z)
- Gaussian Process Regression under Computational and Epistemic Misspecification [5.393695255603843]
In large data applications, computational costs can be reduced using low-rank or sparse approximations of the kernel.
This paper investigates the effect of such kernel approximations on the resulting prediction error.
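A minimal sketch of one such low-rank approximation, a plain Nyström approximation with uniformly sampled landmark points; the landmark rule and the Frobenius error metric are illustrative choices, not the paper's analysis:

```python
# Minimal sketch of a rank-m Nystrom approximation of a kernel matrix;
# the uniform landmark rule and Frobenius metric are illustrative choices.
import numpy as np

rng = np.random.default_rng(3)

def rbf(A, B, ell=0.5):
    d2 = np.sum((A[:, None, :] - B[None, :, :]) ** 2, axis=-1)
    return np.exp(-d2 / (2 * ell**2))

X = rng.standard_normal((500, 2))
K = rbf(X, X)

m = 50
idx = rng.choice(len(X), m, replace=False)   # landmark points
C = rbf(X, X[idx])                           # n x m cross-kernel
W = rbf(X[idx], X[idx])                      # m x m landmark kernel
K_nys = C @ np.linalg.pinv(W) @ C.T          # rank-m approximation

print("relative error:", np.linalg.norm(K - K_nys) / np.linalg.norm(K))
```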
arXiv Detail & Related papers (2023-12-14T18:53:32Z)
- Page curves and typical entanglement in linear optics [0.0]
We study entanglement within a set of squeezed modes that have been evolved by a random linear optical unitary.
We prove various results on the typicality of entanglement as measured by the Rényi-2 entropy.
Our main results make use of a symmetry property obeyed by the average and the variance of the entropy that dramatically simplifies the averaging over unitaries.
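For intuition about the quantity being averaged, the sketch below computes the Rényi-2 entropy S_2 = -log Tr(rho_A^2) of a Haar-random bipartite pure state; the paper's squeezed-mode, linear-optics setting is not reproduced:

```python
# Renyi-2 entanglement entropy S_2 = -log Tr(rho_A^2) of a Haar-random
# bipartite pure state; the squeezed-mode setting is not reproduced here.
import numpy as np

rng = np.random.default_rng(4)
dA, dB = 8, 32
psi = rng.standard_normal(dA * dB) + 1j * rng.standard_normal(dA * dB)
psi /= np.linalg.norm(psi)                  # Haar-random pure state

M = psi.reshape(dA, dB)
rho_A = M @ M.conj().T                      # reduced state on subsystem A
purity = np.trace(rho_A @ rho_A).real
print("S_2 =", -np.log(purity))             # close to the Page-curve value
```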
arXiv Detail & Related papers (2022-09-14T18:00:03Z)
- Gaussian Processes and Statistical Decision-making in Non-Euclidean Spaces [96.53463532832939]
We develop techniques for broadening the applicability of Gaussian processes.
We introduce a wide class of efficient approximations built from this viewpoint.
We develop a collection of Gaussian process models over non-Euclidean spaces.
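One concrete example of a GP kernel on a non-Euclidean input space, chosen here for illustration rather than taken from the paper, is the standard periodic kernel on the circle:

```python
# A GP kernel on a non-Euclidean input space: the standard periodic kernel
# on the circle S^1, chosen as a simple valid example.
import numpy as np

def periodic_kernel(a, b, ell=0.5):
    return np.exp(-2.0 * np.sin((a - b) / 2.0) ** 2 / ell**2)

theta = np.linspace(0.0, 2.0 * np.pi, 6, endpoint=False)
K = periodic_kernel(theta[:, None], theta[None, :])
print("min eigenvalue (>= 0 up to round-off):", np.linalg.eigvalsh(K).min())
```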
arXiv Detail & Related papers (2022-02-22T01:42:57Z)
- The Schrödinger Bridge between Gaussian Measures has a Closed Form [101.79851806388699]
We focus on the dynamic formulation of optimal transport (OT), also known as the Schrödinger bridge (SB) problem.
In this paper, we provide closed-form expressions for SBs between Gaussian measures.
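In one dimension the closed form is simple to state. The sketch below samples paired endpoints of the static bridge between N(m0, s0^2) and N(m1, s1^2); the scalar cross-covariance formula is the known entropic-OT expression written from memory, so treat it as an assumption rather than the paper's statement:

```python
# 1-D static Schrodinger bridge between N(m0, s0^2) and N(m1, s1^2) with
# reference noise level sigma. The scalar cross-covariance below follows
# the known entropic-OT closed form, written from memory: an assumption.
import numpy as np

m0, s0, m1, s1, sigma = 0.0, 1.0, 2.0, 0.5, 0.3

c = 0.5 * (np.sqrt(4 * s0**2 * s1**2 + sigma**4) - sigma**2)

# Joint Gaussian over the endpoints (X0, X1); as sigma -> 0 this recovers
# the deterministic optimal-transport coupling c = s0 * s1.
cov = np.array([[s0**2, c], [c, s1**2]])
rng = np.random.default_rng(5)
print(rng.multivariate_normal([m0, m1], cov, size=5))
```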
arXiv Detail & Related papers (2022-02-11T15:59:01Z)
- Scalable Variational Gaussian Processes via Harmonic Kernel Decomposition [54.07797071198249]
We introduce a new scalable variational Gaussian process approximation which provides a high fidelity approximation while retaining general applicability.
We demonstrate that, on a range of regression and classification problems, our approach can exploit input space symmetries such as translations and reflections.
Notably, our approach achieves state-of-the-art results on CIFAR-10 among pure GP models.
arXiv Detail & Related papers (2021-06-10T18:17:57Z)
- SLEIPNIR: Deterministic and Provably Accurate Feature Expansion for Gaussian Process Regression with Derivatives [86.01677297601624]
We propose a novel approach for scaling GP regression with derivatives based on quadrature Fourier features.
We prove deterministic, non-asymptotic, exponentially fast decaying error bounds that apply to both the approximated kernel and the approximated posterior.
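For context, the textbook random Fourier feature approximation of the squared-exponential kernel is sketched below; the paper's contribution replaces the random frequencies with deterministic quadrature nodes, which this sketch does not implement:

```python
# Textbook random Fourier feature approximation of the squared-exponential
# kernel; the paper's deterministic quadrature nodes are not implemented.
import numpy as np

rng = np.random.default_rng(6)
d, D, ell = 2, 500, 0.7                      # input dim, #features, lengthscale

W = rng.standard_normal((D, d)) / ell        # spectral frequencies
b = rng.uniform(0.0, 2.0 * np.pi, D)         # random phases

def features(X):
    return np.sqrt(2.0 / D) * np.cos(X @ W.T + b)

X = rng.standard_normal((100, d))
K_true = np.exp(-np.sum((X[:, None] - X[None, :]) ** 2, -1) / (2 * ell**2))
K_approx = features(X) @ features(X).T       # error shrinks as D grows
print("max abs error:", np.abs(K_true - K_approx).max())
```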
arXiv Detail & Related papers (2020-03-05T14:33:20Z)